Date: Sun, 16 Feb 2025 00:26:19 +0100
The more I think about it, the more I want to step away from my
std::int_least128_t paper and instead propose std::bit_int as a class
template. It gives you 128-bit computation as well, and it solves the
problem more generally.
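
To make that concrete, here is roughly the shape I have in mind. This
is only an illustrative sketch, not proposed wording: the names and the
exact interface are made up, only addition is shown, and a real
implementation would of course mask the top word to N bits and provide
the full operator set.

    #include <cstddef>
    #include <cstdint>

    // Sketch of a width-parameterized integer class template.
    // Stores (N+63)/64 words; on a compiler that exposes _BitInt(N)
    // in C++ the representation could simply be the native type.
    template <int N>
    class bit_int {
        static constexpr std::size_t words = (N + 63) / 64;
    public:
        constexpr bit_int() = default;

        constexpr bit_int(long long v) {
            w_[0] = static_cast<std::uint64_t>(v);
            // sign-extend into the remaining words
            const std::uint64_t fill = v < 0 ? ~std::uint64_t{0} : 0;
            for (std::size_t i = 1; i < words; ++i) w_[i] = fill;
        }

        friend constexpr bit_int operator+(bit_int a, bit_int b) {
            bit_int r;
            std::uint64_t carry = 0;
            for (std::size_t i = 0; i < words; ++i) {
                const std::uint64_t s = a.w_[i] + b.w_[i];
                r.w_[i] = s + carry;
                // carry out of either addition feeds the next word
                carry = (s < a.w_[i]) | (r.w_[i] < s);
            }
            return r;
        }

        friend constexpr bool operator==(const bit_int&,
                                         const bit_int&) = default;

    private:
        std::uint64_t w_[words] = {};
    };

    // 128-bit computation falls out of the general facility:
    using int_least128 = bit_int<128>;

    static_assert(int_least128(3) + int_least128(4) == int_least128(7));

Nothing about the facility is specific to 128 bits; bit_int<128> is
just one instantiation.
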
I don't think we should aim for a fundamental type here. Adding this to
the core language has a massive blast radius, and the precedent is to
expose C features as library types where possible (std::atomic,
std::complex).
Furthermore, there's no portable way to call a C function that takes
_BitInt(N) from C++, which is very, very bad. Solving this problem is
motivation enough, and I'm fairly sure that a std::bit_int library type
could make it into C++29, but I have no idea whether the
std::int_least128_t proposal would be thrown out.
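
To spell out the interop problem (the function name below is made up):

    /* some C library header, C23 */
    _BitInt(128) mul128(_BitInt(128) a, _BitInt(128) b);

    // From a C++ translation unit there is currently no portable way
    // to even declare this:
    //
    //     extern "C" ??? mul128(???, ???);
    //
    // _BitInt isn't a C++ type. You're left with compiler extensions
    // (Clang accepts _BitInt in C++ as an extension) or with hoping
    // that some other type (__int128, a struct of two uint64_t, ...)
    // happens to share the ABI on your target. Neither is guaranteed
    // by either standard.

A library std::bit_int could be specified to be usable across that
boundary, which the status quo simply cannot offer.
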
Anyhow, I'm a bit bewildered as to why we don't have this already.
N1744 proposed it, but that paper is so old that I don't know of any
polls or minutes on it.
Received on 2025-02-15 23:26:34