Date: Sun, 16 Feb 2025 17:42:05 +0100
Thank you for your thoughts, Arthur.
I have posted a draft at https://isocpp.org/files/papers/D3639R0.html
which now addresses some of your points.
Considering that _BitInt is only guaranteed to give you 64 bits, I am
sceptical of the "C does it anyway" claim. If MSVC already supported
bit-precise integers with very large widths, I could get behind that,
but it doesn't. I've also received numerous concerns from people
regarding my 128-bit paper, and it was suggested multiple times that I
make it freestanding-optional.
If we want portability, then leaning on C's _BitInt is simply not
going to do the job, at least not in the current environment. A
library type could easily guarantee freestanding support for a
million bits, considering that it's just an array with some operator
overloads, nothing that would require the compiler vendor to do much
work. That is, unless the vendor wants to do the work and back "class
bit_int" with a fundamental type.
Received on 2025-02-16 16:42:22