Date: Wed, 30 Apr 2025 11:53:54 +0200
Another thing worth considering is that whatever we design should be
future-proof for decimal integers. I'm fairly sure that C's decimal
floating-point types will eventually be ported to C++, and if that
happens, it makes no sense for the user to first convert an ASCII
representation to a sequence of bits, since that bit sequence would then
have to be broken back up into base-1000 digits (the values packed into
10-bit declets by the IEEE 754 densely-packed-decimal encoding) anyway.
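
Concretely, a direct conversion could look something like this. This is
only a rough sketch, not a proposed API: the name to_base1000_digits is
mine, and it assumes the input has already been validated as a plain
run of ASCII digits.

#include <cstddef>
#include <cstdint>
#include <string_view>
#include <vector>

// Split an ASCII digit string directly into base-1000 digits, i.e. the
// 0..999 values that the DPD encoding packs into 10-bit declets.
// Result is stored least-significant digit first.
std::vector<std::uint16_t> to_base1000_digits(std::string_view ascii)
{
    std::vector<std::uint16_t> digits;
    // Walk from the least-significant end, three decimal digits at a time.
    for (std::size_t end = ascii.size(); end > 0; ) {
        std::size_t begin = end >= 3 ? end - 3 : 0;
        std::uint16_t d = 0;
        for (std::size_t i = begin; i < end; ++i)
            d = static_cast<std::uint16_t>(d * 10 + (ascii[i] - '0'));
        digits.push_back(d); // 0..999, ready for declet encoding
        end = begin;
    }
    return digits;
}

Going through a binary representation first would mean recovering
exactly these values afterwards by repeated division by 1000, which is
the wasted round trip I'm worried about.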
Received on 2025-04-30 09:54:07