
Re: [std-proposals] Slow bulky integer types (128-bit)

From: Giuseppe D'Angelo <giuseppe.dangelo_at_[hidden]>
Date: Wed, 29 Mar 2023 11:50:09 +0200
On 29/03/23 11:15, Timur Doumler via Std-Proposals wrote:
> Do I understand it correctly that the primary motivation for this change
> in C23 and C++23 was that changing uintmax_t would be an ABI break,
> which is deemed unacceptable for the major compiler vendors?

Changing uintmax_t would be an ABI break. The change in C++23 is to
allow implementations to actually offer __int128 as an extended
integer type while keeping e.g. a 64-bit intmax_t. Otherwise, they
couldn't do that: the moment __int128 exists as an integer type,
intmax_t needs to widen, and that breaks ABI.

It still sounds like a terrible idea to me. What's the point of intmax_t
if it's not capable of faithfully representing values of *all* integer
types? I'd rather see intmax_t deprecated than break completely
reasonable code:

> void f(std::signed_integral auto x)
> {
>     std::intmax_t v{x}; // this should *always* be well-formed and never lose data
> }

> Because if the ABI issue didn't exist, it would indeed seem
> *conceptually* more correct to have __uint128_t be an integer type and
> uintmax_t to be 128 bit, would it not?

Right, but this is now a question for compiler vendors -- why do they
not consider __int128 an extended integer type at all (at least, in
strict mode)?

E.g. GCC https://gcc.gnu.org/onlinedocs/gcc/Integers-implementation.html

Was it just because of the intmax_t incompatibility?

Giuseppe D'Angelo

Received on 2023-03-29 09:50:13