Re: [std-proposals] Slow bulky integer types (128-bit)

From: Marcin Jaczewski <marcinjaczewski86_at_[hidden]>
Date: Thu, 30 Mar 2023 12:27:23 +0200
On Thu, 30 Mar 2023 at 11:26, Alejandro Colomar via Std-Proposals
<std-proposals_at_[hidden]> wrote:
> On 3/30/23 10:58, David Brown via Std-Proposals wrote:
> > On 29/03/2023 17:18, Arthur O'Dwyer via Std-Proposals wrote:
> >
> >> A new `uintmax_extended_t` (or whatever) can communicate properly from
> >> the get-go: "Hey! This type will change in the future! Don't build it
> >> into your APIs!"
> >> But then, if you aren't using this type in APIs, then where /*are*/ you
> >> using it, and why does it need to exist in the standard library at all?
> >>
> > That, I think, is the key point - /why/ would you want a "maximum size
> > integer type" ?
> For example to printf(3) an off_t variable.
> Or to write a [[gnu::always_inline]] function that accepts any integer.
> Or to write a type-generic macro that handles any integer.
> The addition of functions that handle [u]intmax_t was the design mistake.
> If they had been added as macros, we wouldn't be discussing ABI issues,
> because macros don't have ABI. Of course, the problem is that _Generic(3)
> was only added in C11, but intmax_t was added in C99, so they had to do it
> as functions. History sucks.

No, this would change nothing. If it were a macro, you could still
use it in some API and then worry about an ABI break.
It is the usage that creates the risk of ABI breakage, not how the
type was defined.

Received on 2023-03-30 10:27:37