
Re: [ub] Aliasing char16_t with int_least16_t, etc.

From: Lawrence Crowl <Lawrence_at_[hidden]>
Date: Thu, 31 Oct 2013 16:18:36 -0700
On 10/31/13, Jean-Marc Bourguet <jm_at_[hidden]> wrote:
> On 31/10/2013 01:28, Lawrence Crowl wrote:
>> On 10/30/13, Jean-Marc Bourguet <jm_at_[hidden]> wrote:
> >>> Does 3.10/10 cover the punning between uint_leastXX_t and
> >>> int_leastXX_t?
>> I believe so, because those are just typedefs and they are
>> required to point to the same size.
>
> Are you sure? I thought it was a requirement on value bits and
> thus the typedef could name the next larger size.
>
> A machine similar to the Unisys one described by Ion, which has to
> ignore the sign bit in unsigned types instead of making it contribute
> to the value, could have a 16-bit type whose corresponding unsigned
> type has a maximum of 2^15 - 1. If I'm not mistaken, the signed type
> would have to be used as int_least16_t, but its unsigned counterpart
> can't be used as uint_least16_t.
>
> (That type would have to be an extended integer type, since USHRT_MAX
> must be at least 65535 and unsigned char can't have padding bits.)

Reading the C standard, there does not seem to be any requirement
that the uint*_t typedefs have the same size as the corresponding
int*_t typedefs. However, an unsigned type must be the same size as
its corresponding signed type for the aliasing rules to permit
accessing one through the other. So I suspect that if uint_least16_t
were not the same size as int_least16_t, there would be some program
breakage.
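
A minimal sketch of the punning at issue, assuming the two typedefs
do end up naming types of the same size (which is exactly the open
question), might look like this; the static_assert just documents
that assumption, and the helper name is only for illustration:

  #include <cstdint>

  // Compile-time check of the assumption under discussion: the
  // least-width signed and unsigned typedefs name types of the same size.
  static_assert(sizeof(std::int_least16_t) == sizeof(std::uint_least16_t),
                "int_least16_t and uint_least16_t differ in size");

  std::uint_least16_t read_as_unsigned(std::int_least16_t *p) {
      // Reading an int_least16_t object through an lvalue of the
      // corresponding unsigned type is the access 3.10/10 is meant to allow.
      return *reinterpret_cast<std::uint_least16_t *>(p);
  }

On an implementation like the one Jean-Marc describes, where
uint_least16_t would have to be a wider type, the static_assert would
fire and the reinterpret_cast read would no longer be covered by
3.10/10.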

-- 
Lawrence Crowl

Received on 2013-11-01 00:18:38