Subject: Re: reading values as bytes without memcpy, from enum unsigned char?
From: Thiago Macieira (thiago_at_[hidden])
Date: 2020-08-10 12:50:49
On Monday, 10 August 2020 10:29:05 PDT language.lawyer_at_[hidden] wrote:
> Who "it"? The Standard?
Thanks, I don't think I'd ever noticed that one.
> >>>>> The issue is that the value is not well-specified by the standard.
> >>>> The value is 100% specified by the standard. It is the value of `i`,
> >>>> which
> >>>> is `UCHAR_MAX + 1`.
> >>> If you meant that the value of i is the value of i, then it's
> >>> tautological.
> >> I meant that the result (value) of the lvalue-to-rvalue conversion
> >> applied
> >> to `*reinterpret_cast<unsigned char*>(&i)` (or
> >> `*reinterpret_cast<unsigned*>(&i)`) is the value of `i`. Kind of, because
> >> [expr.pre]/4 ([expr]/4) immediately tells that this is UB.
> > Neither is true. The values are currently implementation-defined
> I'd like to see a proof.
int i = UCHAR_MAX + 2;
unsigned char c = *reinterpret_cast<unsigned char *>(&i);
If you are correct, this should produce c = (unsigned char)(UCHAR_MAX + 2), which
the standard defines as (UCHAR_MAX + 2) modulo (UCHAR_MAX + 1), i.e. 1. But it
doesn't on any big-endian machine, where the first byte is the high-order one and
c comes out 0. For four big-endian examples, and one little-endian one just to
show that the compiler isn't "seeing through" the test,
-- Thiago Macieira - thiago (AT) macieira.info - thiago (AT) kde.org Software Architect - Intel DPG Cloud Engineering
STD-DISCUSSION list run by email@example.com
Older Archives on Google Groups