On Wed, May 13, 2020 at 7:03 AM yo mizu via Std-Discussion <std-discussion@lists.isocpp.org> wrote:
According to P1839R2, the following code has undefined behavior as per
[expr.pre] p4.

int a = 420;
char b = *reinterpret_cast<char*>(&a);

P1839R2 states, "When the lvalue-to-rvalue conversion is applied to
the initializer expression of b, the behavior is undefined as per
[expr.pre] p4 because the result of such a conversion would be the
value of the int object (420), which is not a value representable by
char."

So, is the behavior of the following example defined?

int c = 4;
char d = *reinterpret_cast<char*>(&c);

The result of the lvalue-to-rvalue conversion would be the value of
the int object (4).
There is no char object there, but the value 4 is representable by char.

If this behavior is defined and the value of "d" is 4, I think that is
at odds with the behavior of many existing compilers on big-endian
platforms.

And how about the following example?

struct E {} e {};
char f = *reinterpret_cast<char*>(&e);

Is the behavior of this example defined?
If this is undefined behavior, what description in the standard is it based on?

The issue really stems from big-endian platforms. In memory (starting
at address 100 decimal):

100 - 00 00 01 A4

the byte holding the 'char' value of that integer is at +3 from the
start of the pointer.
This means that casting an int* to a char* may cause a shift in the
pointer value; and worse, the conversion back from char* to int* may
not know how many chars to unwind to get back to the start?  (or
maybe, going through a short* in between?)

It's not so painful in a little-endian world:

100 - A4 01 00 00

Address 100 is the same starting point for char*, short*, int*, ...
Thank you and best regards,
Yo Mizu