
Re: [wg14/wg21 liaison] Semantics of angelic nondeterminism

From: Jens Maurer <jens.maurer_at_[hidden]>
Date: Sat, 13 Apr 2024 11:12:12 +0200
On 13/04/2024 10.03, Martin Uecker via Liaison wrote:
> Am Freitag, dem 12.04.2024 um 17:14 -0500 schrieb Davis Herring:

>> I see neither how that inutility follows from a notion of local reasoning nor what available specification strategy avoids such reliance. Consider the ill-advised C program
>> int main() {
>> int a[2]={0};
>> return a[flip()]; // flip from the previous example
>> }
>> The possible observable behaviors of this program are that it returns 0 (having accessed a[1]) or has undefined behavior (having attempted to access a[2]). Of course, that set of
>> observable behaviors is identical to that of a program that unconditionally exhibits undefined behavior, since C says that "this document imposes no requirements" and so "the program
>> returns 0" does not extend the set. C lacks the explicit admonition on this subject that in C++ is [intro.abstract]/5:
>> However, if any such execution contains an undefined operation, this document places no requirement on the implementation executing that program with that input (not even with regard to
>> operations preceding the first undefined operation).
> Indeed, C does not have this. We want to be able to
> reason about executions where flip() returns 1 and we also
> want to be able to reason about executions where flip() returns
> 2 until the point the operation that has UB is encountered.

Not everybody seems to share that view of the status quo of C:


"If any step in a program’s execution has undefined behavior,
then the entire execution is without meaning. This is important:
it’s not that evaluating (1<<32) has an unpredictable result,
but rather that the entire execution of a program that evaluates
this expression is meaningless. Also, it’s not that the execution
is meaningful up to the point where undefined behavior happens:
the bad effects can actually precede the undefined operation."

It seems that what is explicitly specified in C++ ("not even with
regard to operations preceding the first undefined operation")
is believed by some readers of the C standard to be implied
there as well. Thus, if you want C to have the "until the UB
point" semantics, some changes to C are desirable.

Interestingly, neither gcc nor clang seem to purge the null pointer
check in this case, for either C or C++:

int x = 1;

void f(int *p) {
  if (p) x = 0;
  *p = 42;
}

Although that would seem to be one of the easier cases
where implementations could exploit UB back-propagation.

Are there some specific examples where that happens?
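In the forward direction, by contrast, both compilers do exploit the non-null assumption. A minimal sketch (the function name g is mine): once *p has executed, the compiler may assume p is non-null, and gcc and clang at -O2 commonly delete the subsequent check.

```c
int x = 1;

/* Dereference first, check second: after *p executes, p may be
   assumed non-null, so the if (p) test below is routinely
   eliminated by gcc and clang at -O2. */
void g(int *p) {
    *p = 42;
    if (p) x = 0;
}
```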

>> This omission seems consistent with my understanding that WG14 and WG21 have not always agreed about the permission for an implementation to optimize a program under the assumption that
>> undefined behavior never occurs.
> I do not understand why WG21 goes down the path of essentially
> producing a specification that gives "no semantics" to almost
> any real world program, while seemingly providing (almost)
> no advantages in terms of optimization because compilers
> cannot reason globally about all paths of execution anyway.

You seem to be assuming that "almost any real world program"
encounters undefined behavior somewhere during a particular
execution. Let's assume that's true. I'm not seeing why
the resulting potential exploitation of the undefined behavior
by the optimizer after the undefined behavior situation is
categorically less concerning than any potential back-propagation.
In the real world, calls to unknown functions and syscalls
are quite sturdy optimization barriers, but it's hard to
rely on that in the specification because of LTO and other
similar whole-program approaches.


Received on 2024-04-13 09:12:20