Re: [ub] Objectives and tasks for SG12

From: Lawrence Crowl <crowl_at_[hidden]>
Date: Thu, 30 May 2013 17:40:45 -0700
On 5/30/13, Jens Maurer <Jens.Maurer_at_[hidden]> wrote:
> On 05/29/2013 10:36 PM, Nevin Liber wrote:
> > On 29 May 2013 14:35, Jens Maurer <Jens.Maurer_at_[hidden]> wrote:
> > > (1) Is a compiler diagnostic acceptable? Yes.
> > > (2) Is a run-time abort acceptable? Yes.
> > > (3) Is an unspecified result value acceptable? Yes.
> > > (4) Is it acceptable that your compiler changes the behavior
> > > of unrelated code that follows the overflow? That's very
> > > surprising.
> > >
> > > Giving compilers latitude to choose among 1-3 (depending on
> > > the target audience) is fine, but, in my opinion, prohibiting
> > > option 4 would be an improvement.
> >
> > The counterargument is usually that prohibiting (4) has a
> > run-time cost: the overflow must now be detected rather than
> > simply assumed not to happen. This effectively penalizes
> > correct programs.
>
> This statement seems to be inaccurate for the majority of
> current hardware. Signed integer overflow will just "work" on
> the hardware level and give you some result, i.e. implement (3).
> No extra checking is needed.
>
> This is exactly the reason why I think "signed integer overflow"
> is a good example for the discussion: current hardware exhibits
> only a limited set of behaviors, yet the C++ standard does not
> reflect that; it instead permits the compiler to do anything it
> wants.
>
> Can we quantify what we give up if we model current hardware
> behavior more closely?

Consider a program that adds a positive constant to a signed
integer in a loop. Under the current model, the compiler can
assume that the variable is monotonically increasing, and can
therefore eliminate comparisons, which leads to simpler loops,
which leads to vectorization, which leads to offloading onto
GPUs, and so on. Without that assumption, the whole chain of
optimizations disappears.
We really won't know the true tradeoff on real code until such
optimizations are widely deployed.
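
For concreteness, here is a minimal sketch of the kind of loop I
mean (the function and its names are mine, purely illustrative):

    // With signed i, the compiler may assume that i += 4 never
    // overflows, so i increases monotonically, the trip count is
    // computable up front, and the loop becomes a vectorization
    // candidate.
    void scale(float* a, int n) {
        for (int i = 0; i < n; i += 4) {
            a[i] *= 2.0f;
        }
    }

Had i been unsigned, i += 4 could legally wrap around to a small
value when n is near UINT_MAX, so the compiler would have to allow
for a loop that cycles instead of terminating.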

> > What does "behavior of unrelated code" even mean once we've
> > invoked undefined behavior?
>
> Well, I thought the goal of SG12 was to discuss whether the
> current definition of "undefined behavior" should be retained
> for some (which?) cases, or whether something could be done to
> constrain the set of valid executions. For example, we already
> have the concept of "unspecified behavior": the order of
> evaluation of function arguments is unspecified. This means the
> implementation must choose among a set of possible behaviors
> rather than exhibit arbitrary behavior.

My hope for SG12 was to get a clear understanding of the risks
and benefits of undefined behavior. At present, the C++ community
seems to be filled with fear of the risks, with no understanding
of the benefits. The risks and benefits may vary across language
features, so I expect the present state to change.

I think we would be wise to consider unspecified behavior as well.
The issue is that with unspecified behavior, programmers have the
same vulnerability to bugs, but the compiler has far less latitude
to help find them.
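
To make the distinction concrete, a small self-contained sketch
(mine, not from the thread):

    #include <climits>
    #include <cstdio>

    int g() { std::puts("g"); return 1; }
    int h() { std::puts("h"); return 2; }
    int f(int a, int b) { return a + b; }

    int main() {
        // Unspecified: g() and h() may be called in either order,
        // so the output is "g" then "h" or "h" then "g", but
        // nothing else. Both executions conform, so a checking
        // tool has nothing to report.
        f(g(), h());

        // Undefined: signed overflow. Because any behavior is
        // permitted, a checking build (e.g. -fsanitize=undefined)
        // is free to trap here and point at the bug.
        int x = INT_MAX;
        ++x;
    }

That asymmetry is the latitude I mean: with unspecified behavior,
every outcome is conforming, so the compiler cannot single one out
as the bug.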

One good result from SG12 would be a list of features that have
more than one possible outcome, so that programmers have a better
understanding of when their programs stray from perfect
portability.

-- 
Lawrence Crowl

Received on 2013-05-31 02:40:46