Date: Sun, 11 Oct 2020 21:27:29 +0200
Dear Peter (CC all),
>Within the domain it's addressing, it's doing exactly what Freek suggests:
>there is some abstract state, recording provenance info, which "takes
>precedence" in the sense that that is what is used for provenance UB
>checks, then one falls back to the concrete model when eg reading
>a pointer value that has been manipulated bytewise.
My point is that in your model the abstract state is just
something supplementary on top of a memory consisting
of bytes. That's clumsy, because it will get *very*
complicated (effective types, provenance, wobbliness,
and you probably will have to add other stuff later).
I think it's much cleaner to have an "abstract memory"
as a first class thing (and then maybe have the "memory
consisting of bytes" as the thing that is supplementary,
although having both on an equal footing appeals to me more.)
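
To make the contrast concrete, here is a caricature of the two
designs as C types (mine, not anyone's actual formalization):

    /* Byte-based design: the byte array is the memory, and the
       abstract state is shadow state bolted on top of it.          */
    struct byte_based_memory {
        unsigned char     *bytes;    /* primitive                   */
        struct provenance *shadow;   /* supplementary: provenance,  */
        struct eff_type   *efftypes; /* effective types, wobbliness */
        /* ...and whatever has to be added later                    */
    };

    /* Abstract design: typed abstract values are the memory;
       bytes only come into existence when something forces a
       representation.                                              */
    struct abstract_value {
        enum { AV_INT, AV_PTR, AV_STRUCT, AV_UNION, AV_BYTES } kind;
        /* AV_BYTES appears only after byte-level access            */
    };
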
So if I have a pointer to the end of a subobject next to
a pointer to the start of the next adjacent subobject, in
my abstract memory it's immediately clear which is which.
With the "provenance" approach as you propose it, it's not,
I think.
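
Concretely (my own example; I'm assuming the usual layout with
no padding between the two members):

    #include <stdio.h>

    struct two { int a; int b; } s;

    int main(void) {
        int *p = &s.a + 1;  /* one past the end of subobject s.a   */
        int *q = &s.b;      /* start of the adjacent subobject s.b */
        /* Without padding, p and q have the same address and the
           same object representation, but only q may be used to
           access s.b.  In my abstract memory they are simply
           different values.                                        */
        printf("%d\n", p == q);
        return 0;
    }
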
When you have unions it gets even worse. Unions are where
the hard problems are, the rest is much easier.
When you poke bytes into a union, which variant will you get?
And what can change the variant of a union? (In Robbert's
CH2O model, he had annotations on pointers for that.)
Does the provenance approach say something about that?
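
Concretely (my example; assume float and unsigned int have the
same size):

    #include <string.h>

    union pun { float f; unsigned int i; };

    unsigned int poke(const unsigned char *bytes) {
        union pun u;
        memcpy(&u, bytes, sizeof u);  /* poke bytes into the union */
        /* Which variant is active now?  May u.f be read, or u.i,
           or both, or neither?  A model that tracks the variant
           abstractly has to answer this for memcpy, for writes
           through unsigned char *, and so on.                      */
        return u.i;
    }
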
In the TS for the upcoming meeting I only see something about
type punning, but I'm talking about pointers pointing *into*
the union object, not at subobjects *of* the union object.
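
What I mean is something like this (my example, not from the TS):

    union v { struct { int x; int y; } s; int a[2]; } v;

    int *into(void) {
        int *p = &v.s.y;  /* points *into* the union object        */
        v.a[0] = 1;       /* does this change the active variant?  */
        /* Is p still a valid pointer?  May it be used to access
           v.a[1], which occupies the same bytes?  The type
           punning wording doesn't answer that.                     */
        return p;
    }
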
I think the problems here become much more apparent when
you take a "first class" abstract memory model as the basis.
To argue that what I propose is really different (I think):
if you go into a *very* deep recursion, with at every level
a local variable whose address you take, your model
(which has bytes as the primitive notion) will eventually
run out of object representations, right? Mine won't,
because unless you start poking at bytes or intptr_t
representations of those pointers, there *are* no bytes,
it's all just abstract values. My proposed model really
can have arbitrarily deep recursion, even though that means a
finite size pointer type gets arbitrarily many different values.
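
A sketch of the kind of program I mean (never mind that a real
machine runs out of stack; the question is what the model says):

    #include <assert.h>

    /* At every level there is a local whose address is taken;
       the locals of all live levels must be mutually distinct,
       but their bytes are never inspected.                         */
    void deep(unsigned long depth, int *outer) {
        int local;
        assert(&local != outer);  /* adjacent levels differ        */
        if (depth > 0)
            deep(depth - 1, &local);
    }

In a byte-based model every live &local needs a distinct object
representation, so the finitely many representations of an int *
bound the depth; with abstract values there is no such bound.
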
So I'm not sure I buy the claim that what you propose will
in the end be the same as what I would like.
Freek
>Within the domain it's addressing, it's doing exactly what Freek suggests:
>there is some abstract state, recording provenance info, which "takes
>precedence" in the sense that that is what is used for provenance UB
>checks, then one falls back to the concrete model when eg reading
>a pointer value that has been manipulated bytewise.
My point is that in your model the abstract state is just
something supplementary on top of a memory consisting
of bytes. That's clumsy, because it will get *very*
complicated (effective types, provenance, wobbliness,
and you probably will have to add other stuff later).
I think it's much cleaner to have an "abstract memory"
as a first class thing (and then maybe have the "memory
consisting of bytes" as the thing that is supplementary,
although having both on an equal footing appeals to me more.)
So if I have an pointer to the end of a subobject next to
a pointer to the start of the next adjacent subobject, in
my abstract memory it's immediately clear which is which.
With the "provenance" approach as you propose it, it's not,
I think.
When you have unions it gets even worse. Unions are where
the hard problems are, the rest is much easier.
When you poke bytes into a union, which variant will you get?
And what can change the variant of a union? (In Robbert's
CH2O model, he had annotations of pointers for that.)
Does the provenance approach say something about that?
In the TS for the upcoming meeting I just see something about
type punning, but I'm talking about pointers pointing *into*
the union object, not subobjects *of* the union object.
I think the problems here become much more apparent when
you take a "first class" abstract memory model as the basis.
To argue that what I propose is really different (I think):
if you go in a *very* deep recursion with on every level
a local variable of which you take the pointer, your model
(which has bytes as the primitive notion) will eventually
run out of object representations, right? Mine won't,
because unless you start poking at bytes or intptr_t
representations of those pointers, there *are* no bytes,
it's all just abstract values. My proposed model really
can have arbitrarily deep recursion, even when a finite
size pointer type will get arbitrarily many different values.
So I'm not sure I buy the claim that what you propose will
in the end be the same as what I would like.
Freek