Date: Sun, 24 Aug 2025 16:11:05 +0200
> On Aug 24, 2025, at 9:50 AM, Oliver Hunt <oliver_at_[hidden]> wrote:
>
>>
>> Furthermore, there was a recent discussion that showed (on Compiler Explorer) that even major compilers for certain pointer types produce the value -1 for nullptr. So, nullptr is not always 0, even on the compilers you are using.
>
> To Simon: I’m willing to accept that this proposal meant assigning nullptr to the pointer, so on those platforms the result would be assigning -1.
What I was trying to say is that not all null pointers have the same value, even on the exact same platform. I don’t perfectly recall the details; I believe it was on Windows that null pointers to member functions are -1 whereas all other null pointers are 0. C++ does not even guarantee that, with a single compiler for a single target, all null pointers have the same value: the representation may depend on the type.
>
>
> To Simon: I am not sure this is correct, e.g. take (making each step very explicit):
>
> void f(T*& ptr) {
>     T* local = ptr;
>     delete local;
>     ptr = nullptr;
> }
>
> The lifetime of the object ptr pointed to has ended, but the lifetime of the object bound to the reference has not (the reference refers to the storage of the pointer itself, not to the object the pointer points to).
If nothing ever reads ptr again, assigning nullptr to it is an unobservable side effect and can be optimized away. More generally, if a variable inside your function is only assigned to but never read from, every assignment to that variable will be optimized away (with appropriate optimization settings).
Lifetime plays a big role in your example: the compiler can prove that the lifetime ends and that there is no read from the variable. So why assign to it in the first place?

This is one reason it is so hard to write proper benchmarks: if you don’t use the result (or add compiler-specific annotations), the whole computation may be thrown out, because the compiler can see right through it. You’d be surprised how good compilers actually are at proving this.

It is also one reason it is really hard to write security-critical code. Have you ever tried to overwrite a password or key in memory? You need specialized functions to really erase them from memory; simply overwriting the memory area that had contained them will be optimized away as a dead store.
>
> —Oliver
>
>> --
>> Std-Proposals mailing list
>> Std-Proposals_at_[hidden]
>> https://lists.isocpp.org/mailman/listinfo.cgi/std-proposals
>
>
Received on 2025-08-24 14:11:22