
Re: [std-proposals] promote T[] and deprecate delete[]

From: David Brown <david.brown_at_[hidden]>
Date: Wed, 2 Jul 2025 18:06:16 +0200
On 02/07/2025 17:03, Henning Meyer via Std-Proposals wrote:
> Hi David, thanks for your idea.
>
> What I want are C++ programs that are checked for correctness at build
> time instead of running into undefined behavior at run time.

I fully agree with that philosophy.

> So there will be a static analyzer (using the compiler front-end and
> middle-end) that checks for pointer misuse, it does that by keeping
> track of which allocations come from new[], where do they go, and where
> do they end up (free/delete/delete[]/leaked).

That's not going to be feasible, at least not in all circumstances. You
can detect some kinds of error at compile (or lint) time, some at
run-time with low overhead, some at run-time with high overhead, and
some you will not catch at all.

For catching things at compile time (the best outcome), you want to use
strong typing. Use types and allocation functions that make mistakes
harder to make and easier to catch at compile time. In this case,
simply don't use "new T[n]" in user code - restrict it to the
implementation of wrapper classes and functions, where you can easily
see that your deallocations are correct. That's why we have smart
pointers and various containers.
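
Something like this, as a minimal sketch (the class is just for
illustration - in real code std::vector<int> or std::unique_ptr<int[]>
would normally be the right choice):

    #include <cstddef>

    // The only new[]/delete[] pair in the program lives inside this one
    // class, where it is easy to check that they match.
    class IntBuffer {
    public:
        explicit IntBuffer(std::size_t n) : data_(new int[n]{}), size_(n) {}
        ~IntBuffer() { delete[] data_; }
        IntBuffer(const IntBuffer &) = delete;
        IntBuffer &operator=(const IntBuffer &) = delete;
        int &operator[](std::size_t i) { return data_[i]; }
        std::size_t size() const { return size_; }
    private:
        int *data_;
        std::size_t size_;
    };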

Sometimes tools provide additional features that can do more static
analysis, for things that cannot reasonably be added to the language.
For example, gcc has the "malloc" attribute that lets you pair an
allocating function with a deallocating function. It is primarily for C
- in C++, you would typically use a class with a constructor and
destructor to enforce the policy. But it might give you some idea of
what can be done by analysers, without changing the language.

<https://gcc.gnu.org/onlinedocs/gcc/Common-Function-Attributes.html#index-functions-that-behave-like-malloc>

(Note that this can handle more than just memory allocation - as can C++
RAII classes.)
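
As a rough sketch of how the attribute is used (the function names here
are made up, and the syntax is a GCC extension, roughly GCC 11 onwards,
not standard C++):

    #include <cstddef>
    #include <cstdlib>

    void my_release(void *p);

    // Tell GCC that the pointer returned by my_acquire() should be
    // released with my_release() and nothing else.
    __attribute__((malloc, malloc(my_release)))
    void *my_acquire(std::size_t size);

    void demo()
    {
        void *p = my_acquire(16);
        std::free(p);     // with -fanalyzer, GCC can warn that this is
                          // the wrong deallocation function
        // my_release(p); // this is the pairing the attribute declares
    }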

See also
<https://gcc.gnu.org/onlinedocs/gcc/Static-Analyzer-Options.html> and
the "-fsanitize" options.

> You cannot reason about correctness of C++ programs without tracking
> pointer provenance.

Tracking provenance is likely to need some overhead somewhere. That
means a wrapper class.

> The distinction between memory allocated by new and new[] really is a
> different type, and we have a very rich type system, that happens to
> have a blind spot when it comes to this difference. And everyone deals
> with that by layering complexity onto complexity.
>

That is the only way to deal with it.

Different design decisions could have been made in the language at the
start, but they would all have had their disadvantages. Ultimately, if
you want to be able to destroy an array of objects, you need to know how
many objects there are - somewhere, that information needs to be
stored. And you don't want the overhead of storing that information, or
checking for it, in the more common case of destroying a single object.
So the source code has to give the compiler that information -
explicitly with delete[], or implicitly through wrapper classes.
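
A trivial illustration of the distinction:

    #include <string>

    void demo()
    {
        std::string *one  = new std::string;      // single object
        std::string *many = new std::string[10];  // the implementation has
                                                  // to record the count
                                                  // somewhere
        delete one;      // matches new
        delete[] many;   // matches new[]; all ten destructors run
        // "delete many;" would be undefined behaviour - the wrong form
    }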

> Your suggestion represents the state of the art, which is everyone
> creates a custom wrapper that hides the ugly bits and creates custom
> guidelines to not use language features and use custom library types
> instead.
>

It turns out my suggestion was already covered by std::unique_ptr<T[]>.
(I imagine that my idea could have lower overheads, but that's possibly
just because I haven't thought through the details for more complex
cases. And std::unique_ptr<T[]> has the big advantage of already
existing - a few extra bytes of overhead is unlikely to be significant
in real code.)
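
For completeness, a usage sketch of std::unique_ptr<T[]>:

    #include <memory>

    void demo()
    {
        auto buf = std::make_unique<double[]>(1024);  // uses new double[1024]
        buf[0] = 3.14;
    }   // the default deleter calls delete[] - the correct form, automatically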

You are right that making wrappers to hide the ugly bits is state of the
art - it has always been state of the art for programming, in all
languages. Maybe it would be possible to make a nicer, safer or more
convenient wrapper here, but a wrapper is the solution. You only make
changes to the core language when wrappers or functions can't do the job
well enough in terms of programmer convenience, safety, efficiency, etc.

> I am fully aware that every change to basic types will break millions of
> lines of code, because there are billions of lines of C++ out there.
> There are also hundreds of little changes that would improve the core
> language, not just this one.
>

C-style arrays, and their arguably strange treatment in the language,
are fundamental. There is no way you are going to be able to change the
behaviour of deleting them without massive changes to the language and
existing code.

Yes, there are hundreds of little changes to the core that would improve
C++. Most of these cannot be done. And for the others - well, every
C++ programmer would come up with a different list of what to change.
And again, this is the same in all languages.

If there were to be one change to the C++ core that I think would
improve the language, it would be the introduction of a "safe/unsafe"
distinction (or alternatively, "application/library", or something like
that). Safe code could not create or use raw pointers, C-style arrays,
C library string functions, or many other such parts of the language.
It would have to use wrapper classes and functions - the implementation
of which would, of course, be "unsafe" and have access to the low-level
features. That would also solve your original problem - if normal
"safe" code can't do "new T[n];", it can't get the deletion wrong!


> If I want this change I probably have to go down the route of circle and
> cpp2 and fork the language (I won't, too much work).
>

A new language created somewhat from scratch would probably handle
arrays in a different manner from C and C++.

Received on 2025-07-02 16:06:23