Date: Mon, 09 Jun 2025 14:42:38 -0700
> On Jun 9, 2025, at 2:04 PM, Tiago Freire <tmiguelf_at_[hidden]> wrote:
>
> I think we are putting the cart in front of the horses here.
>
> ABI breaks are an important consideration but not the ultimate consideration.
I understand that - I was trying very hard, in the latter part especially, to say that the important thing is that you can't simply say "I don't care" and ignore it.
> If we can have a feature without breaking ABI, that is better; no need to go through all that effort if you can avoid it.
Right, but part of that decision process involves the proposal discussing the ABI impact, which Avi seems uninterested in doing.
> And if the standard makes a change that breaks ABI, guess what? Next day, absolutely nothing happens, code will not automatically be broken. Things that used to build before will still build and work in the same way the day after.
> It takes time and effort to move a code base to a new standard and it is something that occurs intentionally.
> Your code that compiles on C++20 will still compile on C++20.
This is actually a major difference from other languages. Plenty of languages don't have the same strict ABI requirements (managed languages, etc., where the actual memory layout of objects is not directly accessible), but have stricter requirements about being able to compile the same sources under newer versions. I worked on the ECMAScript standard for many years (through the major "the spec should be usable and accurate enough to ensure that implementing the spec will result in an interoperable engine" effort, from editions 3 through 5.1). Obviously in TC39 we didn't have to worry about ABI, object representation, etc., but JS fundamentally doesn't have versions (there is "strict mode", which is a huge burden in its own right but has major benefits for everyone from developers to implementers, so it was worth it, if only just). The result is that JS has strict source compatibility: code written 30 years ago has to work essentially identically today. This has fun consequences:
Context-dependent keywords:
for (var of of of = [1, 2, 3]) {}
"of" is, alas, my fault: I didn't push hard enough against the opposition to ':' (people at the time were hell-bent on type annotations using ':', which never happened, but the proposal functionally burned the use of the token) or to "for each"/"foreach" (which created "confusion" with another ECMA standard, E4X, which was itself deprecated and killed a couple of years later).
My favorite, though, is:
var let = 2;     // declare a variable named "let"
let x = let * 2; // declare a variable named "x", but "let" also exists as a variable
(It turns out “let” is a common abbreviation for “letter” in variable names O_o)
My point (for Avi) is that all standards have constraints on the kinds of change they can make, and as you run up against those constraints you do have to address them.
> The problem is going to be for projects that want to migrate to a new standard because of feature X, but they have to deal with unwanted feature Y.
> It’s not an “oh no, this is a deal breaker, we have no way to handle this”… C++ is a language for practical applications, the problem is practical: it costs money to upgrade.
> Is your feature worth the impact of possibly millions of dollars that it will cost to upgrade an ABI? If not, it’s probably best not to do it.
> If it’s really worth it, maybe we can package that work over time and do it anyway, but it must be really worth it (it costs a lot of people a lot of money).
Right, everything is trade-offs, and the point I was trying to make to Avi is that you have to acknowledge that the ABI constraints exist, and explain either why the concerns are unwarranted or why the cost is justified.
> Come up with the feature first, and then we will see if it’s worth it.
I’m not actually sure what the concerns with their proposal were, and I can’t find a link to the proposal write-up (presumably it was in an earlier email that has since been deleted), but I was mostly irked at their dismissive response to those concerns.
—Oliver
Received on 2025-06-09 21:42:56