Hi Ville,

On Wed, 29 Mar 2023 at 14:56, Timur Doumler via Std-Proposals <email@example.com> wrote:
Not exactly. Standard library vendors make such decisions because that's what their users predominantly want; they want ABI compatibility more or less at the expense of other trade-offs (like small tweaks to maximal performance, and others). This is why the various big internet companies were so frustrated about not getting their wishes about breaking ABI early and often, and why the other side was so frustrated about them not getting it. Recompiling the world is a significant burden, both for users of closed-source libraries and also users of open-source libraries, and also for OS vendors, and that burden increases the higher you go in the library pile/stack/chain. Btw, this is also why there are proposals that lead to requiring less recompiling when switching between contract violation handling modes. It's all about the same user wishes, and avoiding close-to-intolerable burdens at close-to-intolerable frequencies.
On 29 Mar 2023, at 14:53, Jonathan Wakely <firstname.lastname@example.org> wrote:
Please stop framing opposition to ABI breaks to sound like a plot by evil vendors.
Wait, what? Where in my email above did I do any of that?
"Deemed unacceptable for the major compiler vendors" certainly sounds like it's just a decision that the vendors make.
Hm, OK, so I have to admit, this is how I thought it works, yes.
Thanks. Just to avoid further confusion: I am perfectly aware of the tradeoffs of ABI compatibility. I spent a decade working on music production software, where it is very common to link against libraries for which you only have a header + precompiled binaries, because the author of that library wrote a super fancy patented pitch-shifting algorithm or whatnot and doesn't want to give you their source code. And then, whenever there is an ABI break, you need to chase this guy to recompile his binary for you (once I had to wait three weeks for that binary because the guy was on vacation, didn't have any automated CI or anything, and that blocked an entire release). In this industry, not only is recompiling the world a significant burden, it is outright impossible. ABI stability is thus a business-critical concern.
On the other end of the spectrum, there are companies like Google who are routinely recompiling the world, and have optimised their build system and infrastructure to do this. For them, prioritising ABI stability is irrelevant, and the tradeoffs (notably, sacrificing possible performance optimisations in the standard library in favour of ABI stability) work against their business interests.
My comment alluded to my current understanding that, given that it is impossible to satisfy both use cases, vendors have basically decided to prioritise the former over the latter. It seems that you agree this is what is happening? It also seems that you are saying this is because the majority of users fall into the former camp? Is this understanding correct?