Date: Thu, 30 Mar 2023 14:52:43 +0300
Hi Ville,
> On 29 Mar 2023, at 16:51, Ville Voutilainen <ville.voutilainen_at_[hidden]> wrote:
>
> On Wed, 29 Mar 2023 at 14:56, Timur Doumler via Std-Proposals
> <std-proposals_at_[hidden]> wrote:
>>
>> On 29 Mar 2023, at 14:53, Jonathan Wakely <cxx_at_[hidden]> wrote:
>>>>>
>>>>> Please stop framing opposition to ABI breaks to sound like a plot by evil vendors.
>>>>
>>>> Wait, what? Where in my email above did I do any of that?
>>>
>>> "Deemed unacceptable for the major compiler vendors" certainly sounds like it's just a decision that the vendors make
>>
>> Hm, OK, so I have to admit, this is how I thought it works, yes.
>
> Not exactly. Standard library vendors make such decisions because
> that's what their users predominantly want;
> they want ABI compatibility more or less at the expense of other
> trade-offs (like small tweaks to maximal performance, and others).
> This is why the various big internet companies were so frustrated
> about not getting their wishes about breaking
> ABI early and often, and why the other side was so frustrated about
> them not getting it. Recompiling the world
> is a significant burden, both for users of closed-source libraries and
> also users of open-source libraries, and also
> for OS vendors, and that burden increases the higher you go in the
> library pile/stack/chain. Btw, this is also why there are proposals
> that lead to requiring less recompiling when switching between
> contract violation handling modes. It's all about
> the same user wishes, and avoiding close-to-intolerable burdens at
> close-to-intolerable frequencies.
Thanks. Just to avoid further confusion: I am perfectly aware of the trade-offs of ABI compatibility. I spent a decade working on music production software, where it is very common to link against libraries for which you only have a header plus precompiled binaries, because the author of that library wrote a super fancy patented pitch-shifting algorithm or whatnot and doesn't want to give you their source code. And then, whenever there is an ABI break, you need to chase this guy to recompile his binary for you (once I had to wait three weeks for such a binary because the author was on vacation and didn't have any automated CI or anything, and that blocked an entire release). In this industry, recompiling the world is not merely a significant burden; it is outright impossible. ABI stability is thus a business-critical concern.
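To make concrete what bites you in that setup, here is a minimal sketch (the class and file names are made up for illustration): you get a header like this plus a precompiled binary, and any change to the object layout in a newer header silently breaks every client built against the old one:

    // pitch_shift.h -- ships alongside a precompiled binary; no source.
    #include <cstddef>

    class PitchShifter {
    public:
        PitchShifter();
        void process(float* buffer, std::size_t numSamples);
    private:
        float ratio_;
        // If a new version adds a member here (or reorders the existing
        // ones), sizeof and the member offsets change. Code compiled
        // against the old header then calls into the new binary with the
        // wrong layout -- an ABI break that only recompiling every client
        // (or chasing the vendor for a matching binary) can fix.
    };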
On the other end of the spectrum, there are companies like Google that routinely recompile the world and have optimised their build systems and infrastructure to do exactly that. For them, ABI stability is irrelevant, and its trade-offs (notably, sacrificing possible performance optimisations in the standard library in favour of ABI stability) work against their business interests.
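To illustrate the kind of optimisation that gets sacrificed, here is a hypothetical sketch (not any actual vendor's std::string): enlarging a small-string buffer would reduce heap allocations, but it changes the object's size, and every already-compiled binary has the old layout baked in:

    #include <cstddef>

    // Hypothetical string layouts, for illustration only.
    struct old_string {        // the layout every shipped binary was built with
        char*       ptr;
        std::size_t size;
        char        sso[16];   // small-string buffer
    };

    struct faster_string {     // a bigger buffer would mean fewer heap
        char*       ptr;       // allocations, but...
        std::size_t size;
        char        sso[24];
    };

    // ...the object size changes, so passing one of these across a binary
    // boundary built against the old layout corrupts memory. That is the
    // ABI break a vendor committed to stability cannot ship.
    static_assert(sizeof(old_string) != sizeof(faster_string),
                  "incompatible layouts: this change would be an ABI break");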
My comment alluded to my current understanding that, given that it is impossible to satisfy both use cases, vendors have essentially decided to prioritise the former over the latter. It seems that you agree this is what is happening? It also seems that you are saying this is because the majority of users fall into the former camp? Is this understanding correct?
Cheers,
Timur