Date: Tue, 8 Feb 2022 18:07:11 -0500
On Tue, Feb 8, 2022 at 5:02 PM Olga Arkhipova via SG15
<sg15_at_[hidden]> wrote:
> 1.1 The build needs to have Module B’s and Module A’s BMIs to be able to compile the source using it.
> The build needs to either
> 1.1.1. find BMIs (assuming it can read them)
> or
> 1.1.2. somehow build the BMIs (assuming it cannot read them).
Correct. The "compatibility surface" for BMI files is radically
smaller than the "compatibility surface" of the language ABI and
runtime. It's very likely that a library originally built with gcc
will be consumed by a codebase compiled with clang, or that the
producer and consumer are on different versions of the same compiler.
None of those combinations produce compatible BMIs.
Having to create build rules for Module B and Module A is the
fundamental change introduced by modules.
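To make the dependency chain concrete, here is a minimal sketch of
the setup described in the thread (module names are from the thread;
the file names, and the choice of having A's interface import B, are
assumptions for illustration):

    // b.cppm -- interface unit for Module B
    export module B;
    export int b();

    // a.cppm -- interface unit for Module A; assuming A's interface
    // imports B, A's BMI can only be built after B's BMI exists
    export module A;
    import B;
    export int a();

    // c.cpp -- the consuming source; compiling it requires the BMIs
    // of A and (transitively) B to already be findable or buildable
    import A;
    import B;
    int main() { return a() + b(); }

Whether those BMIs end up as gcc .gcm files, clang .pcm files, or
MSVC .ifc files depends entirely on the consumer's toolchain, which
is why a library cannot, in general, ship a single set of BMIs.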
> 1.2. The build also needs to find and link static lib of Library B (which contains implementation of Module B), as well as static lib of Library A (which contains implementation of Module A).
Correct. However, this is not something that changes with the
introduction of modules. Build and package management systems that
already link libraries together today will be able to continue doing
what they are doing.
Modules don't change how that works.
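As a sketch of why linking is unaffected (file name hypothetical):
the implementation unit of a module compiles to an ordinary object
file that is archived into the static lib and resolved by the linker
exactly like any pre-modules translation unit.

    // b_impl.cpp -- implementation unit for Module B; defines the
    // b() declared in the interface above and produces a normal
    // object file that goes into Library B's static lib
    module B;
    int b() { return 42; }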
> Tools outside of the main build (IDE, static analysis, etc.) need to compile/parse C.
> Assumption: the main build of C succeeds on the same machine.
> For that they need to be able to rebuild Module A and Module B (the assumption is that they never can read BMIs).
Correct. In the pre-modules world, it is possible to perform static
analysis on a translation unit by replaying the compiler command
recorded for that translation unit (usually in
compile_commands.json). With the introduction of modules, the static
analysis pass somehow needs to reproduce the additional steps the
build system performed to produce the required BMIs.
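As an illustration of the tool's point of view for the consuming
translation unit (the clang-style commands in the comments are
hypothetical and will differ per compiler and per build):

    // c.cpp -- pre-modules, a tool could replay the one command
    // recorded for this TU, e.g.:
    //   clang++ -std=c++20 -c c.cpp
    // With modules, that command alone is no longer self-contained;
    // the tool must first reproduce the BMI-producing steps the
    // build system ran, e.g.:
    //   clang++ -std=c++20 --precompile b.cppm -o B.pcm
    //   clang++ -std=c++20 --precompile a.cppm \
    //       -fmodule-file=B=B.pcm -o A.pcm
    import A;  // unparseable without A's BMI (and, transitively, B's)
    int analyzed() { return a(); }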
> Does this sound right and are there any other scenarios/assumptions?
The main thing missing here is that when libraries are distributed as
prebuilt artifacts (using the various package management systems,
such as dpkg, rpm, vcpkg, conan, etc.), they ultimately need to be
representable as files on disk, and those files need to be
installable on a single system in a way that a build system unaware
of the package manager can consume them (e.g., CMake doesn't know how
boost got onto the system; it just knows how to use the files that
were made available).
But yes, I am moderately confident that most other scenarios are
indirectly covered by this scenario.
daniel