
Re: [Tooling] [ Modules and Tools ] Tracking Random Dependency Information

From: Manuel Klimek <klimek_at_[hidden]>
Date: Wed, 2 Jan 2019 14:40:56 +0100
On Wed, Dec 19, 2018 at 10:30 AM Colby Pike <vectorofbool_at_[hidden]> wrote:

> JeanHeyd:
>
> > Neither of these performs full semantic analysis to help get the build
> > system going (they do preprocessing + a few other steps, but never
> > semantic analysis), so I'm curious where I would find this information?
>
> Ninja parses the output from /showIncludes and -MMD, so it is using the
> preprocessor flags to do it, but that's not really the point: A full
> compilation is executed for the first pass before this information is
> available, and a byproduct of that compilation is the emission of the
> "affects-the-output" dependency information. Ninja doesn't have a separate
> "get the dependencies" stage that it runs before compilation. Even if there
> were an --emit-dep-info option to the compiler that performed full semantic
> analysis to get deep dependency information, this model would be fully
> supported.
>
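For concreteness, a Ninja build edge using this model looks roughly like the
sketch below; the compiler, flags, rule names and paths are illustrative
rather than taken from any real project. The depfile is written as a
by-product of the compile and is only read back by Ninja on the next build:

  # GCC/Clang flavor: the compile itself writes a Makefile-style depfile
  # via -MMD/-MF; Ninja parses and stores it after the command finishes.
  rule cxx
    command = g++ -MMD -MF $out.d -c $in -o $out
    depfile = $out.d
    deps = gcc

  # MSVC flavor: Ninja parses the /showIncludes output of the compile.
  rule cxx_msvc
    command = cl /nologo /showIncludes /c $in /Fo$out
    deps = msvc

  build obj/main.o: cxx src/main.cpp
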
> Boris:
>
> > It can't handle auto-generated headers.
>
> This is fully supported with order-only dependencies. This is not even a
> "pre-build" step; it purely affects the ordering when enqueueing edges for
> execution as part of the regular build.
>
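In Ninja terms, "order-only" means whatever follows the "||" on a build edge.
A rough sketch, reusing the cxx rule from the snippet above; the generator
rule and paths are invented for illustration:

  # Hypothetical generator rule; the script name is made up.
  rule gen_version
    command = ./make-version-header.sh $out
  build gen/version.hpp: gen_version

  # "||" marks order-only inputs: gen/version.hpp is produced before the
  # first compile of main.cpp is scheduled, but afterwards it only dirties
  # main.o through the depfile recorded by the previous compile.
  build obj/main.o: cxx src/main.cpp || gen/version.hpp
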
> > It doesn't fit well with implementing support for distributed
> > compilation or ignorable change detection.
>
> It works perfectly fine. For distributed build, you hand the TU to the
> distributed tool and it will spit back the dependency information. For
> ignorable changes, nothing about a just-in-time dependency info model
> requires timestamps be used as the determiner of "out-of-date"-ness.
>
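One small, adjacent data point, for whatever it is worth: Ninja's restat
attribute already decouples "the command ran" from "downstream edges are
dirty". It only covers unchanged outputs, not semantically ignorable edits
to inputs, but it shows the scheduling model is not welded to raw
timestamps. A sketch, with the same invented generator as above:

  # restat = 1: after the command runs, Ninja re-stats the outputs; if the
  # script left gen/version.hpp untouched because its content would not
  # have changed, edges that depend on it stay up to date.
  rule gen_version
    command = ./make-version-header.sh $out
    restat = 1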

If you want to sandbox execution so that an action is only executed with
the exact set of inputs it needs (for example, in order to send the minimum
number of inputs to a remote execution node), this seems like a
chicken-and-egg problem.


> > It won't work for C++ modules, unless you are prepared to go with the
> > "compiler calls back into the build system" approach
>
> > I would be interested to learn how this will work
>
> I've been formulating what this will look like for a while. I have several
> ideas in mind, but going into them will send the thread pretty far off
> track (if it hasn't gone too far already).
>
> On Wed, Dec 19, 2018 at 1:58 AM Boris Kolpackov <boris_at_[hidden]>
> wrote:
>
>> Colby Pike <vectorofbool_at_[hidden]> writes:
>>
>> > Modern build tools such as Ninja grab the dependency information for a
>> > translation unit as part of invoking the compiler command, not by
>> > running the preprocessor separately. I'm preparing a post on how Ninja
>> > builds and performs dependency analysis, which I believe is the
>> > currently optimal way to do it.
>>
>> This "header dependency information as a by-product of compilation"
>> approach has a number of problems:
>>
>> 1. It can't handle auto-generated headers.
>>
>> 2. It doesn't fit well with implementing support for distributed
>> compilation or ignorable change detection.
>>
>> 3. It won't work for C++ modules, unless you are prepared to go with the
>> "compiler calls back into the build system" approach.
>>
>> So I believe next-generation C++ build systems will instead use a
>> different compilation model (and build2 already does). This fragment
>> from my CppCon talk has the details:
>>
>> https://youtu.be/cJP7SSLjvSI?t=2332
>>
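For anyone who does not want to scrub through the video: the model described
there runs dependency extraction as an explicit step driven by the build
system, rather than as a by-product of compilation. Very roughly, in
Ninja-like syntax (this is not build2's actual mechanism, and the compiler
flags are only illustrative; how the compile edge then consumes the extracted
list is exactly where the two approaches diverge):

  # An explicit scan step per TU, scheduled by the build system itself.
  rule scan
    command = g++ -MM -MF $out $in
  build obj/main.d: scan src/main.cpp

  # The compile is then expected to be handed its input set up front,
  # instead of discovering it as a side effect; wiring obj/main.d into
  # the compile edge is the part plain Ninja does not do for you.
  build obj/main.o: cxx src/main.cpp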
>>
>> > There may be some tweaks with the addition of modules, but it will still
>> > be applicable.
>>
>> I would be interested to learn how this will work.

Received on 2019-01-02 14:41:11