
Re: [Modules] [P3057] Two finer-grained compilation models for named modules

From: Bret Brown <mail_at_[hidden]>
Date: Tue, 21 Nov 2023 22:30:18 -0500
So pardon me for a bit of a tangent with respect to the excellent thread
here. My conclusion is mostly to agree with Ben about the use cases he
brings up, but for reasons that leave me dissatisfied as someone who cares
about excellence in development experience.

To get down to it, Ben's concerns about diagnostic configuration tweaks
invalidating parses of module interfaces further convince me that
the best way to handle warnings is to just have the compiler emit a lot of
them in an unopinionated collection (like a SARIF file) and then let some
tool outside the compilation process itself (IDE, CI settings, a later
build system target, an engineer with a jq command) decide what to do about
them. Too much is lost in translation when turning high-level user direction
(e.g., "tell me interactively if I type any bugs into my editor") into the
relatively low context of a compilation command. In other words, we need
more precise tools closer to the user to really make a judgement call,
especially for warnings like deprecations, in which each instance of a `[[
deprecated ]]` is really its own issue to triage. Engineers need more
nuance than "fully ignore all deprecations", "all uses of deprecated
entities are errors", and "all uses of deprecated entities are warnings".
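To sketch what I mean (everything below is invented for illustration; a real
setup would emit SARIF and filter it with jq or similar, and the log format,
file names, and policy here are all hypothetical):

```shell
# The compiler step emits every diagnostic in an unopinionated stream;
# this log stands in for a SARIF file, and the line format is made up.
cat > diagnostics.log <<'EOF'
deprecation old_api.h:12 use of deprecated function 'legacy_open'
unused-variable main.cpp:40 unused variable 'tmp'
deprecation old_api.h:30 use of deprecated type 'legacy_buf'
EOF

# The policy lives outside the compilation command: e.g. "only the
# deprecations in old_api.h are actionable this sprint". A different
# consumer (IDE, CI) can apply a different policy to the same log.
grep '^deprecation old_api.h:' diagnostics.log
```

The point is that the triage step is cheap, late, and per-consumer, instead of
being baked into the compile line as a blanket `-Werror=deprecated-declarations`.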

If someone wants a paper or talk exploring this subject, let me know. I'll
say I'm unsure that there's anything normative to do about it other than
maybe inventing more portable ways to suppress diagnostics. But maybe
toolchain maintainers can consider themselves absolved of certain
responsibilities if we can build up enough common understanding about how
compilers aren't going to do a whole lot better than a 3/5 star job of
appropriately treating warnings as errors.

Of course, I know a lot of engineers will only let go of their `-Werror
-Wall` if we pry them from their cold, dead fingers, so from a social
engineering perspective, the use cases Ben points out are effectively
requirements.

Bret


On Tue, Nov 21, 2023, 21:57 Ben Boeckel via SG15 <sg15_at_[hidden]>
wrote:

> On Tue, Nov 21, 2023 at 16:44:51 +0800, Chuanqi Xu via SG15 wrote:
> > It is a major concern that incremental build of modules may build too
> > many files if the interface units or the included files get changed.
> > To solve/mitigate the issue, I presented two solutions. See
> > https://isocpp.org/files/papers/P3057R0.html for details.
> > The demo tools in the compiler (clang) side are presented. So the
> > build tools are able to do some simple experiments.
> > Feedback and concerns are highly appreciated.
>
> IIUC, the "Used Files" strategy isn't going to work for `make` or
> `ninja` because there's no way to "mask" edges of dependencies.
>
> ("a -> b" is "a is used by b" or "b depends on a")
>
> stdio.h -> pmi.cppm -> pmi.pch -> use.cpp
>
> We'd need a way to specify "`.pch` usage ignores transitive dependencies
> for direct consumers" which has no spelling in either tool's syntax.
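> To make that concrete (file names from the graph above; the rules and
> compiler flags are only a sketch), the natural `make` rules would be:

```make
# A change to stdio.h dirties pmi.pch, and a rebuilt pmi.pch gets a new
# mtime, which unconditionally dirties use.o. There is no syntax to say
# "rebuild use.o only if pmi.pch's *contents* changed".
pmi.pch: pmi.cppm stdio.h
	c++ -c pmi.cppm -fmodule-output=pmi.pch

use.o: use.cpp pmi.pch
	c++ -c use.cpp -fmodule-file=pmi.pch -o use.o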
>
> The "Hash of declarations" strategy is workable with a `restat = 1`
> feature though. Something like this:
>
> c++ -c pmi.cppm -fmodule-output=pmi.pch.tmp &&
> copy_if_needed pmi.pch.tmp pmi.pch
>
> where (in verbose, but defensive, Bourne Shell):
>
> copy_if_needed () {
>     # Grab parameters.
>     local in="$1"
>     readonly in
>     shift
>
>     local out="$1"
>     readonly out
>     shift
>
>     # Whether to sync the timestamp or not.
>     local preserve_mtime=false
>     # Whether to update or not.
>     local update=false
>
>     if ! [ -f "$out" ]; then
>         # No output? Update.
>         update=true
>     else
>         local declhashin
>         local declhashout
>
>         # Compute the declhash of each file.
>         declhashin="$( declhash "$in" )"
>         declhashout="$( declhash "$out" )"
>         readonly declhashin
>         readonly declhashout
>
>         if ! [ "$declhashin" = "$declhashout" ]; then
>             # Different declhash? Update.
>             update=true
>         elif ! cmp --quiet "$in" "$out"; then
>             # Same declhash, but the file is different; copy, but use
>             # the old timestamp.
>             update=true
>             preserve_mtime=true
>         fi
>     fi
>     readonly update
>     readonly preserve_mtime
>
>     if $preserve_mtime; then
>         # Use the previous timestamp. Successful builds will be
>         # skipped due to `restat = 1`. Failed builds will get new
>         # diagnostics as needed though.
>         touch -m --reference "$out" "$in"
>     fi
>     if $update; then
>         # Replace the file.
>         mv "$in" "$out"
>     else
>         # Remove the intermediate file.
>         rm "$in"
>     fi
> }
>
> would suffice for not rebuilding *successful* consumers while also
> ensuring that consumers that *have not yet succeeded* get updated
> diagnostics. This does mean that any diagnostic-affecting changes would
> need to be included in this declhash (e.g., `[[deprecated]]`
> attributes) as adding a deprecation should cause consumers to re-check
> their usage of any such declarations. I'm not sure what to do with
> consumers that have extant diagnostics… this would make it tough to
> iteratively fix warnings, as any implementation edit will leave the
> declhash unchanged and the consumer won't give a fresh set of diagnostic
> output with the changes included. However, is it possible to craft
> something where:
>
> - a diagnostic comes "from" a module that is triggered by some usage
> pattern in the consumer
> - the fix doesn't change the declhash
> - the importer no longer has the diagnostic (or has more diagnostics)
>
> It's late and I'm not well-versed in module details beyond how to build
> them enough to come up with something right now, but if we can come up
> with such a scenario, something even more sophisticated will be
> necessary than the above function.
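> For what it's worth, the `restat = 1` wiring for the above might look
> roughly like this in ninja syntax (rule names, flags, and the `cxx`
> rule are illustrative, and `copy_if_needed` is the function above):

```ninja
rule pcm
  command = c++ -c $in -fmodule-output=$out.tmp && copy_if_needed $out.tmp $out
  # Re-stat $out after the command runs; if copy_if_needed preserved the
  # old mtime, ninja prunes edges downstream of it from the dirty set.
  restat = 1

build pmi.pch: pcm pmi.cppm
build use.o: cxx use.cpp | pmi.pch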
>
> --Ben
> _______________________________________________
> SG15 mailing list
> SG15_at_[hidden]
> https://lists.isocpp.org/mailman/listinfo.cgi/sg15
>

Received on 2023-11-22 03:30:29