I think there’s a way to do the zip file that addresses those issues, Steve.
When producing a zip file, the compiler would run the real preprocessor, then gather all the module source files. The zip file would contain the primary source file at the root, then a directory per module, where each directory has the same name as its module. Each directory would then contain one source file per partition, with the file named after the module partition. Difficulty: you then lose the correct file-name information… maybe stick a #line directive at the top of each of these copied / renamed files.
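For illustration only, here is a rough sketch of that packaging step in Python. The layout (primary file at the archive root, one directory per module, one file per partition) and the #line trick are from the description above; the file names and the `src/<module>-<partition>.cpp` path convention are invented for the example.

```python
# Sketch: package a translation unit's module sources into a zip,
# following the proposed layout. NOT a real compiler feature.
import io
import zipfile


def pack_translation_unit(primary_name, primary_text, modules):
    """modules: {module_name: {partition_name: source_text}}"""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        # The primary source file lives at the root of the archive.
        zf.writestr(primary_name, primary_text)
        # One directory per module, one file per partition.
        for module, partitions in modules.items():
            for partition, text in partitions.items():
                member = f"{module}/{partition}.cpp"
                # The copied file no longer carries its real path, so
                # preserve file-name info with a #line directive.
                # (The original path here is a made-up convention.)
                original = f"src/{module}-{partition}.cpp"
                zf.writestr(member, f'#line 1 "{original}"\n{text}')
    return buf.getvalue()
```

Calling `pack_translation_unit("main.cpp", ..., {"m": {"part": ...}})` would yield an archive containing `main.cpp` at the root and `m/part.cpp`, the latter starting with a `#line` directive pointing at the original source.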
I don’t think you would transfer BMI artifacts this way. I think the main consumer of this would be text tools and implicit module generation, so you don’t need a build DAG.
I don’t feel that the compiler needs to be able to consume the contents of this zip file when they are not in a zip file (i.e. when they are laid out the same way on disk). It would be nice if it did, but I don’t think that’s a requirement.
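Since the archive is essentially a filesystem snapshot, the "same layout on disk" equivalence is trivial to realize: a tool that doesn't want to read zips directly can just mirror the archive onto disk and work from there. A minimal sketch, with no assumptions beyond the layout above:

```python
# Sketch: mirror the proposed zip archive onto disk, so a tool (or a
# compiler that only understands directories) sees the identical layout.
import io
import pathlib
import zipfile


def unpack_to_directory(zip_bytes, dest):
    """Extract the archive under dest, preserving its internal layout.
    Returns the relative paths of the extracted files, sorted."""
    dest = pathlib.Path(dest)
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        zf.extractall(dest)
    return sorted(p.relative_to(dest).as_posix()
                  for p in dest.rglob("*") if p.is_file())
```

The on-disk result is exactly the layout described earlier: the primary file at the top level, with one subdirectory per module containing its partition files.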
I’d be happy with either pragmas or a zip-like file format. I’m also still open to other approaches.
From: Modules <email@example.com> On Behalf Of
Sent: Friday, March 8, 2019 7:28 AM
Cc: WG21 Tooling Study Group SG15 <firstname.lastname@example.org>; Nathan Sidwell <email@example.com>
Subject: [EXTERNAL] Re: [isocpp-modules] [Tooling] Round2: Path to modules with old bad build systems
If we use a zip file, which is essentially a filesystem, the compiler must be able to consume it fairly trivially, and should also be able to consume something laid out in the same way on disk.
If this is a mechanism for transferring BMI artifacts, it also has security implications.
If it's transferring source interface units which need to be translated, it needs a build DAG.
On Fri, Mar 8, 2019, 02:28 Gabriel Dos Reis via Modules <firstname.lastname@example.org> wrote:
I much prefer the suggestion of a zip file containing the required files.
> On Mar 7, 2019, at 10:23 PM, Tom Honermann <email@example.com> wrote:
>> On 3/5/19 11:11 AM, Ben Craig wrote:
>> I will concede that static analysis tools and other tools that try to parse C++ probably don't need a textual inclusion format, since they most likely need to be able to parse and understand the pragmas anyway. If module mapping is sufficiently straightforward, then those tools can do module lookup the same as a compiler. Those tools already need to do include lookup in the same way that compilers do.
>> I think the textual inclusion format will still be very useful to distribution and caching tools though, as they don't need to understand the code. Those tools frequently lean on the compiler's preprocessor today, and don't know how to do include lookups.
> Another use case is reproducing issues encountered in the field. Static analysis tools like Coverity need to emulate other compilers. Today, when we fail to parse a TU that the emulated compiler accepts, we ask customers to send us preprocessed output for reproduction and analysis purposes. We ask for preprocessed output because it is much simpler to handle than the entire collection of included header files, which must then be arranged according to some specific compiler invocation and set of include paths. We need a solution for this that works for modules as well. Clang's -frewrite-imports option so far seems to do the job for us, and it uses #pragma directives in a manner similar to those described here. I strongly favor specifying a common set of #pragma directives for this purpose.
>>> -----Original Message-----
>>> From: Nathan Sidwell <firstname.lastname@example.org> On Behalf Of Nathan
>>> Sent: Tuesday, March 5, 2019 6:04 AM
>>> To: Ben Craig <email@example.com>; firstname.lastname@example.org; WG21 Tooling
>>> Study Group SG15 <email@example.com>
>>> Subject: [EXTERNAL] Re: [isocpp-modules] Round2: Path to modules with old
>>> bad build systems
>>>> On 3/4/19 10:02 AM, Ben Craig wrote:
>>>> I do mean textual inclusion, though I can be convinced otherwise. Textual
>>> inclusion (with extra generated pragmas) should make it much easier to keep
>>> tools like distcc and cppcheck happy in the short term. I suspect that those
>>> tools don't want to crack open a BMI to figure out which other BMIs need to
>>> be found.
>>>> Tools that (think they can) parse C++ will still need to understand these
>>> pragmas in order to provide the right macro, visibility, and reachability
>>> behaviors, so some work will still be required on their part, but at least they
>>> won't need to understand new binary formats.
>>> Correct, tools consuming such #pragma-marked flattened source will need to
>>> understand modules at a fundamental level. As such, why not implement
>>> the same mechanisms to find module source as the compiler?
>>> That'll give them more information to perform code analysis with.
>>>>>> On 3/2/19 1:03 PM, Ben Craig wrote:
>>>>>> Some quick notes on this implementation strategy:
>>>>>> * Uses TEXTUAL inclusion
>>>>>> * Compiler assumes that the build system knows nothing of BMIs
>>>>>> * Compiler needs to be able to do module mapping with minimal input
>>>>>> from users.
>>>>> Do you literally mean textual inclusion or do you really mean
>>>>> dynamically produce an internal-only BMI object?
>>>>> Nathan Sidwell
>>> Nathan Sidwell
>> Tooling mailing list
> Modules mailing list
> Subscription: http://lists.isocpp.org/mailman/listinfo.cgi/modules
> Link to this post: http://lists.isocpp.org/modules/2019/03/0218.php
Modules mailing list
Link to this post: http://lists.isocpp.org/modules/2019/03/0220.php