To add to that: we assume there are many more modules (translation units) than libraries, and as such manually specifying dependencies for each source file will be deemed a huge maintenance burden by a lot of people (one reason header-only libraries are popular). It would also make build scripts and build systems even more complicated than they already are, when we should probably strive to make them simpler.

To my knowledge, no other language requires dependencies to be specified explicitly at that level of granularity: artifacts depend on packages/libraries, not on individual files (i.e., modules).


On Thu, 31 Jan 2019 at 23:58 Ben Craig <ben.craig@ni.com> wrote:

“If my build system has already encoded all of my dependencies in a secondary layer (my build files), do I care about any of these concerns? “

Channeling Tom Honermann, but yes, your other tools care.  Your IDEs and tidying tools and anything else that needs to consume and understand your source without building it will care.  If you want types and values to have different colors in your IDE, then your IDE needs to be able to get from your file to the source file that declared those entities.

 

“I'm reasonably sure from past experience that a proper modular and parallelized build requires that secondary layer. (To say nothing of performance concerns: rescanning constantly is a waste of computation.)”

It may be the case that such concessions are necessary.  However, I will argue that a significant portion of C++ projects don’t currently specify in their build files that foo.cpp depends on foo.h, boost/spirit.h, and the other hundred headers that boost/spirit.h drags in.  I will go further and say that a significant portion of C++ projects rely on file globbing to even gather the list of .cpp files to compile.  We should at least be aware of this cost and consider it, before requiring everyone to duplicate the build information that already exists in their source files and file system.
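To make that duplication concrete, here is a minimal sketch (in Python; the regexes, the `scan_dependencies` helper, and the sample file contents are all invented for illustration) of the kind of direct-dependency information a tool can recover by scanning source text, which an explicit per-file dependency list in a build file would merely restate:

```python
import re

# Two common ways a C++ TU states a direct dependency:
#   #include "foo.h"  /  #include <boost/spirit.h>
#   import some.module;   (C++20 modules)
INCLUDE_RE = re.compile(r'^\s*#\s*include\s*[<"]([^>"]+)[>"]', re.MULTILINE)
IMPORT_RE = re.compile(r'^\s*import\s+([\w.:]+)\s*;', re.MULTILINE)

def scan_dependencies(source: str) -> list[str]:
    """Return the direct dependencies declared in one translation unit."""
    return INCLUDE_RE.findall(source) + IMPORT_RE.findall(source)

# Hypothetical TU mirroring the foo.cpp example above.
foo_cpp = '''
#include "foo.h"
#include <boost/spirit.h>
import my.module;

int main() {}
'''

print(scan_dependencies(foo_cpp))
# → ['foo.h', 'boost/spirit.h', 'my.module']
```

A real scanner must additionally handle preprocessor conditionals, macros, and header search paths, and must recurse through the transitive closure, which is exactly why rescanning on every build is costly and why build systems cache this information in a secondary layer.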

 

From: tooling-bounces@open-std.org <tooling-bounces@open-std.org> On Behalf Of Titus Winters
Sent: Thursday, January 31, 2019 1:28 PM
To: WG21 Tooling Study Group SG15 <tooling@open-std.org>
Subject: [Tooling] Modules

 

(Sorry for having to drop off, scheduling conflict.)

 

To clarify a little and ensure I understand properly:

 

If my build system has already encoded all of my dependencies in a secondary layer (my build files), do I care about any of these concerns? 

 

Or, put another way: is the current debate about "I want the build to be synthesized at compile-invocation time from source, without that secondary layer"? 

 

I'm reasonably sure from past experience that a proper modular and parallelized build requires that secondary layer. (To say nothing of performance concerns: rescanning constantly is a waste of computation.)

 

Or am I misunderstanding?

_______________________________________________
Tooling mailing list
Tooling@isocpp.open-std.org
http://www.open-std.org/mailman/listinfo/tooling