Date: Fri, 16 Mar 2018 10:06:25 -0500
On Fri, Mar 16, 2018 at 4:59 AM, Corentin <corentin.jabot_at_[hidden]> wrote:
> I will be a bit cynical for a bit.
>
Constructive cynicism is welcome ;-)
> There isn't anything in C++ as a language preventing us from having a good
> PDM.
> Of course, the preprocessor completely undermines that observation, but
> the preprocessor is not part of C++.
>
[cut]
> And that's why we may fail to come up with a good dependency manager.
>
I don't see the correlation between the preprocessor and the package
manager. Can you explain how the PP technically "undermines" the
implementation of a PDM?
> Not just because of the preprocessor, but because having a universal tool
> will require strong constraints on what a project can do in its build
> system.
>
That's clearly not true, as we can see with conan, vcpkg, spack, nix, and
more, which manage to deal with all the variations of build systems, albeit
in a brute-force manner. But that's why we are here: to discuss how to
standardize the interactions between PDMs and build systems. What
restrictions are you contemplating?
> We need to take away freedom.
>
> We need to limit the number of compiler flags a library can use and impose
> on others, and we need to limit the number of ways a given source file can
> be expressed.
>
Why?
> It's the only way to have a system of manageable complexity with a
> manageable number of unsolvable dependencies.
>
How? Current PDMs and build systems manage the complexity reasonably well
already. So I'm failing to see how you think it's unmanageable.
> And we will shy away from that.
>
I surmise you can only speak for yourself in that statement, as I
personally don't tend to shy away from complexity. In complexity I see
solutions.
> The only way to have a large number of libraries that one can include in
> their projects from the same place, with the same straightforward workflow,
> is to have those libraries play by the same rules.
>
Yes, we do need some level of common rules. But now we are getting into the
solutions ;-) I'm working on a proposal to specify such a common set of
rules, but I'll leave that for a subsequent post.
> Other tools in other languages impose those rules; they have limited
> abilities. Yet they are powerful enough that everybody uses those tools to
> great effect.
> But those tools were there from the start, so the code was made with
> these tools in mind.
>
C++ has rules. The entire standard is a somewhat comprehensive ruleset.
> There is something to be said for monolithic ecosystems.
>
True. A monolithic ecosystem makes it easier if you are starting from
nothing. But it also unnecessarily locks you in through the choices you
make to obtain that ease of implementation.
> From a user perspective, to install the package `foo` maybe they need to
> type a command or maybe the build system will do that for them.
> If the command does not exist on the system, we failed.
>
Thinking that they type a command is limiting ;-) They could alternatively
be choosing a menu item in their IDE that lets them search through a
database of packages with keywords and sort by popularity. Or they could
just use "#include <xyz.hpp>", or import a module, and the build system or
IDE will automatically search for a package that provides that dependency
and ask if it can install it. And I can think of many more variations on
that. Hence, there are many ways we *don't* fail.
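That auto-resolution idea can be sketched in a few lines: a tool scans a
source file for #include directives and matches them against a
header-to-package index. The index and package names below are invented
purely for illustration; a real tool would query whatever central package
database the ecosystem settles on.

```python
import re

# Invented header -> package index; an assumption for this sketch,
# not a real registry.
HEADER_INDEX = {
    "boost/filesystem.hpp": "boost-filesystem",
    "zlib.h": "zlib",
    "fmt/format.h": "fmt",
}

# Match both #include <...> and #include "..." forms.
INCLUDE_RE = re.compile(r'^\s*#\s*include\s*[<"]([^>"]+)[>"]')

def suggest_packages(source):
    """Return the set of known packages whose headers the source includes."""
    packages = set()
    for line in source.splitlines():
        match = INCLUDE_RE.match(line)
        if match and match.group(1) in HEADER_INDEX:
            packages.add(HEADER_INDEX[match.group(1)])
    return packages

source = '#include <fmt/format.h>\n#include "zlib.h"\nint main() {}\n'
print(sorted(suggest_packages(source)))  # ['fmt', 'zlib']
```

The tool would then prompt the user before fetching anything, which is one
of the "non-failure routes" mentioned below.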
> If the correct command is not identical to the one indicated in the
> documentation of the project, we failed. If that documentation is longer
> than exactly one line in most cases, we failed, if the build system does
> not support it, we failed.
> (imagine : vcpkg get bar/foo / xcode-cpp-fetch baz.com/bar:foo / iso-get
> --from baz.com --org bar --import=export --package foo_at_version:<auto
> last> )
>
Any move towards adding more non-failure routes is an improvement over the
current state of the ecosystem. And, yes, we will not be able to eliminate
failure. But C++ has never been about preventing you from shooting yourself
in the foot.
> And, if there is any way `foo` can resolve to different projects, we failed.
>
Cynical indeed ;-)
> Monolithic systems also exist because there is a need for "dependency foo"
> to have exactly one meaning.
>
> Doing a dependency manager is hard, but as you said, it's a solved problem.
> Doing one able to handle the existing code may not be. It's harder. Maybe
> not solvable. Is it worth trying? I don't know.
> Like Titus said, people need to learn what a well-behaved dependency is.
>
We also need to define what kinds of dependencies we are willing to manage.
> But if we decide to have an idealistic approach, will people adopt that new
> tool/protocol/standard? Nobody likes dealing with build systems, and
> whatever crappy CMake scripts they have today "work" - well, they think
> they do.
>
The hope is that: (a) vendors/maintainers would want to participate in the
common ecosystem, and (b) the cost of support is minimal and/or worth it.
And since this is something that users clearly want, I'm fairly certain
we can get that support.
> Rust, for example did not endure 30 years of complete wild west jungle
> mess before offering those tools, that's the difference.
> Likewise, I'm doubtful that anyone in those communities would be
> successful in offering an alternative solution to their idiomatic tools.
>
I don't know if other communities can't or won't. And I don't think it's
worth our time to fathom their future. We have our own future to worry
about.
> So, really C++ is not the major issue.
>
Oh good!
> The major issue is the number of C++ libraries out there, all with overly
> complex, non-portable build scripts, and the unwillingness of their authors
> to support something new or their lack of resources to do so.
>
Funny thing: it turns out conan is more popular than vcpkg, yet conan has
fewer packages available. I can't entirely say why that is, but I suspect
that one aspect is the quality of the libraries available in conan.
> And a package manager without libraries is a bit pointless, isn't it ?
> Chicken egg problem and all.
>
It's technically not possible for a package manager to exist without
libraries, as they are required for the implementation, even if only a
small number. After that it's a matter of iterative growth.
> (I also tend to think that the publication of a given project in any kind
> of public repository is best left to their author, rather than a third
> party packager as that weakens the trust in the dependencies)
>
Trust is earned. And it's possible for users to trust third-party
packaging, since we already have many examples of that level of trust.
--
-- Rene Rivera
-- Grafik - Don't Assume Anything
-- Robot Dreams - http://robot-dreams.net
Received on 2018-03-16 16:06:28