Date: Fri, 2 Jun 2023 16:55:52 -0400
In my environment, well-behaved builds use compilers and build-time
dependencies that have either been checked into source control or stored
in some zipped archive somewhere. These are copied / unpacked (but not
"installed") to the local machine in an early step. They are cached
between builds, but the machines are still reimaged fairly frequently
(once a week-ish).
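Concretely, the early copy / unpack step looks roughly like the sketch
below (the URL, cache directory, and archive names are invented for
illustration; the real step is driven by the build system, but the shape
is the same):

import hashlib
import pathlib
import shutil
import tarfile
import urllib.request

TOOL_CACHE = pathlib.Path.home() / ".toolcache"   # survives between builds
ARCHIVE_URL = "https://artifacts.example.com/toolchains/clang-16-win64.tar.gz"

def ensure_toolchain(url: str = ARCHIVE_URL) -> pathlib.Path:
    """Fetch and unpack a pinned toolchain archive, reusing a prior unpack."""
    key = hashlib.sha256(url.encode()).hexdigest()[:16]
    dest = TOOL_CACHE / key
    if dest.exists():                      # cache hit: nothing to transfer
        return dest
    TOOL_CACHE.mkdir(parents=True, exist_ok=True)
    archive = TOOL_CACHE / (key + ".tar.gz")
    with urllib.request.urlopen(url) as resp, open(archive, "wb") as out:
        shutil.copyfileobj(resp, out)      # the transfer users complain about
    with tarfile.open(archive) as tf:
        tf.extractall(dest)                # unpacked, never "installed"
    archive.unlink()
    return dest

if __name__ == "__main__":
    print("toolchain at", ensure_toolchain())
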
My users regularly complain about the transfer time for the compilers and
build-time dependencies, as well as the disk usage. That's probably more
the build system's fault than anything else, since it has a bad tendency
to pull down far more dependencies than it actually needs (Linux-only
dependencies during a Windows build, for example), but the complaints
exist nonetheless.
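To make that over-fetching concrete, here's a toy manifest shape (names
and layout are entirely made up) where each dependency is tagged with the
platforms that actually need it, so a Windows build could skip the
Linux-only entries instead of transferring them:

import sys

# Toy manifest: each dependency is tagged with the platforms that need it.
MANIFEST = [
    {"name": "clang-16-win64",   "platforms": {"windows"}},
    {"name": "clang-16-linux64", "platforms": {"linux"}},       # Linux-only
    {"name": "ninja-1.11",       "platforms": {"windows", "linux"}},
]

def deps_for(platform: str) -> list[str]:
    """Return only the dependencies the given target platform needs."""
    return [d["name"] for d in MANIFEST if platform in d["platforms"]]

if __name__ == "__main__":
    host = "windows" if sys.platform.startswith("win") else "linux"
    print(host, "build fetches:", deps_for(host))
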
On Fri, Jun 2, 2023 at 4:47 PM René Ferdinand Rivera Morell via SG15 <
sg15_at_[hidden]> wrote:
> On Fri, Jun 2, 2023 at 2:38 PM Tom Honermann <tom_at_[hidden]> wrote:
> >
> > On 6/2/23 2:52 PM, René Ferdinand Rivera Morell via SG15 wrote:
> > > I'll need to do some research on the possibility of having side data
> > > as a mechanism, i.e. go find out how it would work in the variety of
> > > build, packaging, and compiler tools that are available. But I'm
> > > wondering, since both Gaby and Olga mentioned it, if you can give
> > > examples of running such tools in size-constrained environments with
> > > multiple tools needing to communicate, as I'm not familiar with
> > > running compilers, build systems, and package managers outside of
> > > desktop environments.
> >
> > CI deployments are often configured such that development tools are
> > deployed into a fresh OS image as part of job processing. The smaller
> > the size of the tools, the lower the setup overhead, the faster the job
> > throughput.
>
> Interesting... The CI systems I work with are either in-house ones that
> build rather large Unreal Engine based projects, or cloud-based ones that
> build & test (by comparison) small projects like the Boost C++ Libraries.
> For the former, size & resources don't matter at the scale where adding
> a command line option to a tool would have an impact on anything, as
> everything is pre-installed (it's impractical to do otherwise) and some
> parts of it run on a local-network distributed compute structure. For
> the cloud-based ones, yes, most of the setup of the OS and tools is
> pre-imaged somehow. So I guess I'm trying to understand what you mean by
> "fresh OS image". Does that mean you install an empty OS, then freshly
> install tools, and then build/test? Something else?
>
>
> --
> -- René Ferdinand Rivera Morell
> -- Don't Assume Anything -- No Supone Nada
> -- Robot Dreams - http://robot-dreams.net
> _______________________________________________
> SG15 mailing list
> SG15_at_[hidden]
> https://lists.isocpp.org/mailman/listinfo.cgi/sg15
>