Date: Mon, 1 Apr 2019 09:43:10 +0100
>> 1. It would solve the shared library problem for C and C++, without
>> using broken proprietary shared libraries to do it.
>
> I don't follow this explanation at all. Earlier you said that a
> program maps into memory some binary Modules.
Correct. Such a process bootstrapper basically traverses the DAG
generated by the binary linker, memory mapping each Module from the
system store of Modules into the process. That would be the bit we
specify in the standard. Implementations can, and of course would, do
the same thing much more efficiently.
> Binary modules are object files. Mapping object files into memory is
> exactly what my dynloader does today.
Correct.
> If you want to create some cloudy repository that "everybody" uses for
> their builds, you can do it with shared libraries.
> Many people already do. I don't see what "Modules need to become
> objects" has to do with any of that. If you expect
> programs to reach into a cloudy repository to magically update their
> "binary modules" somehow differently from
> updating their shared libraries, I don't see why we should expect that
> to happen any more than it happens with
> shared libraries.
>
> Would you like to try that explanation again?
Perhaps the disconnect is in not understanding the difference between a
"shared binary Module" and a "shared ELF library"?
The former is a partially compiled and partially optimised AST with an
external interface specified in something very similar to whatever is
the latest version of IPR (Gaby's
https://github.com/GabrielDosReis/ipr). So the parts which can be
affected by the outside remain in AST form, whereas the parts which are
invariant to the outside get compiled into assembly and their AST
representation stripped.
The latter is a fully optimised, written-in-stone ELF binary with an
external interface specified as an array of variable-length char strings.
The former are emitted early in the current compilation process. The
compiler goes as far as it can, and stops.
The latter are emitted at the very end of the current compilation
process, and ELF shared objects are not easily amenable to subsequent
analysis, transformation and optimisation.
I appreciate that the former doesn't exist. But as I mentioned, this is
a personal dream.
Niall
Received on 2019-04-01 10:43:29