Even if compilers optimize the code out, all users will have to pay the cost of slower compilation, because the compiler still needs to instantiate the serialization code for both big and little endian.
Basically, with the current design this is an early pessimization.

You should focus on providing low-level primitives first.
Maybe something like a Codec concept that allows serializing primitive types (instead of an enum format member in the context concept): provide two implementations, BigEndianCodec and LittleEndianCodec, and allow users to provide custom ones.
With your current design I see no easy way to provide a custom serialization codec (e.g. Type-Length-Value).
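
To make that concrete, here is a rough sketch of the kind of Codec concept I have in mind. The signatures are illustrative only; a real design needs more thought (a TLV codec, for example, needs variable-length framing rather than a fixed 4-byte buffer):

#include <concepts>
#include <cstddef>
#include <cstdint>
#include <span>

// Illustrative only: a Codec knows how to turn fixed-width integers into
// raw bytes and back. Here it is checked for std::uint32_t only.
template <typename C>
concept Codec = requires(std::uint32_t value,
                         std::span<std::byte, 4> out,
                         std::span<const std::byte, 4> in)
{
    { C::encode(value, out) } -> std::same_as<void>;
    { C::decode(in) } -> std::same_as<std::uint32_t>;
};

struct BigEndianCodec
{
    static void encode(std::uint32_t value, std::span<std::byte, 4> out)
    {
        // Most significant byte first.
        for (std::size_t i = 0; i < 4; ++i)
            out[i] = std::byte((value >> (8 * (3 - i))) & 0xFFu);
    }

    static std::uint32_t decode(std::span<const std::byte, 4> in)
    {
        std::uint32_t value = 0;
        for (std::size_t i = 0; i < 4; ++i)
            value = (value << 8) | std::to_integer<std::uint32_t>(in[i]);
        return value;
    }
};

struct LittleEndianCodec
{
    static void encode(std::uint32_t value, std::span<std::byte, 4> out)
    {
        // Least significant byte first.
        for (std::size_t i = 0; i < 4; ++i)
            out[i] = std::byte((value >> (8 * i)) & 0xFFu);
    }

    static std::uint32_t decode(std::span<const std::byte, 4> in)
    {
        std::uint32_t value = 0;
        for (std::size_t i = 0; i < 4; ++i)
            value |= std::to_integer<std::uint32_t>(in[i]) << (8 * i);
        return value;
    }
};

static_assert(Codec<BigEndianCodec> && Codec<LittleEndianCodec>);

A user who wants a different wire format then writes another type satisfying Codec, instead of being limited to an endianness enum in the context.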

On Fri, 6 Mar 2020 at 12:04, Lyberta via Std-Proposals <std-proposals@lists.isocpp.org> wrote:
Maciej Cencora:
> Exactly as I said, endianness in your example is a runtime property, while
> in most scenarios it is actually known at compile time.
> It should be something like:
> std::io::default_big_endian_context ctx{stream};
> or
> std::io_default_context<std::endian::big> ctx{stream};
>

I'm pretty sure we can avoid an explicit NTTP because:

1. Serialization functions should be templates by default.
2. Once compilers implement `std::bit_cast`, everything except system calls in file IO should become constexpr.

I think that templates + constexpr are enough for the compiler to see that
the developer has hard-coded the value of endianness and to remove dead branches.
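
For reference, I read this as something along the following lines (names are hypothetical, not the actual proposed API). The endianness is an ordinary runtime member of the context, but because write() is a template and fully visible to the optimizer, a hard-coded value is constant-propagated and the dead branch disappears:

#include <bit>
#include <cstddef>
#include <cstdint>
#include <vector>

struct memory_context
{
    std::vector<std::byte>& out;
    std::endian endianness = std::endian::native; // a runtime property on paper
};

template <typename Context> // serialization functions are templates by default
void write(Context& ctx, std::uint32_t value)
{
    for (std::size_t i = 0; i < 4; ++i)
    {
        if (ctx.endianness == std::endian::big)
            ctx.out.push_back(std::byte(value >> (8 * (3 - i))));
        else
            ctx.out.push_back(std::byte(value >> (8 * i)));
    }
}

int main()
{
    std::vector<std::byte> buffer;
    memory_context ctx{buffer, std::endian::big}; // endianness hard-coded by the developer
    write(ctx, 0xDEADBEEFu);                      // optimizer can fold the branch to the big-endian arm
}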

--
Std-Proposals mailing list
Std-Proposals@lists.isocpp.org
https://lists.isocpp.org/mailman/listinfo.cgi/std-proposals