Date: Sun, 24 Sep 2023 15:03:31 -0700
On Sunday, 24 September 2023 13:57:38 PDT Chris Gary via Std-Proposals wrote:
> enum class Relation : int
> {
>     Undefined = -2,
>     Less      = -1,
>     Equal     =  0,
>     Greater   =  1
> };
>
> // bitCount<int> is presumably the bit width of int (CHAR_BIT * sizeof(int)).
> inline constexpr Relation ClampToValid( Relation r ) noexcept
> {
>     const auto u = (int)r + 1;                         // u is in [0, 2] iff r is Less/Equal/Greater
>     const auto v = 2 - u;                              // u or v is negative iff u is outside [0, 2]
>     const auto w = u | ((u|v) >> (bitCount<int>-1));   // branchless: w == u if in range, else -1
>     return (Relation)( w - 1 );                        // -1 - 1 == -2 == Undefined
> }
>
> // Relational operators that take "Relation" etc...
> Now, if <=> could just return `int` or anything that can round-trip to an
> `int` implicitly, *and* enum classes could expose implicit conversions,
> this could all be safely encapsulated without creating incorrect code that
> tries to compare a `Relation` value and not getting the magic `undefined`
> result - everything would get mapped into the set {-2,-1,0,1} before
> comparison. The "sign of difference" rule still applies everywhere, and
> comparisons are allowed to return any other random value to mean
> "undefined". In cases where the meaning of the return value means more than
> just "undefined", and the code still assumes the semantics of "Relation",
> it wasn't written to deal with the failure anyway (generally, just capture
> the original int and propagate an error/throw an exception).
I'm missing something.
How is this any different from std::strong_ordering?
The implementations do use the values -1, 0, 1, plus something else (2, -2,
-127, etc.) for the unordered state. They just happen to wrap them in a class
that converts to partial_ordering and weak_ordering.
So what's the problem with the current implementation?
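For illustration, a minimal C++20 sketch (Version is a made-up type; the
exact stored values are implementation details):

#include <compare>
#include <type_traits>

struct Version
{
    int a, b;

    // The defaulted <=> over two ints returns std::strong_ordering.
    auto operator<=>( const Version & ) const = default;
};

static_assert( (Version{1, 2} <=> Version{1, 3}) < 0 );   // "Less"
static_assert( (Version{1, 2} <=> Version{1, 2}) == 0 );  // "Equal"

// ...and it already converts to the weaker ordering categories:
static_assert( std::is_convertible_v<std::strong_ordering, std::weak_ordering> );
static_assert( std::is_convertible_v<std::strong_ordering, std::partial_ordering> );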
--
Thiago Macieira - thiago (AT) macieira.info - thiago (AT) kde.org
   Software Architect - Intel DCAI Cloud Engineering
Received on 2023-09-24 22:03:34