Date: Wed, 27 Sep 2023 02:59:56 -0600
It seems like it should have been a 4-way enum in the first place.
Then, just one kind of ordering. Really still the same as {x,-1,0,1},
{undefined,less,equal,greater}, or {fish,dog,apple,cat}.
I'm not partial to magic numbers, and I don't care what values they assume.
The set proposed is as simple as I could make it, and doesn't require a
transformation from signum.
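For concreteness, the four-outcome set could be spelled as a plain enum. The names and underlying values below are this sketch's choice, not proposal text:

```cpp
// Illustrative spelling of a single four-outcome ordering. less/equal/greater
// align with signum; "undefined" is the out-of-band reserved value.
enum class ordering : signed char {
    undefined = -2,
    less      = -1,
    equal     =  0,
    greater   =  1
};

// No transformation from signum is needed for the three defined outcomes:
static_assert( static_cast<int>( ordering::less )    == -1 );
static_assert( static_cast<int>( ordering::equal )   ==  0 );
static_assert( static_cast<int>( ordering::greater ) ==  1 );
```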
This proposal is partly a complaint about undue complexities brought about
by a zoo of presumptive return types and header (or module) coupling.
The rest of it should address the issue of needing a well-defined
comparison of heterogeneous containers, or something meaningful for a
collection.
In order to have a well-defined comparison of heterogeneous containers, a
single kind of ordering (outcome) is required.
Maybe transform orderings between pairs obtained by common_type<> into a
least_restrictive_ordering<>?
In cases where "partial_ordering::unordered" might have been used (relaxed
w.r.t. the set), the predicate could report "equal" and allow comparison to
iterate onward.
With an "undefined"-equipped ordering, the first result in a sequence that
is not "equal" is what determines how the two sets are related given that
sequence of examination.
Ultimately, a heterogeneous comparison must arrive at a _single_ outcome.
That outcome is going to be: "less" "equal" "greater" or "undefined".
"unordered" is somewhat opaque.
I understand it is there to mean "relaxed", like two strings of different
casing (or, more subtly, different combining marks), where equivalence is a
stronger relationship.
If those strings are being sorted, the predicate can harmlessly report
equal.
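A quick sketch of such a predicate (ASCII-only for brevity; the name is mine): strings differing only in case compare as equivalent, so a sort simply leaves them adjacent.

```cpp
#include <algorithm>
#include <cctype>
#include <string_view>

// Case-insensitive "less" for sorting: strings that differ only in ASCII
// case compare equivalent (neither is less than the other).
bool ci_less( std::string_view a, std::string_view b )
{
    return std::lexicographical_compare(
        a.begin(), a.end(), b.begin(), b.end(),
        []( unsigned char x, unsigned char y )
        { return std::tolower( x ) < std::tolower( y ); } );
}
```

With std::sort and ci_less, "Apple" and "apple" are treated as equal: neither is ever moved past the other for being "less".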
Checking for a different kind of equality would require a different
predicate, which might not respect the one used to organize the container.
This is where the responsibilities of that container have become conflated:
from storing a set of strings to also providing an interface for different,
but still similarly ordered, examination.
The single responsibility principle would hold that we have two different
sets of strings (or string_views) pointing to the same data, but sorted
differently.
The orderings within those two sets cannot be modeled by a single
predicate, whether that is given as <=> or a compare().
An example: What happens when all things in two sets are related as
"unordered"? The algorithm or container might not be able to do anything
with those two sets.
Their relationship is undefined. If it can cope with this, it may leave
them in place while sorting other elements.
If it is OK for two things to be left in place during a sort, the predicate
can report "equal", even if it means "well, I'm not really sure".
When it comes to the concept of an "undefined" value, like NaN, the
relationship w.r.t. other NaNs and other values is also "undefined", not
"unordered".
Silly as it sounds, if you have a collection of floats and you want to sort
them, put the NaNs at one end. Let NaN == NaN, and NaN > x for any other x.
This _has_ to be done in a separate predicate.
A single allowed kind of return from <=> or compare() is not sufficient for
this purpose, which is the contrary of what I gather it is claimed to
accomplish.
auto operator <=> ( const auto &other ) const
{ return this->value <=> other.value; }
is a single keyword away from
signed operator <=> ( const auto &other ) const
{ ... }
and getting 6 automatically synthesized operators without any dependencies.
I'd like to just leave the return type auto, or as previously suggested, let
the compiler assume partial_ordering.
Mapping any value not in the set onto the reserved "undefined" value does
away with ill-defined results.
On that note, allowing enum class to somehow define conversion operators
for a similar purpose would be nice.
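An enum class cannot declare conversion operators today; a thin wrapper struct is the usual approximation (all names here are illustrative, not proposed):

```cpp
// enum class itself cannot define conversion operators, but a wrapper can
// approximate the desired behavior. Names are this sketch's choice.
enum class ord : signed char { undefined = -2, less = -1, equal = 0, greater = 1 };

struct ordering_result {
    ord v;
    constexpr explicit operator int() const noexcept   // signum-style view
    { return static_cast<int>( v ); }
    constexpr explicit operator bool() const noexcept  // "is well-defined?"
    { return v != ord::undefined; }
};

static_assert( static_cast<int>( ordering_result{ ord::greater } ) == 1 );
static_assert( !static_cast<bool>( ordering_result{ ord::undefined } ) );
```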
> It will make that code non-interoperable with code that uses
> semantic-carrying return types. Or at the very least, such code would
> be required to assume the worst case (partial ordering). But even if
> this were adopted, I would suggest that every standard library type
> that interacts with a user-provided `T`'s comparison operator
> `static_assert` on that comparison operator, should it not return a
> semantic-carrying type.
You might really have to assume the worst regardless. The implied semantics
here might actually serve as the basis of a sort of "psychological set"
where you'd assume the rules were followed when they really weren't, leading
to a very long branch-and-prune search for why some constexpr thing isn't
building the way it should.
In synopsis:
SRP seems to require a predicate designed per container<T>; a single
<=> for T makes that easier for a simple predicate, but it is not
universally applicable. Run-time changes make statically typed semantics
hard to work with *correctly*. The assumptions a compiler makes when
handling comparisons of different types for optimization are metadata
already associated with those types, and it should be possible to handle
them transitively when dealing with operators synthesized from an arbitrary
<=>, regardless of the type it returns.
On Wed, Sep 27, 2023 at 1:12 AM Jason McKesson via Std-Proposals <
std-proposals_at_[hidden]> wrote:
> On Tue, Sep 26, 2023 at 9:40 PM Chris Gary via Std-Proposals
> <std-proposals_at_[hidden]> wrote:
> >
> > Its hard for me to say what inspired many of the responses I received,
> aside from my clicking "reply to" and having a good bit of unproductive
> discussion vanish into personal conversations with others. Yet, here, it is
> not having problems putting your email on the CC list, and the message
> board on the main...
> >
> > We have a set of 3 things in std::strong_ordering, and an implicit 4th
> that is "not defined" or "unspecified". "Unspecified" is not an error, its
> just UB, which we've all come to know and work with. Proving that an
> arbitrary operator <=>, return type notwithstanding, will never result in
> UB is ultimately undecidable (also obvious).
>
> `strong_ordering` only permits 3 possible values; it has no "not
> defined" or "unspecified" value. Only `partial_ordering` allows 4
> values, and `partial_ordering` is not convertible to
> `strong_ordering`.
>
> This is part of why `float`-based `operator<=>` will return
> `std::partial_ordering`.
>
> > Wherever a user-defined operator <=> suddenly produces an int, the
> compiler can instead insert code to check against any of {-1,0,1} depending
> on what was synthesized.
>
> And what about if you pass that `int` to some other function instead
> of using it locally? A function that is not inline and therefore there
> is no way for the compiler to know that the `int` value it receives
> magically carries semantic information distinct from other `int`
> values.
>
> So that's just not a viable solution.
>
> If you want to have values that carry special semantic information
> distinct from other things, the standard way to do that in C++ is to
> have those values be of a different type from other values. This
> allows that semantic information to be non-local, to carry to whomever
> gets the value no matter how they get that value.
>
> That's what types are for. Like, carrying semantic information beyond
> the bag of bits used to store it is one of the most fundamental
> reasons why strong typing *exists*.
>
> You *could* remove `float` from the language and just have special
> operations on `int` that treat it as an IEEE-754 BINARY32 floating
> point value. But like... why? Why would you even want that? Why would
> you want to throw away semantic information that is vital to
> understanding what the code is actually doing?
>
> Because you don't like including a header?
>
> Also, why do you insist on limiting it to the [-1, 1] range? If you
> don't want semantic information carried in the type, if all semantics
> is to be implicit based on its use, why not allow the user to return
> whatever they want?
>
> > No need for a header or privileged symbols in namespace std. Again, I
> proposed this assuming a good deal of other things were implied.
> >
> > The comment about "Hacking into Clang..." was meant for another thread,
> the discussion in which seemed to creep into this one. At least a mention
> of "polyfill" was made in one post I responded to, and that ended up here.
> >
> > This won't break or interact with modules as far as I can tell. In
> general code, obviously something like decltype( a <=> b ), but then
> wherever that is being done, it probably doesn't care about specific
> names...
> >
> > To every complaint about "ABI breakage" etc... It won't break any
> existing code to allow new code to define an int-valued <=>. Its just a
> function by another name. The behavior of the compiler can be made
> well-defined in either case.
>
> It will make that code non-interoperable with code that uses
> semantic-carrying return types. Or at the very least, such code would
> be required to assume the worst case (partial ordering). But even if
> this were adopted, I would suggest that every standard library type
> that interacts with a user-provided `T`'s comparison operator
> `static_assert` on that comparison operator, should it not return a
> semantic-carrying type.
>
> This at least would prevent people from accidentally using this
> "feature" in reasonable code.
>
> > The comment in the opening about "all knowledge being integers" etc...
> also obvious, was meant to set the tone to "this is always possible,
> reasoning to the contrary must now confront an obvious truth." Why? I feel
> the decision to impose semantics through a return type is bad design, and
> thought the reasoning for its adoption was due to a narrow scope of
> understanding.
>
> Well, I feel that opinion is categorically incorrect. And the rest of
> the language seems to agree with me more than yourself, since it does
> this kind of thing all the time. Again, `float` is technically
> 32-bits, just like `int`, so everything special about it is just
> "semantics". But it has a distinct type. `pair<float, float>` is
> conceptually equivalent to `std::complex<float>`, but they are
> different types. Why?
>
> Because that's *what types are for*. The entire point of having types
> exist as a concept is to impose semantics upon raw values.
> `unique_ptr<T>` is just a set of semantics imposed on a pointer.
> Should code not return `unique_ptr<T>`s either?
>
> The thing about semantic information is this: there is frequently
> little benefit to *not* having it in the type system. If you
> semantically need to delete a pointer at the end of scope, there is
> basically no upside to using a raw pointer instead of `unique_ptr`.
> But there are *downsides* to using the type that doesn't carry the
> semantic information. You have to maintain that semantic information
> in your own head. And the heads of the people reading the code. It
> makes code more brittle and less readable.
>
> So besides your own personal preference against including standard
> library headers, I do not see any benefit to avoiding the C++
> comparison types. And I don't find your value statement of "impose
> semantics through a return type is bad design" to be a viable
> argument.
> --
> Std-Proposals mailing list
> Std-Proposals_at_[hidden]
> https://lists.isocpp.org/mailman/listinfo.cgi/std-proposals
>
Received on 2023-09-27 09:00:11