Date: Sun, 15 Dec 2019 12:34:15 -0500
On Sun, Dec 15, 2019 at 8:04 AM Martin Beeger <martin.beeger_at_[hidden]>
wrote:
> On 15.12.19 at 03:36, Arthur O'Dwyer wrote:
>
> On Sat, Dec 14, 2019 at 7:44 PM Martin Beeger via SG20 <
> sg20_at_[hidden]> wrote:
>
>> Concepts will IMHO change how we think about template arguments and
>> template functions radically, and will give us the power to express
>> semantic contracts much more clearly in definitions.
>
>
> Well, IMHO the opposite. :) Ever since Stepanov in the mid-1990s, C++
> programmers have known the general outlines of "generic programming": that
> in C++ it's based on templates, that templates impose syntactic and
> semantic requirements on their type parameters, and that a bundle of type
> requirements can be called a "concept."
> C++11 allows us to write "concepts" fairly conveniently in the form of
> type-traits.
> Orthogonally, C++98 allows us to write function templates that SFINAE away
> for types that don't match their preferred concept (in the form of
> `enable_if`).
>
> I had a really good time reading "From Mathematics to Generic Programming"
> by Stepanov last year, which then motivated me to read most of "Elements of
> Programming" and although I somewhat knew about generic programming and
> Iterator concepts before, it really changed the way I think about writing
> templates. Armed with that knowledge I tried to improve code quality by
> using enable_if as a concepts replacement. I first focused on some key
> places where the most lifting of errors to compile time could be done. The
> result was that it indeed catches errors, but compile time explodes.
>
[Before I say anything else: I think we've solved your actual problem re:
`auto`, and now we're just trading long emails with relatively low stakes.
Which is fine and dandy. :)]
Agreed. Using `concept` and `requires` syntax should *help* with the
compile time explosion, but only by a linear factor. Doing SFINAE will
always be more expensive than not-doing-SFINAE, and "template code bloat"
will always be a thing.
> It basically left us with the unfortunate choice of enable_if'ing our code
> or compiling our code, but not both. I expect to be better off when Modules
> arrive, but until then, that is the reality. And getting coworkers to think
> in terms of concepts when they are not at all enforced and completely
> invisible is hard.
>
In C++11, concepts don't have to be "completely invisible"; you can write
    template<class FwdIt,
             std::enable_if_t<is_forward_iterator_v<FwdIt>, int> = 0>
    void foo(FwdIt first, FwdIt last) { ... }
which makes the "enable if" a.k.a. "requires" condition pretty clear and
also makes the name of the concept "forward_iterator" pretty clear. Not *as*
clear as C++2a Concepts syntax, but not what I'd call "completely
invisible." And definitely not "not at all enforced"! The whole point of
code is that if we express it in code then we are enforcing it, by
definition. "Enforcement" is the only thing that distinguishes *code* from
*documentation*. :)
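For the record, here's a minimal sketch of what the same constraint might
look like in C++2a syntax, assuming a user-defined forward_iterator concept
built from the same is_forward_iterator_v trait as above (neither one is a
standard-library entity in this sketch):

    template<class T>
    concept forward_iterator = is_forward_iterator_v<T>;

    template<forward_iterator FwdIt>
    void foo(FwdIt first, FwdIt last) { ... }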
Your problems pre-C++2a seem to be:
- in C++11, concepts are *optional*; it's too easy to write an
unconstrained template
- in C++11, concepts are *expensive*; the compile-time cost is prohibitive
And then your concern about C++2a Concepts seems to be:
- in C++2a, concepts are *optional*; it remains easy to write an
unconstrained template
to which I'm adding:
- in C++2a, concepts are *expensive*; the compile-time cost is non-zero
C++2a is certainly a change in degree (degree of syntactic convenience,
degree of compile-time cost), but it's not a fundamental change in kind,
and therefore we shouldn't look for C++2a to trigger a quantum leap in how
and why we write generic code. Templates will still be templates and SFINAE
will still be SFINAE.
> Interestingly, Stepanov's book led me, even before concepts arrived, to use
> templates less and localize them, by defining (or including from Abseil)
> semantic types and programming against valid semantic non-template types like
> duration instead of having templates and overload sets for different
> duration types (double seconds, int64_t ticks etc.).
>
Hear hear. I haven't played around too much with Abseil's `Duration` type,
but it certainly seems to match human intuition better than what ended up
as <chrono> in C++11. For anyone in this thread who hasn't seen it, I
highly recommend Hyrum Wright's CppCon 2019 talk on migrating to
`Duration`: "Time Travel: Applying Gradual Typing to Time Types with
Clang's LibTooling" <https://www.youtube.com/watch?v=VN0VNoykxtk>
> If you have templates in your codebase today that are not SFINAE-friendly —
> that are not std::enable_if'ed on some type-trait that describes their
> syntactic requirements — then you should ask yourself, "Why?" Why would a
> C++11 programmer occasionally write an "unconstrained" template, even as he
> knows that it must impose some kind of requirement and that the requirement
> must be expressible in C++?
>
> As I said above: compile time. enable_if in about 20 core places doubles
> my compile time. Type-traits and enable_if template metaprogramming cost a
> lot.
>
Well, there you go. :) If you have access to a compiler that supports
C++2a Concepts syntax, I'd love to hear some data points:
- Compile time with no SFINAE on any of those 20 places, with your current
production compiler
- Compile time with enable_if on all of those 20 places, with your current
production compiler
- Compile time with no SFINAE on any of those 20 places, with the
concepts-enabled compiler
- Compile time with enable_if on all of those 20 places, with the
concepts-enabled compiler
- Compile time with `requires` (and real `concept`s) on all of those 20
places, with the concepts-enabled compiler
My expectation is that #5 would be faster than #4, but that #3 would be
fastest of all (and incidentally that #4 might be faster than #2).
It would be great if you could tell me if my expectation was correct or
incorrect. :)
> Off the top of my head:
> (1) The constraint is so complicated that it's not worth the effort to
> specify. For example, imagine the variadic version of this template:
> template<class F, class T1, class T2, class T3>
> bool all_permutations(F f, T1 t, T2 u, T3 v) {
> return f(t,u,v) && f(t,v,u) && f(u,t,v) && f(u,v,t) && f(v,t,u) &&
> f(v,u,t);
> }
> It's probably possible to work out the C++ equivalent of "F is callable
> with all permutations of Ts...", but the cost/benefit analysis is against
> it.
>
> This smells to me. You are doing something complicated, but general. So
> try to find an implementation elsewhere.
>
That sounds like a "truly generic" counterargument. ;) Why am I writing a
C++ program at all, when I could just find a program someone else had
written and use that program instead?
I admit I made up this example; I don't know of any real-world use-case for
"call F with all permutations of Ts...", let alone SFINAEing on whether F
is callable with all permutations of `Ts...`. But can we agree that in this
contrived example it is much, much easier to "document the syntactic
requirement in English" than to "express the syntactic requirement in C++,"
and that C++2a Concepts don't alter that fact?
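Just to make the cost/benefit point concrete: even the fixed three-argument
case is already a mouthful. Here's one sketch of how I might spell it with
std::is_invocable_r_v (my guess at one possible spelling, not something I'm
claiming anyone should actually write):

    // assuming #include <type_traits>
    template<class F, class T1, class T2, class T3>
      requires std::is_invocable_r_v<bool, F&, T1&, T2&, T3&> &&
               std::is_invocable_r_v<bool, F&, T1&, T3&, T2&> &&
               std::is_invocable_r_v<bool, F&, T2&, T1&, T3&> &&
               std::is_invocable_r_v<bool, F&, T2&, T3&, T1&> &&
               std::is_invocable_r_v<bool, F&, T3&, T1&, T2&> &&
               std::is_invocable_r_v<bool, F&, T3&, T2&, T1&>
    bool all_permutations(F f, T1 t, T2 u, T3 v);

The variadic generalization of that constraint is where the cost/benefit
analysis really falls over.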
I'm 90% confident that what you recall from Stepanov was about
swappability/permutability of sequences of values, and has no bearing at
all on what I'm doing here with `f`.
> (2) The function does not need to mess with overload resolution, and it is
> so trivial that its type-requirement is no more or less than that its body
> compiles. For example:
> template<class T, class U> bool equals(T t, U u) { return t == u; }
>
> This function is just a wrapper. You rename equality and inherit all the
> requirements. C++ has no way to inherit/lift concepts, so yeah, wrapper
> functions are a good answer to my questions. Your function requires
> copyability on top, but that can be fixed. There is a class of
> "semantically inactive" functions, like wrapper functions, which themselves
> do not have concept requirements. I did not see that; thank you.
>
Pedantic nits:
- "C++ has no way to inherit/lift concepts" — Yeah, I commiserate. I wish
C++2a Concepts had thought about how to fit Concepts in with the rest of
the language (perfect forwarding, member concepts, template concept
parameters...). I'm aware of blog posts by Barry Revzin and by myself on
the subject. (The counterargument is that this is Concepts "Lite". It is
deliberately scoped to be a sugary replacement for enable_if and not much
more than that. Not *exactly* that, but not *much* more.)
- "Your function requires copyability on top" — Nope. `equals` never
attempts to copy a T. It is a *value sink* for T objects — if you move a
unique_ptr into it, then you're giving up that unique_ptr for good — but it
would be incorrect and over-constraining to claim that it "requires"
copyability. Remember my std::copy example! (On the other hand, `equals`
does require *destructibility*.)
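To make the "value sink" point concrete, here's a tiny usage sketch of my
own (not code from anyone's codebase): `equals` accepts a move-only type
just fine, because it never copies.

    #include <memory>

    template<class T, class U> bool equals(T t, U u) { return t == u; }

    bool demo() {
        auto p = std::make_unique<int>(42);
        return equals(std::move(p), nullptr);  // moved in, never copied
    }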
> (3) The function does not need to mess with overload resolution, and for
> usability's sake, it prefers to static_assert its type-requirements rather
> than SFINAE away. For example:
> template<class Socket> void read_from_socket(Socket s) {
> static_assert(is_socket_v<Socket>); ... }
>
> For such a function, the definition and declaration would usually be
> separated (as it is not a one-liner), and if I program against it by
> reading its declaration, I will use it wrong.
>
That's a valid point: by putting concept constraints into the template
declaration, you help the set of people who both (1) treat the declaration
as documentation and (2) do not treat the definition as documentation.
Which is definitely a non-zero set of people. But you might be able to tell
from my phrasing that I don't think those people are entirely in the right.
;)
In this case, the function's name indicates its intended use:
"read_from_socket(Socket s)". If the client programmer tries to pass
something that's not `is_socket`, then they deserve what they get! And
then, repeating something I said on #2 (but snipped from this reply), it's
debatable whether the client programmer will be helped more by an error
message pointing directly at the static_assert, or an error message
pointing at the call site and claiming that "no viable overload was found."
(In the latter case there will be one or more "notes" below the error,
pointing at the non-viable template declaration and explaining the problem
in more detail.)
The API-design question is: As the writer of the read_from_socket
component, am I confident that the caller intends to call my implementation
and is using it wrong? In that case, static_assert. Or is it plausible
that the caller intends to call some *other* implementation of
read_from_socket and this argument-list was not intended for me at all? In
that case, I need to get out of the way — SFINAE away via `enable_if` or
`requires`.
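Concretely, the two alternatives might look like this sketch (where
is_socket_v is still the hypothetical trait from the example above):

    // Alternative 1: hard error. "I'm sure the caller meant me."
    template<class Socket>
    void read_from_socket(Socket s) {
        static_assert(is_socket_v<Socket>,
                      "read_from_socket requires a socket type");
        // ...
    }

    // Alternative 2: constraint. "If this isn't for me, I get out of the way."
    template<class Socket> requires is_socket_v<Socket>
    void read_from_socket(Socket s) {
        // ...
    }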
The API-design tradeoff between "hard error" and "SFINAE" is analogous to
the API-design tradeoff between "non-virtual function" and "virtual
function." In both cases, the first option is the implementor saying "This
is how you do the task. Period." The second option permits the implementor
to say "I suggest doing the task this way; but I suspect that my coworker
might know an even better way to do it." I haven't fully thought out the
corresponding parts of this analogy; particularly, with `virtual`, your
coworker is writing the "child" class, but with SFINAE it seems like your
coworker might equally or more likely be writing a "parent" (more general)
implementation?
> (4) The compile-time cost of SFINAE is so large that the codebase has
> guidelines suggesting "no unnecessary SFINAE." Notice that many templates
> are used as implementation details. For example:
> template<class Iter> void sort_helper(Iter a, Iter b,
> std::random_access_iterator_tag) { ... }
> template<class Iter> requires is_swappable_v<iterator_value_t<Iter>>
> void sort(Iter a, Iter b) { sort_helper(a, b, iterator_category_t<Iter>{});
> }
> I think most template programmers would agree that it would be
> counterproductive to repeat the same constraints on `sort_helper` that were
> already applied on `sort`. For one thing, it violates the Don't Repeat
> Yourself principle; but more importantly, it costs compile time and brings
> no benefit in return.
>
> This is again a case of not repeating concepts in semantically inactive
> wrapper functions, I get that. This specific wrapper function should no
> longer be needed with C++20, am I right?
>
The answer is always "it depends," but yeah, you can use C++2a subsumption
as a substitute for tag dispatch if you want.
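For concreteness, here's a subsumption-based sketch (the concept names are
invented for this example, and it assumes <iterator> and <type_traits>):

    template<class It>
    concept swappable_iterator =
        std::is_swappable_v<typename std::iterator_traits<It>::value_type>;

    template<class It>
    concept random_access_swappable_iterator =
        swappable_iterator<It> &&
        std::is_base_of_v<std::random_access_iterator_tag,
                          typename std::iterator_traits<It>::iterator_category>;

    template<swappable_iterator It>
    void sort(It a, It b) { /* generic fallback */ }

    template<random_access_swappable_iterator It>
    void sort(It a, It b) { /* fast path; its constraint subsumes the fallback's */ }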
Me personally, I would retain the tag dispatch. I think it's usually
important to have a single entry point for the caller (`sort` in this
example) and then hide the "dispatch" behind that single entry point,
rather than permit a thicket of entry points that the caller has to puzzle
through.
Again, there is an analogy you could draw to classical virtual functions.
"Single entry point, dispatch on the back-end" is the Non-Virtual Interface
<http://www.gotw.ca/publications/mill18.htm> idiom. "Many entry points,
dispatch at the call site" is the classical, Java-style public-virtual
idiom.
> [...] A significantly large codebase (where "significantly" is up for
> debate) can only afford a limited number of template function
> instantiations. Otherwise its compile time will blow up anyway. I would
> rather afford a few fewer template functions with well-defined named
> concepts than a larger number of template functions without them.
> Hopefully modules can alleviate some of this tension.
>
IIUC, your codebase has functions like
    A() B() C()  - unconstrained function templates
    D() E() F()  - constrained templates
    G() H()      - plain old functions
and you are saying that it's conceivable to turn some templates back into
plain old functions, in order to "afford" new constraints on a few more
templates?
    A()              - unconstrained function template
    C() D() E() F()  - constrained templates
    B() G() H()      - plain old functions
If you are willing to eat the runtime cost (if any) of turning B() into a
plain old function, then why haven't you gone ahead and made B() a plain
old function already, simply to improve your compile times?
    A() C()          - unconstrained function templates
    D() E() F()      - constrained templates
    B() G() H()      - plain old functions
I'm probably oversimplifying; I'll accept "the real code isn't that simple"
as sufficient explanation. ;) But it might be food for thought. As you said
above, absl::Duration gets a lot of benefit from its non-template design.
Modules might help. In my last email, I said "good luck getting the
compiler to tell which templates are part of your public API and which are
implementation details"; but it's possible that Modules might help with
*that*, by enabling heuristics like "any template that's exported from a
module is 'public' and any non-exported template is 'private'."
Off the top of my head, I would not expect Modules to help with the cost of
template instantiation/codegen, and I would not expect Modules to help very
much with the cost of overload resolution/SFINAE. Those costs are incurred
after parsing, so they should be pretty much the same no matter whether the
code originally came from a header file or a module file.
Modules *might* be able to improve overload resolution times by *reducing
the number of overloads considered*; I don't know if "smaller overload
sets" is supposed to be a benefit of Modules or not. OTOH, in this thread
we're talking about constraining templates even where the size of the
overload set is already 1. Adding a constraint to a template adds work for
the compiler to do during overload resolution, period.
>> [...] The next
>> candidate was the identity function, but for the identity function to be
>> correct, its result must be identical to the input, which only has valid
>> meaning if the input and the output are actually equality-comparable,
>> which is a valid requires clause.
>>
>
> Here I absolutely disagree. The `identity` function
> template<class T> T identity(T t) { return t; }
> requires that T be move-constructible and destructible, but absolutely
> does not require that T be equality-comparable. For example, `identity`
> remains a useful operation even when T is `std::function<void()>`.
> `std::function<void()>` is not, and cannot ever be made,
> equality-comparable.
>
> The type-requirements of a generic algorithm are driven by what the
> algorithm *actually does*. It requires what it *does* to be well-formed.
> My kneejerk impression is that no generic algorithm should ever "require"
> some capability of its type parameter if its implementation does not depend
> on that capability.
>
> But for a non-comparable type, template<class T> T identity(T t) { return
> T{}; } is often a faster implementation. It may not be what the programmer
> expected, but how do you write a test?
>
That's a faster implementation, but it implements the wrong thing!
How to write a unit test for `identity`?... How about this?
    TEST(Identity, works_without_equality) {
        std::function<int(int)> f = [](int x) { return x + 1; };
        auto g = identity(f);
        EXPECT_EQ(3, g(2));
        EXPECT_EQ(7, g(6));
    }
"Testing cannot prove the absence of bugs
<https://en.wikiquote.org/wiki/Edsger_W._Dijkstra>," but this unit test
suffices to distinguish the correct `identity` from the specific impostor
that you provided.
> I think the identity function on non-comparable types would not be
> "provable" in Lisa Lippincott's sense.
>
(I don't know exactly what that sense is, but I have general vague
recollections of her talks such as "The Truth of a Procedure"
<https://www.youtube.com/watch?v=baKqCOLKcPc>. If you know a direct link to
a definition of "provable" in the sense you mean, I'd appreciate it.)
Vice versa, one could say that without the ability to do foolproof
post-facto testing with operator==, the only way a skeptic could ever be
confident that `identity` actually had the right implementation would be if
its implementation contained a proof that rigorously *proved* that the
return value was equivalent to the parameter value. In a sense, it already
does carry that proof, but the proof is intelligible only to a human
programmer, not to a computer. In order to get the computer to agree that
`identity(x)` is always the same as `x`, we'd have to teach the computer
what "is the same as" means, and that's a notoriously hard problem
<https://en.wikiquote.org/wiki/Bill_Clinton#1990s>.
> To take a really extreme and probably singular example, imagine what the
> STL would have been like if we'd said, "The std::copy algorithm requires
> the expression `std::addressof(*dest)` to have pointer type." That's
> "obvious," right, because the point of std::copy is to be a type-safe
> replacement for memcpy, and objects always have addresses? ...But then we
> never would have gotten ostream_iterator or back_insert_iterator!
>
> The copy function is defined without mention of memcpy. A copy function is
> a function which takes a range T and produces a new range T2 where the
> following two postconditions hold: the range T is unchanged and the range
> T2 is equal to the original range T.
>
Ah, so std::copy cannot modify its source range? You would cut off the
development of std::move_iterator?
And you would require the ranges to be comparable? You would cut off the
development of std::istream_iterator and std::ostream_iterator, and
incidentally forbid using std::copy to copy a range of std::function
objects? :)
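For anyone following along, here's a plain example of the flexibility in
question (ordinary std::copy usage, nothing exotic):

    #include <algorithm>
    #include <iostream>
    #include <iterator>
    #include <vector>

    int main() {
        std::vector<int> v = {1, 2, 3};
        // The destination is not a range of addressable, comparable objects;
        // std::copy works anyway because it never requires that.
        std::copy(v.begin(), v.end(),
                  std::ostream_iterator<int>(std::cout, " "));
    }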
Again, I'm picking a particularly "wild west" and "generic" algorithm for
this example. If you've got a function template named `read_from_socket`
and you're slightly overconstraining it, well, you're probably not going to
lose out on some awesome clever feature on the level of istream_iterator.
But one of the benefits of templates is that we *can* leave them
unconstrained and just "see what develops," if we want (for good and bad).
For example, if your English documentation says that all
random-access-iterators should provide both operator[] and postfix
operator++, but in practice, in real code, no algorithm uses that facility
and no type provides it, then that's useful feedback that you can use to
feed back into your design and say hey, maybe random-access-iterators
*shouldn't* need to provide either operator[] or postfix operator++. If
you constrain all your algorithms so that they do not compile without a
postfix operator++ (even though their implementation doesn't use it), then
you're making extra work for all your coworkers who write
random-access-iterators. And since that work is now *required*, you never
get a chance to measure whether people are doing it organically.
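(A sketch of what such an over-constrained requirement might look like, with
a concept invented purely for illustration:)

    template<class It>
    concept my_random_access_iterator = requires(It it, int n) {
        *it;
        ++it;
        it++;     // required even if no algorithm ever uses it
        it[n];    // required even if no algorithm ever uses it
    };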
Again I'm sure there's an analogy to classical OOP: "Keep your required
virtual API as small as possible." "There should be One Way To Do It."
> I suspect that a sufficiently motivated programmer could always find a
> *semantic* requirement that you failed to capture in your naming scheme.
> ;) Even something as simple as "fclose(fp) expects fp to be a file that is
> open, not a file that has already been closed." But maybe it's
> domain-dependent; maybe "no hidden semantic requirements" actually works in
> your particular codebase.
>
> [...] If fp were an RAII object whose invariant was "for the object to
> exist, an open file must exist", the semantic requirement of fclose would
> be to get a valid fp object whose invariant is satisfied, et voila, you
> are done. You cannot correct a missing invariant on the type definition of
> a function parameter at the function declaration. Get your types right,
> and RAII for the win.
>
Okay, you win that one. ;) It occurs to me that when we talk about
semantic requirements, we are sometimes talking about Contracts, not
Concepts. Consider:
    v.at(i)                          // semantic requirement: i's value must be in the range [0..v.size())
    sort(v.begin(), v.begin() + 10)  // semantic requirement: there must be no NaNs in the given range
Pushing runtime checks into compile-time is what C++ is all about; and
pushing semantics into syntax is what *programming* is all about. Still,
there has to be *something* that happens at runtime, or else we'd never
bother to run the program. ;)
[...big snip of the part where your problem is actually solved, and thus
we're just arguing for the sake of argument at this point, which is great
:) ...]
my $.02,
–Arthur
Received on 2019-12-15 11:36:54