Nicolas, consider

    template <typename T>
    using vec = std::vector<T>;

    template <typename T>
    void foo(vec<T> v);

When you pass `std::vector<int>{}` to `foo`, `T` is deduced as `int`. This works because `vec<T>` is equivalent to `std::vector<T>`, so `T` can be deduced in `std::vector<T>` from `std::vector<int>`.
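Concretely (adding an empty body and a caller so the example is self-contained):

    #include <vector>

    template <typename T>
    using vec = std::vector<T>;

    template <typename T>
    void foo(vec<T>) {}

    int main() {
        foo(std::vector<int>{});  // OK: T deduced as int, since vec<T> is just std::vector<T>
    }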

With your proposal, however, `vec<T>` is no longer necessarily equivalent to `std::vector<T>`. It is also not obvious how template argument deduction should be modified to ensure that `T` still gets deduced to `int` in my example. What do you propose?
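To see why it is not obvious, note that in your own program below both `no_default_alias3<int>::type` and `no_default_alias3<float>::type` are `std::vector<int>`, so the mapping from `T` to the aliased type is no longer invertible. Here is a sketch of the same shape using a specializable class template (the names are mine; alias templates cannot be specialized today), where the compiler already handles the problem by refusing to deduce:

    #include <vector>

    template <typename T>
    struct vec_map { using type = std::vector<T>; };

    template <>
    struct vec_map<float> { using type = std::vector<int>; };  // two Ts now map to std::vector<int>

    template <typename T>
    void foo2(typename vec_map<T>::type) {}  // T is in a non-deduced context

    int main() {
        // foo2(std::vector<int>{});    // error: T is not deducible; should it be int or float?
        foo2<int>(std::vector<int>{});  // OK: T supplied explicitly
    }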

On Fri, Nov 19, 2021 at 4:59 PM Nicolas Weidmann via Std-Proposals <std-proposals@lists.isocpp.org> wrote:
All I am saying is that alias templates are not as transparent to type deduction as your first response suggested.

Then why allow the use of dependent types in alias templates, and why allow something like std::type_identity_t?
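(For reference, `std::type_identity_t` puts `T` into exactly this kind of non-deduced context; a minimal sketch, with a function name of my choosing:)

#include <type_traits>

template<class T> void baz(std::type_identity_t<T>) {}  // T appears only in a non-deduced context

int main() {
    // baz(42);     // error: T cannot be deduced through type_identity_t
    baz<int>(42);   // OK: T supplied explicitly
}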

Nicolas



On 19 Nov 2021, at 16:43, Arthur O'Dwyer <arthur.j.odwyer@gmail.com> wrote:


On Fri, Nov 19, 2021 at 10:31 AM Nicolas Weidmann <n.weidmann@bluewin.ch> wrote:
Based on your last example, we would get:

#include <vector>
#include <set>
#include <string>

template<typename T> struct no_default_alias3;  // declared but never defined: no default case

template<> struct no_default_alias3<int> {
   using type = std::vector<int>;
};

template<typename T> using alias3 = typename no_default_alias3<T>::type;

template<typename T> void bar(T a, alias3<T>) { }  // T deduced from the first parameter only

template<> struct no_default_alias3<float> {
   using type = std::vector<int>;
};

template<> struct no_default_alias3<bool> {
   using type = std::set<std::string>;
};

void foo()
{
   bar(1.0f, std::vector<int>{});
   bar(true, std::set<std::string>{});
}


IIUC, you've just renamed `std::type_identity` to `no_default_alias3` and left everything else the same; is that right?
So your `no_default_alias3` template is serving as a "type deduction firewall," in the same way that `std::type_identity` does today.
And in particular, it is not acting in the same way as an alias template does today (because aliases do not serve as firewalls).

I think we're now going around in circles, but just in case, let me show you a program that directly compares the (transparent) behavior of aliases with the (firewalling) behavior of non-aliases.

template<class T> using Alias = T;
template<class T> struct Nonalias { using type = T; };

template<class T> void take_alias(Alias<T>);  // OK, equivalent to void take_alias(T); T is deducible
template<class T> void take_nonalias(typename Nonalias<T>::type);  // OK, but T is not deducible

int main() {
    take_alias(42);  // OK, T=int
    take_nonalias(42);  // does not compile, T can't be deduced
}
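(For completeness, and not part of the program above: the nonalias version still works if `T` is supplied explicitly rather than deduced:)

    take_nonalias<int>(42);  // OK: T=int given explicitly, no deduction needed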

My impression of your original message/proposal is that you don't understand this fundamental difference between aliases and nonaliases, and that you're basically proposing either (1) that some or all kinds of aliases should behave more like nonaliases, or (2) that it should be possible to define (certain kinds of) nonaliases using a slight variation of alias syntax (instead of the existing nonalias syntax).

HTH,
Arthur