Date: Fri, 16 May 2025 22:20:32 +0200
> On 16 May 2025, at 21:15, Jens Maurer <Jens.Maurer_at_[hidden]> wrote:
>
> On 16/05/2025 18.12, Hans Åberg via Std-Proposals wrote:
>>
>>> On 16 May 2025, at 14:20, Jan Schultke via Std-Proposals <std-proposals_at_[hidden]> wrote:
>>
>>> https://isocpp.org/files/papers/P3695R0.html will be in the next mailing.
>>>
>>> It seems like people almost universally dislike this implicit conversion, and overall, deprecation has been received very positively. However, I'd still be interested in your thoughts on this.
>>
>> One can have implicit conversions between the types std::string, std::u8string, and std::u32string, where std::u8string might guarantee valid UTF-8 sequences. Conversions from std::string would then throw exceptions for invalid UTF-8 sequences. No exceptions are needed for conversions between std::u8string and std::u32string.
>
> These are conversions between library-defined class types,
> which are totally unrelated to the three fundamental character
> types discussed in P3695R0.
Indeed, such conversions only make sense at this level, between the library class types.
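
To illustrate what I have in mind, here is a minimal sketch using a hypothetical wrapper type (not std::u8string itself), whose converting constructor from std::string validates the byte sequence and throws on ill-formed UTF-8. The check is simplified: it only verifies lead/continuation byte structure, not overlong encodings or surrogates.

    #include <cstddef>
    #include <stdexcept>
    #include <string>

    struct validated_u8string {
        std::u8string data;

        // Implicit converting constructor from std::string; throws on
        // ill-formed UTF-8 (simplified validation, see above).
        validated_u8string(const std::string& s) {
            for (std::size_t i = 0; i < s.size();) {
                unsigned char c = static_cast<unsigned char>(s[i]);
                std::size_t len = c < 0x80           ? 1
                                : (c & 0xE0) == 0xC0 ? 2
                                : (c & 0xF0) == 0xE0 ? 3
                                : (c & 0xF8) == 0xF0 ? 4
                                                     : 0;
                if (len == 0 || i + len > s.size())
                    throw std::invalid_argument("invalid UTF-8");
                for (std::size_t k = 1; k < len; ++k)
                    if ((static_cast<unsigned char>(s[i + k]) & 0xC0) != 0x80)
                        throw std::invalid_argument("invalid UTF-8");
                i += len;
            }
            for (char c : s)
                data.push_back(static_cast<char8_t>(static_cast<unsigned char>(c)));
        }
    };

    // validated_u8string ok  = std::string("abc");    // fine
    // validated_u8string bad = std::string("\xC3");   // throws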
>> Possibly, implicit conversions between char8_t and char32_t cause problems here.
>
> How so? Why would the presence or absence of a conversion between T and U
> affect the design space of supporting or not supporting a conversion between
> some_class<T> and some_class<U>?
I was thinking of the case where explicit char8_t values end up converted into a std::u32string; but perhaps this is not a problem.
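
For what it is worth, a hypothetical illustration of the kind of mistake the implicit conversion permits: appending char8_t code units to a std::u32string compiles silently, but only does the right thing for ASCII.

    #include <string>

    int main() {
        std::u8string u8 = u8"\u00E9";  // "é", two code units: 0xC3 0xA9
        std::u32string u32;
        for (char8_t c : u8)
            u32.push_back(c);           // char8_t -> char32_t, no decoding:
                                        // stores U+00C3 U+00A9, not U+00E9
    }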
Received on 2025-05-16 20:20:46