Date: Sun, 24 Aug 2025 22:44:54 -0700
On Sunday, 24 August 2025 07:31:46 Pacific Daylight Time Simon Schröder via
Std-Proposals wrote:
> However, as you have mentioned, right now for this I still use char and not
> char8_t. As all Windows functions already work with the .utf8 locale, it
> would be easy to provide support for char8_t as well. (Maybe they need to
> change functions under the hood to take the locale explicitly instead of
> relying on the global locale.)
I don't think Microsoft will. They are already maintaining dual APIs for
everything, so increasing that by 50% is not going to make them more money,
not when the issue is all the legacy code that expects 8-bit ANSI.

From their point of view, the native API is UTF-16 and has been supported
that way for 30 years. So is the native Cocoa API on macOS, and so are Qt
and ICU.

In my very biased opinion, C++ should just embrace that, accept that text
is UTF-16 with char16_t, and stop pushing both char8_t and wchar_t.
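For illustration only (my sketch, not something from Simon's mail): on
Windows, where wchar_t is a 16-bit type, char16_t text can be handed to
the native W-functions with nothing more than a pointer cast, with no
transcoding pass at the boundary:

    #include <windows.h>

    int main()
    {
        // char16_t literals are UTF-16 by definition.
        const char16_t *title = u"char16_t";
        const char16_t *text  = u"UTF-16 goes straight to the native API.";

        // LPCWSTR is const wchar_t*, and on Windows wchar_t is also a
        // 16-bit code unit, so in practice the cast is the whole
        // "conversion".
        MessageBoxW(nullptr,
                    reinterpret_cast<const wchar_t *>(text),
                    reinterpret_cast<const wchar_t *>(title),
                    MB_OK);
        return 0;
    }

The equivalent char8_t path would need a MultiByteToWideChar(CP_UTF8, ...)
conversion (or similar) at every such call.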
--
Thiago Macieira - thiago (AT) macieira.info - thiago (AT) kde.org
  Principal Engineer - Intel Platform & System Engineering