Re: utfN_view

From: Tom Honermann <tom_at_[hidden]>
Date: Mon, 17 Apr 2023 14:24:27 -0400
On 4/16/23 3:54 PM, Jens Maurer via SG16 wrote:
> Looking some more at the concepts, I'm wondering why the value types
> of the utfN_iters need to be exactly of the right size. After all,
> a code unit sequence is just a sequence of integers, and (on input)
> considering a sequence of 7-bit ASCII character values a valid UTF32
> sequence is sound. (When producing UTF-32 output, having just 7 bits
> doesn't work, of course.)

I agree the use of sizeof is a problem; actually, it just plain doesn't
work.
Consider uint_least16_t; I know of real implementations where it has 16
bits and a size of 1, others where it has 16 bits and a size of 2, and
still others where it has 32 bits and a size of 1. The name of the type
alias suggests that it would probably be used for UTF-16, but that isn't
discernible from its size even if CHAR_BIT is factored in.


Received on 2023-04-17 18:24:28