Date: Thu, 13 Mar 2025 23:32:25 +0000
On Thu, Mar 13, 2025 at 7:06 PM Jeremy Rifkin wrote:
>
> > Maybe we could make it:
> > std::array< char unsigned, 128u / CHAR_BIT >
> > or maybe even:
> > std::array< char unsigned, (128u / CHAR_BIT) + !!(128u % CHAR_BIT) >
> > but also provide an extra constexpr function to convert the array to
> > __uint128_t (where a 128-Bit integer type exists).
>
> But what if it doesn’t exist? Then you haven’t solved the fundamental problem with this interface.
The main type would be:
typedef std::array< char unsigned, (128u / CHAR_BIT) + !!(128u % CHAR_BIT) > uuid_t;
And then we'd provide a constexpr function for the platforms that have
a 128-bit integer type (testing __SIZEOF_INT128__ rather than
__GNUC__, since a GCC targeting a 32-bit machine defines __GNUC__ but
has no __uint128_t):
#ifdef __SIZEOF_INT128__
constexpr __uint128_t to_uint128(uuid_t const &);
#endif
although maybe std::bit_cast would be an alternative here.
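For what it's worth, here's a minimal sketch of how that conversion
could look with std::bit_cast (my sketch, not part of the proposal;
it assumes C++20, and repeats the typedef so the snippet stands
alone):

#include <array>
#include <bit>      // std::bit_cast (C++20)
#include <climits>  // CHAR_BIT

typedef std::array< char unsigned, (128u / CHAR_BIT) + !!(128u % CHAR_BIT) > uuid_t;

#ifdef __SIZEOF_INT128__
constexpr __uint128_t to_uint128(uuid_t const &u)
{
    // bit_cast demands the two types be exactly the same size, so
    // this won't compile if the rounding-up above added a spare byte
    static_assert( sizeof(uuid_t) == sizeof(__uint128_t) );
    // note: byte 0 of the array maps to the low or the high octet of
    // the integer depending on the platform's endianness
    return std::bit_cast<__uint128_t>(u);
}
#endif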
> 128 random bits can provide *at most* 64 bits of collision resistance
> thanks to the birthday paradox. 2^64 is large, but not that large.
For us mere mortals here on Earth, it's accepted that 128 bits is
enough. There might be a species somewhere else in the universe that
has spread to multiple galaxies; I mean the universe is 13.8 billion
years old, and life started on Earth 3.5 billion years ago, so a
species that got started 9 billion years ago would have had billions
of years to spread outside its solar system, around its galaxy and
into neighbouring galaxies. Maybe that multigalactic species needs 256
bits. But for the rudimentary stuff we do here on Earth, we can
populate a database with randomly-generated 128-bit identifiers and be
sure that we won't get a collision even if we combine all the world's
databases together into one.
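To put a rough number on the birthday bound (back-of-the-envelope
arithmetic of mine, so treat it as a sketch): the chance of any
collision among n randomly-generated 128-bit IDs is approximately
n^2 / 2^129, so even a trillion IDs collide with probability around
1.5e-15:

#include <cmath>
#include <cstdio>

int main(void)
{
    double const n = 1e12;  // a trillion randomly-generated IDs
    double const p = (n * n) / std::ldexp(1.0, 129);  // ~ n*n / 2^129
    std::printf("P(collision) is roughly %g\n", p);   // ~ 1.47e-15
}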
Sorry, I'm watching 'Signs' with Mel Gibson tonight. What a film. Up
there with 'The Sixth Sense'.
But yeah, 128 bits is enough for humans to say "We won't get a
collision ever". That's why a UUID is 128 bits.
Received on 2025-03-13 23:32:37