Date: Thu, 13 Mar 2025 23:54:07 -0500
> And then we'd provide a constexpr function for the platforms that have a
128-Bit integer type
You still haven’t addressed how you think this will be used on platforms
that don’t have such a type.
> For us mere mortals here on Earth, it's accepted that 128 bits is enough.
There might be a species somewhere else in the universe that have spread to
multiple galaxies; I mean the universe is 13.8 billion
You’ve gone from making claims about cryptography, mathematicians, and
being unique at the scale of the universe to talking about “mere mortals on
earth.” Again, I recommend being specific about the criteria and guarantees
you want instead of hand-waving.
> But for the rudimentary stuff we do here on Earth, we can populate a
database with 128-Bit randomly-generated identifiers and be sure that we
won't get a collision even if we combine all the world's databases together
into one.
According to Rivery, all the world’s data as of 2024 was about 149 ZB; 2^64
bytes is roughly 0.0001 of that, and that number is only going up, and
accelerating. As of 2023, Amazon S3 *alone* held 280 trillion objects, only
about five orders of magnitude short of 2^64. Just so you understand the
scale of the claims you’re making. Again, 128 random bits can fundamentally
provide at most 64 bits of collision resistance, and 2^64 is only the point
at which there’s roughly a 50% chance of collision. You can still have a
substantial chance of collision at much lower counts.
I hope you see why your claim below is not one you can correctly make in
this day and age:
> But yeah 128-Bit is enough bits for humans to say "We won't get a collision
ever".
Once again, I recommend being really specific about your goals, what
functionality you’re trying to provide, what guarantees you want to make
about random number quality, and what use cases you want to serve.
Cheers,
Jeremy
On Thu, Mar 13, 2025 at 18:32 Frederick Virchanza Gotham via Std-Proposals <
std-proposals_at_[hidden]> wrote:
> On Thu, Mar 13, 2025 at 7:06 PM Jeremy Rifkin wrote:
> >
> > > Maybe we could make it:
> > > std::array< char unsigned, 128u / CHAR_BIT >
> > > or maybe even:
> > > std::array< char unsigned, (128u / CHAR_BIT) + !!(128u % CHAR_BIT) >
> > > but also provide an extra constexpr function to convert the array to
> > > __uint128_t (where a 128-Bit integer type exists).
> >
> > But what if it doesn’t exist? Then you haven’t solved the fundamental
> problem with this interface.
>
>
>
> The main type would be:
>
> typedef std::array< char unsigned, (128u / CHAR_BIT) + !!(128u %
> CHAR_BIT) > uuid_t;
>
> And then we'd provide a constexpr function for the platforms that have
> a 128-Bit integer type:
>
> #ifdef __GNUC__
> constexpr __uint128_t to_uint128(uuid_t const &);
> #endif
>
> although maybe std::bit_cast would be an alternative here.
>
>
>
> > 128 random bits can provide *at most* 64 bits of collision resistance
> > thanks to the birthday paradox. 2^64 is large, but not that large.
>
>
>
> For us mere mortals here on Earth, it's accepted that 128 bits is
> enough. There might be a species somewhere else in the universe that
> have spread to multiple galaxies; I mean the universe is 13.8 billion
> years old, and life started on Earth 3.5 billion years ago . . . so
> maybe if a species started out 9 billion years ago and had billions of
> years to spread outside their solar system, around their galaxy and
> into neighbouring galaxies. Maybe that multigalactic species needs 256
> bits. But for the rudimentary stuff we do here on Earth, we can
> populate a database with 128-Bit randomly-generated identifiers and be
> sure that we won't get a collision even if we combine all the world's
> databases together into one.
>
> Sorry I'm watching 'Signs' with Mel Gibson tonight. What a film. Up
> there with the Sixth Sense.
>
> But yeah 128-Bit is enough bits for humans to say "We won't get a
> collision ever". That's why a UUID is 128-Bit.
> --
> Std-Proposals mailing list
> Std-Proposals_at_[hidden]
> https://lists.isocpp.org/mailman/listinfo.cgi/std-proposals
>
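For reference, the conversion function sketched in the quoted message could
look something like the following. This is only an illustration under my own
assumptions: CHAR_BIT == 8, a GCC/Clang-style __uint128_t, and a big-endian
byte order (matching the RFC 4122 field layout), none of which the quoted
proposal actually pins down.

```cpp
#include <array>

// Hypothetical 16-byte UUID storage type, as in the quoted message
// (simplified for the CHAR_BIT == 8 case).
using uuid_bytes = std::array<unsigned char, 16>;

#if defined(__SIZEOF_INT128__)
// Fold the bytes into a 128-bit integer, most significant byte first.
// Assumes C++17 for the constexpr range-for over std::array.
constexpr __uint128_t to_uint128(uuid_bytes const &b) {
    __uint128_t v = 0;
    for (unsigned char byte : b)
        v = (v << 8) | byte;
    return v;
}
#endif
```

Note that std::bit_cast<__uint128_t>(b), the alternative mentioned in the
quote, would reinterpret the bytes in the host's native endianness, so the
two approaches agree only on big-endian targets.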
Received on 2025-03-14 04:54:22