Re: [std-proposals] solution proposal for Issue 2524: generate_canonical can occasionally return 1.0

From: Juan Lucas Rey <juanlucasrey_at_[hidden]>
Date: Fri, 5 Dec 2025 11:03:23 +0000
Hello,

Here is example code that shows:

- the original issue reproduced as described in
https://cplusplus.github.io/LWG/issue2524
- std2::exponential_distribution, using std2::generate_canonical_centered
internally and NOT showing the issue
- all other values remain the same

I have added a double template parameter "limit" to
"std2::generate_canonical_centered". To allow maximum backwards
compatibility, that value should be close to 1. Setting it to 0.5 would
perhaps be more elegant, but less backward compatible.





On Thu, 4 Dec 2025 at 19:40, Jonathan Wakely <cxx_at_[hidden]> wrote:
>
>
>
> On Thu, 4 Dec 2025 at 19:25, Juan Lucas Rey <juanlucasrey_at_[hidden]> wrote:
>>
>> Sure, but doesn't p0952r2 also break compatibility?
>
>
> Yes, that's what the paper says. I assume the authors wanted to be clear about the consequences of the change.
>
>>
>> even up to the point of producing completely different numbers after
>> a simulation is discarded. With this proposal you can choose the "a"
>> boundary mentioned in the paper to make sure that only results that
>> WOULD HAVE returned 1.0 are changed.
>
>
> Doesn't that preserve the statistical flaws of the original algorithm? i.e. non-uniform output, due to rounding?
> If not, then you're producing different outputs, so would not be compatible with the original std::generate_canonical.
>
> Your proposal says "As a result, many values that would be generated by an [0, 1) generator would remain unchanged." ... how many? If it's not all values, surely that also breaks compatibility? How can you provide higher precision near zero, but produce the same sequence of numbers?
>
> I think a symmetric output does sound interesting, so I'm not opposed to the proposal. I'm just trying to understand it better because you seem to be presenting backwards compatibility as a primary motivation, and I don't see how that works.
>
> If backwards compatibility is a concern, can it be solved as QoI by implementations providing a macro to choose between the old and new specifications for std::generate_canonical?
>
>
>>
>> On Thu, 4 Dec 2025 at 19:21, Jonathan Wakely <cxx_at_[hidden]> wrote:
>> >
>> >
>> >
>> > On Thu, 4 Dec 2025 at 19:06, Juan Lucas Rey via Std-Proposals <std-proposals_at_[hidden]> wrote:
>> >>
>> >> in the current proposal,
>> >> https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2023/p0952r2.html
>> >
>> >
>> > That's not a current proposal, it was already approved by WG21 and added to the C++26 working draft in early 2024.
>> >
>> >>
>> >>
>> >>
>> >> it is mentioned "In particular, code that depends on a specific
>> >> sequence of results from repeated invocations, or on a particular
>> >> number of calls to the URBG argument, will be broken."
>> >>
>> >> This solution avoids breaking this.
>> >
>> >
>> > I don't see how you can avoid breaking compatibility by introducing a new function. I also don't see how you can avoid breaking compatibility for *any* change to the algorithm specified for std::generate_canonical in previous standards. Unless you generate exactly the same sequence of numbers, you break compatibility with the old version. And if you don't break compatibility with the old version, then you can produce 1.0.
>> >
>> > In particular, if you produce a completely different range of outputs, then you can't preserve compatibility with the original spec for std::generate_canonical.
>> >

Received on 2025-12-05 11:03:36