
Re: [std-proposals] Loosen complexity guarantees for std::lower_bound

From: Iúri Chaer <iuri.chaer_at_[hidden]>
Date: Fri, 19 Jan 2024 22:17:04 +0000
Apologies, Jason. I hope this won't read as patronising, but I feel I must
remind you that "scales better", in the context of algorithms, means that
the number of steps required grows more slowly with the input size. It is
an incontrovertible fact that n + 2*log(n) scales better than 2*n + log(n).
I assume what you have in mind is the fact that the constants involved in
the comparison function may be way larger than the ones involved in
iterator mutation, but that is not relevant to my statement. n + 2*log(n)
scales better in all situations.

I must also have miscommunicated if you understood me to mean that the
complexity guarantees in the standard don't have to be followed -- they
absolutely must be. When I argued that they are not meaningful in
practice, I meant that the current wording enforces something other than
the spirit of the guarantees, and that we would be better served by a
different wording.

Thanks for your feedback, and I hope we can engage in constructive
discussion in the future!

On Fri, 19 Jan 2024 at 21:28, Jason McKesson via Std-Proposals <
std-proposals_at_[hidden]> wrote:

> On Fri, Jan 19, 2024 at 4:00 PM Iúri Chaer via Std-Proposals
> <std-proposals_at_[hidden]> wrote:
> >
> > Wouldn't that run counter to the design philosophy of the STL?
> lower_bound is not the name of an algorithm, but of the action -- the
> algorithm behind it in all the mainline implementations is the classic
> binary search, but that's not explicit in its name. What I'm proposing is
> loosening the complexity guarantees to allow implementing it with a
> different algorithm which scales better.
>
> "Scales better" in certain situations. That's why it ought to be a
> different function; so that people can pick what they want.
>
> > Less importantly, but still relevant and slightly aggravating, the
> specific algorithm I was proposing has the same average complexity in terms
> of comparisons as classic binary search. The complexity guarantee in the
> standard is so strict that neither of these facts -- that it has the same
> average complexity in terms of comparisons, or that it offers better
> overall worst-case complexity -- makes it acceptable.
> >
> > The crux of the problem is the fact that the complexity guarantee is
> defined in terms of comparisons, while for non-random-access iterators that
> complexity is dominated by iterator mutations.
> >
> > I would argue that the strictness in terms of constant factors is, in
> itself, not meaningful in practice: there is no enforcement of the level of
> optimisation for C++ compilers, nor of the specific way in which each step
> is implemented by a library, so the constants involved are _already_
> implementation-dependent.
>
> And yet, you couldn't get your change into a standard library
> implementation precisely because they enforced what the standard
> requires. That sounds like the standard is at least somewhat
> "meaningful in practice".
> --
> Std-Proposals mailing list
> Std-Proposals_at_[hidden]
> https://lists.isocpp.org/mailman/listinfo.cgi/std-proposals
>

Received on 2024-01-19 22:17:18