Date: Fri, 19 Jan 2024 15:09:15 -0600
On Fri, Jan 19, 2024, 11:19 AM Iúri Chaer via Std-Proposals <
std-proposals_at_[hidden]> wrote:
> The current complexity guarantee for std::lower_bound states "At most
> log(last - first)+O(1) comparisons and projections" for all iterator types,
> effectively enforcing the use of vanilla binary search, even though with
> non-random-access iterators that means 2*n iterator mutations. I recently
> proposed using one-sided binary search in libc++ to reduce that cost to n
> iterator mutations, but the patch was rejected (
> https://github.com/llvm/llvm-project/pull/75230#issuecomment-1852637040)
> because it meant we would make at most 2*log(n) comparator calls. I have a
> strong feeling that this is not what the committee intended.
>
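[For readers who want a concrete picture of the technique under discussion, here is a minimal sketch of a one-sided ("galloping") lower_bound over forward iterators. It is written for this thread, not taken from the libc++ patch; the name one_sided_lower_bound and the exact structure are illustrative only.]

    #include <algorithm>  // std::lower_bound
    #include <iterator>   // std::next, std::iterator_traits

    // Gallop forward from 'first' in steps of 1, 2, 4, ... until the probed
    // element is no longer < value (or the range ends), then finish with a
    // regular binary search inside the last gap. If the answer sits k
    // positions past 'first', this sketch costs roughly 2*log2(k) + O(1)
    // comparisons and O(k) iterator increments, instead of the ~2*n
    // increments a distance()+advance() based lower_bound needs on a
    // forward iterator.
    template <class ForwardIt, class T, class Compare>
    ForwardIt one_sided_lower_bound(ForwardIt first, ForwardIt last,
                                    const T& value, Compare comp) {
        using diff_t =
            typename std::iterator_traits<ForwardIt>::difference_type;

        if (first == last || !comp(*first, value))
            return first;                    // the answer is 'first' itself

        diff_t step = 1;
        for (;;) {
            ForwardIt lo = std::next(first); // everything up to 'first' is < value
            ForwardIt hi = lo;
            for (diff_t i = 0; i != step && hi != last; ++i)
                ++hi;                        // advance by at most 'step'
            if (hi == last || !comp(*hi, value))
                return std::lower_bound(lo, hi, value, comp);
            first = hi;                      // *hi < value: keep galloping
            step *= 2;
        }
    }
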
> The solution from that proposal would have had a potentially interesting
> impact even for random-access iterators: although the average complexity is
> log(n) comparisons, the count ranges from a single comparison up to
> 2*log(n), and a sequential scan through the range via repeated searches
> becomes O(n) instead of the O(n*log(n)) that vanilla binary search would
> yield. It's not completely clear to me which trade-off is more interesting
> in general; perhaps allowing implementors to decide even for random-access
> iterators would be a good thing.
>
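[One way to read the "sequential scan" point: when looking up a whole sorted batch of keys and resuming each search from the previous hit, the one-sided variant's total cost stays proportional to the distance travelled. A hypothetical usage sketch, reusing the one_sided_lower_bound name from the sketch above:]

    #include <cstddef>
    #include <string>
    #include <vector>

    // Locate each (sorted) needle in a sorted haystack, resuming from the
    // previous hit. With a one-sided search, each call costs O(1 + log(gap))
    // comparisons, where 'gap' is how far the answer moved since the last
    // call, so m lookups over n elements total O(m + m*log(n/m)), which is
    // O(n) when m is on the order of n. Restarting a vanilla binary search
    // over the whole range each time costs O(m*log(n)) comparisons instead.
    std::vector<std::size_t> locate_all(const std::vector<std::string>& haystack,
                                        const std::vector<std::string>& needles) {
        std::vector<std::size_t> positions;
        positions.reserve(needles.size());
        auto it = haystack.begin();
        for (const std::string& key : needles) {
            it = one_sided_lower_bound(it, haystack.end(), key,
                                       [](const std::string& a,
                                          const std::string& b) { return a < b; });
            positions.push_back(static_cast<std::size_t>(it - haystack.begin()));
        }
        return positions;
    }
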
> What do you think of the idea of loosening the std::lower_bound comparator
> complexity requirements for non-random-access iterators to allow changes
> like the one I proposed?
>
>
> Cheers,
> Iuri
>
This seems to me like it's definitely worth writing a paper for and
discussing, especially if said paper has benchmarks (for both the
random-access and non-random-access cases, and with an element type that's
more expensive to compare than ints... probably strings?)
Thanks,
Barry
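
[For what it's worth, a rough sketch of what such a benchmark might look like with Google Benchmark, using strings so that each comparison has a real cost. Everything here (the key format, the sizes, the BM_* names) is illustrative; the one-sided counterpart would be the same loop calling a one_sided_lower_bound like the sketch further up.]

    #include <benchmark/benchmark.h>
    #include <algorithm>
    #include <cstddef>
    #include <list>
    #include <string>
    #include <vector>

    // Sorted keys with a shared prefix, so each comparison does more work
    // than comparing two ints.
    static std::vector<std::string> make_keys(std::size_t n) {
        std::vector<std::string> keys;
        keys.reserve(n);
        for (std::size_t i = 0; i < n; ++i)
            keys.push_back("key-" + std::to_string(10000000 + i));
        return keys;
    }

    // Baseline: std::lower_bound over a non-random-access container.
    static void BM_vanilla_lower_bound_list(benchmark::State& state) {
        const auto keys = make_keys(static_cast<std::size_t>(state.range(0)));
        const std::list<std::string> data(keys.begin(), keys.end());
        std::size_t i = 0;
        for (auto _ : state) {
            auto it = std::lower_bound(data.begin(), data.end(),
                                       keys[i++ % keys.size()]);
            benchmark::DoNotOptimize(it);
        }
    }
    BENCHMARK(BM_vanilla_lower_bound_list)->Range(1 << 10, 1 << 18);

    // A BM_one_sided_lower_bound_list twin would differ only in calling the
    // one-sided variant; the same pair of benchmarks over std::vector covers
    // the random-access case.
    BENCHMARK_MAIN();
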
Received on 2024-01-19 21:09:28