Date: Fri, 19 Jan 2024 17:19:32 +0000

The current complexity guarantee for std::lower_bound states "At most
log(last - first) + O(1) comparisons and projections" for all iterator
types, pretty much enforcing the use of vanilla binary search, even though
with non-random-access iterators that means roughly 2*n iterator mutations
(n for the std::distance pass, plus about n more across the halving steps).
I recently had a libc++ proposal rejected that used one-sided binary search
to reduce that cost to n iterator mutations
(https://github.com/llvm/llvm-project/pull/75230#issuecomment-1852637040);
the objection was that it would make at most 2*log(n) comparator calls. I
have a strong feeling that this is not what the committee had intended.

The solution from that proposal would have had a potentially interesting
impact even for random-access iterators: although the average complexity
stays at log(n) comparisons, the per-call count ranges from a single
comparison up to 2*log(n), and a sequential scan (repeated lower_bound
calls for increasing values over the same range) can be done in O(n)
overall instead of the O(n*log(n)) that vanilla binary search would yield.
It's not completely clear to me which behavior would be more interesting in
general; perhaps allowing implementors to decide, even for random-access
iterators, would be a good thing.

What do you think of the idea of loosening the std::lower_bound comparator
complexity requirements for non-random-access iterators to allow changes
like the one I proposed?

Cheers,

Iuri

Received on 2024-01-19 17:19:46