Date: Fri, 19 Jan 2024 18:43:48 +0000
That's an interesting opinion. Are you saying that it's generally more
desirable for the function to perform 2*n + log(n) steps than n + 2*log(n)
steps, and that the latter option should be reserved for niche scenarios?
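For concreteness, here is a rough sketch of the kind of one-sided
("galloping") search I mean for forward iterators. It is only an
illustration, not the actual libc++ patch, and one_sided_lower_bound is
just a name used here:

    #include <algorithm>
    #include <iterator>

    // One-sided lower_bound for forward iterators: probe forward in jumps
    // of 1, 2, 4, 8, ... past the last element known to compare < value,
    // then bisect the final gap.  Comparisons: roughly 2*log2(d) + O(1),
    // where d is the distance to the result.  Iterator movement: a small
    // constant multiple of d, instead of the std::distance(first, last)
    // pass plus bisection advances that std::lower_bound performs for
    // non-random-access iterators.
    template <class ForwardIt, class T>
    ForwardIt one_sided_lower_bound(ForwardIt first, ForwardIt last,
                                    const T& value)
    {
        if (first == last || !(*first < value))
            return first;                      // result is 'first' itself

        typename std::iterator_traits<ForwardIt>::difference_type step = 1;
        for (;;) {
            // Invariant: *first < value, so the result lies after 'first'.
            ForwardIt probe = first;
            auto n = step;
            while (n > 0 && probe != last) { ++probe; --n; }  // jump ahead
            if (probe == last || !(*probe < value)) {
                // Result is in (first, probe]; bisect the narrowed range.
                ++first;
                return std::lower_bound(first, probe, value);
            }
            first = probe;                     // keep galloping
            step *= 2;
        }
    }

(A second sketch, showing how this composes into an O(n) sequential scan,
follows below the quoted message.)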
On Fri, 19 Jan 2024 at 18:26, Jason McKesson via Std-Proposals <
std-proposals_at_[hidden]> wrote:
> On Fri, Jan 19, 2024 at 12:19 PM Iúri Chaer via Std-Proposals
> <std-proposals_at_[hidden]> wrote:
> >
> > The current complexity guarantee for std::lower_bound states "At most
> log(last - first)+O(1) comparisons and projections" for all iterator types,
> pretty much enforcing the use of vanilla binary search, even though with
> non-random-access iterators that means 2*n iterator mutations. I've
> recently had a libc++ proposal for using one-sided binary search to reduce
> that cost to n iterator mutations rejected (
> https://github.com/llvm/llvm-project/pull/75230#issuecomment-1852637040),
> because it meant that we'd have at most 2*log(n) comparator calls. I have a
> strong feeling that this is not what the committee had intended.
> >
> > The solution from that proposal would have had a potentially interesting
> impact even for random-access iterators, because, although the average
> complexity is log(n) comparisons, it ranges from a single comparison to
> 2*log(n), and you can do a sequential scan in O(n) instead of the
> O(n*log(n)) that vanilla binary search would yield. It's not completely
> clear to me which would be more interesting in general; perhaps allowing
> implementors to decide even for random access iterators would be a good
> thing.
> >
> > What do you think of the idea of loosening the std::lower_bound
> comparator complexity requirements for non-random-access iterators to allow
> changes like the one I proposed?
>
> I feel like reducing the potential number of iterations for
> non-random-access iterators at the expense of potentially increasing
> the projection/comparison count is something that should be relegated
> to a different function. If you want a one-sided binary search, that's
> something you should ask for. It shouldn't be imposed upon you by an
> implementation which thinks it knows better in some scenario.
> --
> Std-Proposals mailing list
> Std-Proposals_at_[hidden]
> https://lists.isocpp.org/mailman/listinfo.cgi/std-proposals
>
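P.S. To illustrate the sequential-scan point above (again only a sketch; it
assumes the one_sided_lower_bound shown earlier in this message): when the
queries themselves arrive in increasing order, each search can resume where
the previous one ended, so the combined iterator traversal over the whole
container stays O(n), while each query still costs only O(log n)
comparisons.

    #include <forward_list>
    #include <vector>

    int main()
    {
        // Sorted data behind forward iterators, queries in ascending order.
        std::forward_list<int> data{1, 3, 5, 8, 13, 21, 34, 55};
        std::vector<int> targets{4, 20, 50};

        auto pos = data.begin();
        for (int t : targets) {
            // Resume from the previous result: across all queries the scan
            // only ever moves forward, so the total traversal is linear in
            // the size of 'data', whereas restarting std::lower_bound at
            // data.begin() would pay a full std::distance pass per query.
            pos = one_sided_lower_bound(pos, data.end(), t);
        }
    }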
Received on 2024-01-19 18:44:02