On Mon, Oct 11, 2021, 21:01 Jason McKesson via Std-Discussion <std-discussion@lists.isocpp.org> wrote:
On Mon, Oct 11, 2021 at 1:59 PM Gennaro Prota via Std-Discussion
<std-discussion@lists.isocpp.org> wrote:
>
> On Mon, Oct 11, 2021, 08:54 Anubhav Guleria via Std-Discussion <std-discussion@lists.isocpp.org> wrote:
>>
>> Thanks for clarifying.
>>
>> Is there any specific reason why a container's pop_back method can't
>> check the current size and, if it is 0, make the pop operation a no-op?
>
>
> You'll find that C and C++ people are, with very few exceptions, fixated on micro-optimizations (a known anti-pattern).

Point of order:

Micro-optimization as an anti-pattern is only a legitimate argument
when you're talking about an application *as a whole* (or at the very
least, large-scale systems that have relatively minimal interaction
with the outside world). If you're writing an application, you should
concern yourself with the application's performance as a whole.

Libraries are a different story, *especially* low-level utility
libraries. Such libraries have two properties that matter in this
context: they solve very specific problems, and if they cause a
performance issue in the application that uses them, the maker of that
application *cannot fix it*.

When you're writing a piece of software for a specific purpose within
a specific application, avoiding micro-optimizations makes sense. When
you're writing a tool for other people to use, and you cannot predict
which performance issues matter to which of your users, then you
need to *assume* that these optimizations matter. Because if you don't
make that assumption, and you're *wrong* for a particular user... they
can't fix the problem *you* created.

Not without ditching your library and building their own. Which means
that your library failed at its job.
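
To make this concrete with the pop_back example from upthread, here is a
minimal sketch, assuming a hypothetical checked_pop_back helper (nothing
of the sort exists in the standard library). Because the standard leaves
pop_back on an empty vector as undefined behaviour rather than a built-in
check, a user who wants the check can layer it on top at no cost to
anyone else; if the check were baked into the container, no user could
take it back out.

    #include <vector>

    // Hypothetical wrapper: adds the "no-op when empty" behaviour on top
    // of the unchecked primitive. Returns whether an element was removed.
    template <typename T>
    bool checked_pop_back(std::vector<T>& v)
    {
        if (v.empty())
            return false;  // the empty case becomes a harmless no-op
        v.pop_back();      // the unchecked pop_back does the real work
        return true;
    }

    // Usage:
    //     std::vector<int> v;
    //     checked_pop_back(v);  // returns false, v is unchanged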

It's the difference between worrying about the weight of a hammer and
worrying about the weight of the shovel at the end of the excavator. A
few grams of weight on a hammer can matter a lot for how many times
you can swing it before it gets tiring. But a few grams in a shovel is
irrelevant to the fuel performance of an excavator.

Makers of low-level tools need to think about things that makers of
high-level tools don't. And vice-versa. It is a mistake to apply rules
meant for one use case to the other.

A classic argument, one I've heard many times. But it's an argument born of immaturity, in my book. Paraphrasing Alan Perlis: if someone says they want to make their library suitable for every application, give them a lollipop.

The point is that "optimization", as they call it, almost always comes at the expense of the most important factors, namely robustness and maintainability. If you can get speed without impacting those, or development time and cost, then fine. Otherwise, I call it "pessimization".

So, serve the majority of users with a robust and responsible tool, and let those with special needs handle them on their own, rather than serving the former badly for the sake of the few in the latter group.

(But I'm not interested in a long discussion about this. It's a psychological matter, and convincing those who hold such positions otherwise is extremely difficult, even though the error-ridden software they produce should be evidence enough to convince them without any external intervention, if only they asked themselves why and were honest about the answers.)

--
Gennaro Prota
https://about.me/gennaro