std-discussion

Re: Hostile implementation of strict total order of pointers

From: Nate Eldredge <nate_at_[hidden]>
Date: Sun, 23 Jul 2023 11:15:27 -0600 (MDT)
On Sun, 23 Jul 2023, Edward Catmur via Std-Discussion wrote:

> On Sun, Jul 23, 2023, 18:27 Richard Hodges <hodges.r_at_[hidden]> wrote:

>> The ambiguous and convoluted memory model in C++ was standardised because
>> at the time the majority of software was written on 80x86 architectures,
>> which had a choice of 4 (from memory) memory models. From a high level
>> programming perspective, two of them were ridiculous.

(It was six, actually.)

>> I am simply saying that I think that in hindsight, the committee took a
>> wrong turn. Had the memory model been standardised to be flat (i.e. the
>> huge model in 80x86 land) then today we would be able to compare addresses
>> without jumping through hoops,

Let's remember, though, that the huge model came with huge overhead. All
pointer operations would at least double in cost due to needing 32-bit
arithmetic, and perhaps much more when you throw in the necessary shifts
(unreasonably expensive on the 8086) and segment register reloads.
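To make that overhead concrete, here is a rough sketch (my own illustration, not actual compiler output) of the arithmetic a huge-model pointer operation implies, assuming the classic real-mode mapping of linear address = segment * 16 + offset:

```cpp
#include <cstdint>

// Model of an 8086 "huge" pointer. In the huge model the compiler must
// renormalize after every increment so that pointer comparison and
// subtraction behave as if addresses were flat.
struct HugePtr {
    std::uint16_t segment; // paragraph (16-byte) granularity
    std::uint16_t offset;
};

// Real-mode linear address on the 20-bit bus: segment * 16 + offset.
std::uint32_t to_linear(HugePtr p) {
    return (std::uint32_t(p.segment) << 4) + p.offset;
}

// Normalized form keeps the offset in [0, 15]; two normalized huge
// pointers to the same byte compare equal field-by-field.
HugePtr normalize(HugePtr p) {
    std::uint32_t lin = to_linear(p);
    return { std::uint16_t(lin >> 4), std::uint16_t(lin & 0xF) };
}

// Advancing by n bytes needs full 32-bit arithmetic plus shifts and a
// segment update -- the cost the huge model pays on every pointer bump,
// where medium/large just add to a 16-bit offset.
HugePtr add_bytes(HugePtr p, std::uint32_t n) {
    std::uint32_t lin = to_linear(p) + n;
    return { std::uint16_t(lin >> 4), std::uint16_t(lin & 0xF) };
}
```

The names here are hypothetical, but the shape of the work is what a huge-model compiler had to emit inline at each pointer operation.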

The quirks of the medium/large models were certainly peculiar, but not a
big deal for real programs: how often do you really need to do
arithmetic on pointers into different arrays? So they were well worth the
savings.
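And for the modern "hoops" alluded to above: the built-in relational
operators on pointers into unrelated objects are still unspecified, but
std::less is required to impose a strict total order over all pointers, so
the portable spelling is a one-liner (illustrative names are my own):

```cpp
#include <functional>

// Two unrelated arrays: comparing their elements with the built-in
// operator< is unspecified behavior per [expr.rel].
int unrelated_a[4] = {};
int unrelated_b[4] = {};

// std::less<int*> (like std::less<void>) must yield a strict total
// order over all pointers ([comparisons.general]), making it the
// portable, conforming way to compare arbitrary addresses.
inline bool pointer_before(int* p, int* q) {
    return std::less<int*>{}(p, q);
}
```

Which of the two arrays sorts first is still unspecified, of course; the guarantee is only that the order is total and consistent.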

Given the market dominance of x86 in the 1980s, I really can't blame
Stroustrup, nor the ANSI C committee, for accommodating it. I feel like
if C++ had enforced the huge memory model early on, it would not have
become anywhere near as popular as it did, and C++ today would be a
beautiful language that nobody uses.

I suppose that standardization in the late 1990s could have been a time to
make that break with the past, but even then, an awful lot of the software
world was still 16-bit x86.

I don't quite understand the "extension" argument. Having medium/large be
an "extension" would be an odd kind of extension, in that it would break
or alter existing language rules instead of adding to them. I'd say
instead that they would have become a separate dialect of C++, and given
the economics of the time, that dialect might very well have become the
dominant one.

>> If people really wanted to select individual segments, they could have
>> carried on using compiler extensions, which they had to anyway, because
>> near and far (etc) are not C++ keywords.

On a 16-bit x86 compiler, you could build for the medium/large memory
models without using the near/far keywords anywhere in your source code.
It simply meant that all data pointers became implicitly far. You only
needed the near/far keywords if you wanted to deliberately mix memory
models, or interoperate with code using a different one (e.g. raw DOS
system calls that were not wrapped by standard library functions). So a
portable, standard-conformant C++ program would work just fine.

-- 
Nate Eldredge
nate_at_[hidden]

Received on 2023-07-23 17:15:31