Re: Setting wording for bit manipulation for non-binary hardware

From: Andrey Semashev <andrey.semashev_at_[hidden]>
Date: Sun, 7 Mar 2021 12:13:54 +0300
On 3/6/21 10:34 PM, Jason McKesson via Std-Discussion wrote:
> On Sat, Mar 6, 2021 at 12:55 PM Vishal Oza via Std-Discussion
> <std-discussion_at_[hidden]> wrote:
>>
>> I was thinking of making the bit and bitset libraries, as well as the
>> bit manipulation operators (with the exception of std::endian, and
>> possibly the left and right shift operators), undefined behavior if the
>> hardware base is not 2.
>
> Um, what exactly does that mean?
>
> Here's what I mean. If you have the number 15 and you binary-left-shift
> it by 2, the number you get is 60. It *does not matter* how the
> hardware implements this. So long as `60 == 15 << 2` is true, the
> implementation is doing its job.
>
> Similarly, if you have the number 15 and you do a bitwise-and with 2,
> you get the number 2. Again, how the hardware does this is irrelevant;
> the implementation's job is to make the math work out.

Bit manipulation operators affect the bit representation of a number;
the resulting value is a consequence of that. In that sense, before we
mandated two's complement representation, applying bit operators to
signed integers would yield implementation-dependent numbers, although
the effect on the bit representation (with some restrictions) was well
defined.
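To make the value-versus-representation distinction concrete, a minimal
sketch (assuming a current two's-complement implementation; the
constants are only illustrative):

    int main() {
        // Shifts are specified in terms of value: E1 << E2 is
        // E1 * 2^E2 when the result is representable.
        static_assert((15 << 2) == 60);

        // Bitwise operators act on the representation, and the value
        // falls out of it. With two's complement mandated, -1 has all
        // bits set, so ANDing it with any x yields x back:
        static_assert((-1 & 0x5A) == 0x5A);

        // Before C++20 this was implementation-dependent (e.g. negative
        // zero on a ones' complement machine); under the now-mandated
        // two's complement, ~0 is exactly -1:
        static_assert(~0 == -1);
    }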

When the "bit" is not binary, the translation from "bit" representation
to number is, again, different. Arguably, bit manipulation would have to
operate on those non-binary "bits" still, or become unapplicable and
removed (e.g. bitwise AND/OR/XOR/NOT operators), but their result would
translate to different numbers than on a binary machine.
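If it helps to visualize, here is a sketch in ordinary C++ that
simulates hypothetical base-3 "trits" in software; the per-digit minimum
used as the AND analogue is one convention from ternary logic, an
assumption on my part rather than anything defined by real hardware or
the standard:

    #include <array>
    #include <cstdio>

    // Hypothetical: decompose a non-negative value into base-3 digits.
    std::array<int, 8> to_trits(unsigned v) {
        std::array<int, 8> t{};
        for (int i = 0; i < 8; ++i) { t[i] = v % 3; v /= 3; }
        return t;
    }

    unsigned from_trits(const std::array<int, 8>& t) {
        unsigned v = 0;
        for (int i = 7; i >= 0; --i) v = v * 3 + t[i];
        return v;
    }

    // One ternary-logic convention: "AND" as per-digit minimum.
    unsigned tritwise_and(unsigned a, unsigned b) {
        auto ta = to_trits(a), tb = to_trits(b);
        std::array<int, 8> r{};
        for (int i = 0; i < 8; ++i) r[i] = ta[i] < tb[i] ? ta[i] : tb[i];
        return from_trits(r);
    }

    int main() {
        // Binary AND: 6 & 5 == 4.
        // Tritwise "AND": 6 is 20 in base 3, 5 is 12 in base 3;
        // the per-digit minimum is 10 in base 3, i.e. 3.
        std::printf("binary: %u, ternary: %u\n", 6u & 5u,
                    tritwise_and(6u, 5u));
    }

Same operand values, different result (4 versus 3), which is exactly the
translation problem described above.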

IMHO, too much of standard C and C++ would have to be thrown out by
this. Today's software at all levels relies heavily on binary
representations, including standards like POSIX (think of the various
flag bits in almost any API; see the sketch below). Most communication
protocols, including IP, are also built around binary representation.
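For example, a typical POSIX call encodes independent options as
distinct bits, combined with | and recovered with &; a minimal sketch
(the file path is hypothetical):

    #include <fcntl.h>   // POSIX open() and the O_* flag bits

    int main() {
        // Each flag occupies its own bit, so | combines options
        // losslessly and & tests each one independently.
        int flags = O_WRONLY | O_CREAT | O_TRUNC;
        int fd = open("/tmp/example.txt", flags, 0644);
        bool truncating = (flags & O_TRUNC) != 0;  // bit test
        (void)fd;
        (void)truncating;
    }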
My suggestion to the OP: unless this is a purely academic interest,
think of designing a co-processor or accelerator that runs alongside a
traditional binary CPU. That way you would be able to develop your own
language (which could be a cut-down and adapted version of C++) for the
programs running on the co-processor while still benefiting from the
existing infrastructure running on the traditional CPU.

Received on 2021-03-07 03:13:59