I strongly agree with Jake’s feedback. My experience in control theory and signal processing is likewise that vector orientation is very important, and that when implementing algorithms, type checking that distinguishes row from column vectors has helped to catch bugs at compile time.
To me, orientation is important: a row vector and a column vector are very different things, with different properties.
There's a practical reason for this, rather than just a philosophical one. How you find the dual of a vector depends on the vector space: for real-valued vectors, the dual is the plain transpose, which is nice and cheap.
For complex-valued vectors, at least in the domains of quantum mechanics, control theory and signal processing, the conjugate transpose is used. Otherwise you get weird inner product behaviour, such as a self-inner-product having imaginary components. This is not cheap: it is either O(N) at the point of transpose, or it requires some fun logic down the line whose behaviour differs depending on whether a parameter is transposed or not. I regret that I am not intimately familiar with the linear algebra proposal, so I don't know whether this is already covered.
My concern is that implicitly transposing the wrong way is a source of avoidable error, and for me that trumps any argument about tedium.
With my physics hat on, consider this a vote for "orientation matters", unless someone points out that I'm being a fool and that this is all already covered.
On Mon, 1 May 2023, 12:55 Guy Davidson via Lib-Ext, <email@example.com> wrote:
I'm just putting together a first pass at the wording for P1385, A proposal to add linear algebra support to the C++ standard library. If you look at the latest revision you will see that at Kona in November and at Issaquah in February I addressed SG6 and LEWG about withdrawing the vector class entirely and simply offering a matrix class, where a vector is a special case of a matrix with a single row or column. There were no objections to this approach.
While there were no objections raised in the meeting, others have come in, and I want to use the reflectors to gather opinion about the matter. The heart of the problem is: what does the vector product signify? Is it an inner or outer product? Is vector orientation significant?
With my mathematician's hat on, multiplying a row vector by a column vector is an inner product, yielding a scalar, provided both vectors have the same number of elements. Appearing much more rarely, multiplying an m-element column vector by an n-element row vector is an outer product, yielding an m-by-n matrix (square only when m equals n).
However, in the domains where I most make use of linear algebra, what is more significant is that the vector-matrix product treats the vector as a row vector, the matrix-vector product treats the vector as a column vector, and the vector-vector product treats the operation as an inner product. Orientation is irrelevant, and transposing vectors is a tedious waste of time.
The proposal is, under the hood, a way of bringing together element types, element conversions, extents and kernels into a single type, borrowing parts from mdspan. All of these items are template parameters. My current intent is to restore row and column vectors to the proposal, with additional motivation, and to supply orientation-significant kernels. I intend to offer orientation-agnostic kernels in a subsequent paper (unless someone beats me to it, which you are all really, REALLY welcome to do) to enable this use case as standard, allowing users to decide whether orientation is significant. Example code to follow in the fullness of time.
Speaking of which, time is tight: I have a fortnight until the Varna paper deadline, so I would appreciate any commentary sooner rather than later.
Lib-Ext mailing list
Searchable archives: http://lists.isocpp.org/lib-ext/2023/05/index.php