Date: Mon, 1 May 2023 21:02:31 +0100
Hi Guy,
To me, orientation is important: a row vector and a column vector are very
different things, with different properties.
There's a practical reason for this, rather than just a philosophical one.
How you find the dual of a vector depends on the vector space: for
real-valued vectors, the transpose is used - which is nice and cheap.
For complex-valued vectors, at least in the domains of quantum mechanics,
control theory and signal processing, the conjugate transpose is used.
Otherwise, you get weird inner product behaviour, such as a
self-inner-product having imaginary components. This is not cheap - it's
either O(N) at the point of transpose or it requires some fun logic down
the line that differs in behaviour depending on whether a parameter is
transposed or not. I regret that I am not intimately familiar with the
linear algebra proposal, so I don't know whether this is already covered.
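To make the point concrete, here is a minimal sketch in plain standard C++
(entirely separate from the proposed interface; the element values are
picked arbitrarily). A self-inner-product built from a plain transpose can
come out with an imaginary part, while the conjugate transpose always gives
a real, non-negative value:

    // Sketch only: plain standard C++, nothing to do with P1385's interface.
    #include <complex>
    #include <iostream>
    #include <vector>

    int main() {
        using C = std::complex<double>;
        std::vector<C> v{{1.0, 2.0}, {3.0, -1.0}};

        C plain{};      // sum of v[i] * v[i]        (transpose only)
        C conjugated{}; // sum of conj(v[i]) * v[i]  (conjugate transpose)
        for (const C& x : v) {
            plain      += x * x;
            conjugated += std::conj(x) * x;
        }

        std::cout << "transpose:           " << plain      << '\n'; // (5,-2)
        std::cout << "conjugate transpose: " << conjugated << '\n'; // (15,0)
    }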
My concern is that implicitly transposing the wrong way is a source of
avoidable error, and for me that trumps any argument about tedium.
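For what it's worth, the kind of design I have in mind is one where
orientation lives in the type, so that forgetting the dual is a compile
error rather than a silently wrong answer. A rough sketch - the types and
names below are invented purely for illustration and are not taken from
P1385:

    // Illustration only: invented types, not the proposed interface.
    #include <complex>
    #include <cstddef>

    template <class T, std::size_t N> struct row_vector    { T data[N]; };
    template <class T, std::size_t N> struct column_vector { T data[N]; };

    // The inner product only accepts row * column, so the caller must have
    // formed the dual (transpose or conjugate transpose) explicitly.
    template <class T, std::size_t N>
    T operator*(const row_vector<T, N>& a, const column_vector<T, N>& b) {
        T r{};
        for (std::size_t i = 0; i < N; ++i) r += a.data[i] * b.data[i];
        return r;
    }

    // The dual of a complex column vector conjugates as it transposes.
    template <class T, std::size_t N>
    row_vector<std::complex<T>, N> dual(const column_vector<std::complex<T>, N>& v) {
        row_vector<std::complex<T>, N> r{};
        for (std::size_t i = 0; i < N; ++i) r.data[i] = std::conj(v.data[i]);
        return r;
    }

    int main() {
        column_vector<std::complex<double>, 2> v{{{1.0, 2.0}, {3.0, -1.0}}};
        auto norm_sq = dual(v) * v;   // (15,0): real, as an inner product should be
        // auto oops  = v * v;        // does not compile: orientation mismatch
        (void)norm_sq;
    }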
With my physics hat on, consider this a vote for "orientation matters",
unless someone points out that I'm being a fool and that this is all
already covered.
Kind regards,
Jake
On Mon, 1 May 2023, 12:55 Guy Davidson via Lib-Ext, <
lib-ext_at_[hidden]> wrote:
> Hello everyone
>
> I'm just putting together a first pass at the wording for P1385, A
> proposal to add linear algebra support to the C++ standard library
> <https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2022/p1385r7.pdf>.
> If you look at the latest revision, you will infer that at Kona in November
> and at Issaquah in February I addressed SG6 and LEWG about withdrawing the
> vector class entirely and simply offering a matrix class, where a vector is
> a special case of a matrix, with a single row or column. There were no
> objections to this approach.
>
> While no objections were raised in those meetings, some have since come in,
> and I want to use the reflectors to gather opinion about the matter. The
> heart of the problem is: what does the vector product signify? Is it an
> inner or outer product? Is vector orientation significant?
>
> With my mathematician's hat on, multiplying a row vector by a column
> vector is an inner product, yielding a scalar value if both vectors have
> the same number of elements. Appearing much more rarely, multiplying a
> column vector by a row vector is an outer product, yielding a matrix
> (square only when the two vectors have the same number of elements).
>
> However, in the domains where I most make use of linear algebra, what is
> more significant is that the vector-matrix product treats the vector as a
> row vector, the matrix-vector product treats the vector as a column vector,
> and the vector-vector product treats the operation as an inner product.
> Orientation is irrelevant, and transposing vectors is a tedious waste of
> time.
>
> The proposal is, under the hood, a way of bringing together element types,
> element conversions, extents and kernels into a single type, borrowing
> parts from mdspan. All of these items are template parameters. My current
> intent is to restore row and column vectors to the proposal, with
> additional motivation, and supply Orientation-Significant Kernels. I intend
> to offer Orientation-Agnostic Kernels in a subsequent paper (unless someone
> beats me to it, which you are all really, REALLY welcome to do) to enable
> this use-case as standard, allowing users to decide whether orientation is
> significant. Example code to follow in the fullness of time.
>
> Speaking of which, time is tight: I have a fortnight until the Varna paper
> deadline, so I would appreciate any commentary sooner rather than later.
>
> Cheers,
> G
> _______________________________________________
> Lib-Ext mailing list
> Lib-Ext_at_[hidden]
> Subscription: https://lists.isocpp.org/mailman/listinfo.cgi/lib-ext
> Searchable archives: http://lists.isocpp.org/lib-ext/2023/05/index.php
>
Received on 2023-05-01 20:02:45