Add unconjugated dot product dotu #27677

Open · wants to merge 2 commits into master

Conversation

@ranocha (Member) commented Jun 20, 2018

This came up in JuliaLang/LinearAlgebra.jl#496 and #8300. Sometimes, it is useful to have an unconjugated dot product, computing the sum of the elementwise product of two vectors x and y as in dotu(x, y) = sum(x .* y). As suggested in #27401 (comment), I've added this function in a new PR.

In contrast to dot (cf. #27401), dotu is not defined recursively. Thus, it can compute the unconjugated dot product of a vector of matrices and a vector of scalars, an operation that comes up frequently with Pauli matrices, as in

julia> σ = [[0 1; 1 0], [0 -im; im 0], [1 0; 0 -1]]; n = [1, 2, 3]; dotu(σ, n)
2×2 Array{Complex{Int64},2}:
 3+0im   1-2im
 1+2im  -3+0im
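
A minimal, generic sketch of the intended behaviour (not necessarily the exact code in this PR) would be

# minimal sketch; the code in this PR may differ
dotu(x, y) = sum(xi * yi for (xi, yi) in zip(x, y))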

@antoine-levitt (Contributor) commented Jun 20, 2018

Sorry to play the devil's advocate, but is this really useful? In the cases where it's needed, it's not really a dot product (i.e. in your example, n and σ belong to different spaces), nor even a duality pairing, but simply a sum over containers. Also, when reading dotu it looks like the only difference from dot is that it's unconjugated, whereas it's also non-recursive. Maybe sumproduct is clearer, and closer to the mathematical structure and intent?

(Much more useful as far as complex numbers are concerned, and trickier to implement efficiently - especially now that reinterpret sucks - is real(dot(x,y)). If another variant of dot is added, maybe it should be this one.)
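
For concreteness, one way to compute that quantity via the underlying real data is sketched below; realdot is a made-up name and dense ComplexF64 vectors are assumed.

using LinearAlgebra

# made-up name; computes real(dot(x, y)) by viewing the complex data as
# interleaved real/imaginary parts, so no complex products are formed
realdot(x::Vector{ComplexF64}, y::Vector{ComplexF64}) =
    dot(reinterpret(Float64, x), reinterpret(Float64, y))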

@antoine-levitt (Contributor)

Conceptually and computationally, σ n is closer to a matrix multiplication (reduction) than a dot product. This pleads in favor of defining *(Vector{Array},Vector) and doing σ*n, although that might be a bit too convoluted.
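
A rough, purely illustrative sketch of what such a method could look like (not part of this PR):

import Base: *

# illustrative only; contracts a vector of matrices with a vector of scalars
*(A::AbstractVector{<:AbstractMatrix}, x::AbstractVector{<:Number}) =
    sum(Ai * xi for (Ai, xi) in zip(A, x))

# σ * n would then give the same 2×2 matrix as dotu(σ, n) above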

@ranocha (Member, Author) commented Jun 20, 2018

Sorry to play the devil's advocate, but is this really useful? In the cases where it's needed, it's not really a dot product (i.e. in your example, n and σ belong to different spaces), nor even a duality pairing, but simply a sum over containers. Also, when reading dotu it looks like the only difference from dot is that it's unconjugated, whereas it's also non-recursive. Maybe sumproduct is clearer, and closer to the mathematical structure and intent?

Thanks for your comments! I've chosen the name dotu because it has been used in JuliaLang/LinearAlgebra.jl#496 and #8300 for a function with this behaviour and is also used in BLAS. Another application of a function such as dotu has been mentioned in JuliaLang/LinearAlgebra.jl#496. Thus, it might be good to have a function like this.

Another possibility might be to have a function dotu that acts recursively, similar to dot, but as an unconjugated dot product (a symmetric bilinear form), plus something like sumproduct, as suggested by you, with the current behaviour of dotu.
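
To make the distinction concrete, here is a rough sketch of the two variants; the names are just placeholders:

# non-recursive variant (the behaviour of dotu in this PR)
sumproduct(x, y) = sum(xi * yi for (xi, yi) in zip(x, y))

# recursive variant: like dot, but without conjugation
dotu(x::AbstractArray, y::AbstractArray) = sum(dotu(xi, yi) for (xi, yi) in zip(x, y))
dotu(x, y) = x * y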

@andyferris (Member)

My thoughts: I think BLAS is generally really bad at naming operations clearly, so I wouldn’t look there for inspiration ;) I don’t quite think of this as a kind of inner product and it seems we’ve recently settled on dot to mean inner product. The example is more-or-less a tensor contraction and we should develop a convincing story for multilinear algebra over time, but unfortunately I really don’t see dotu as being a part of that story.

Also, given the operation is basically a mapreduce(...), I wonder if we should maybe consider productsum over sumproduct?

As an aside, it would be great if mapreduce(*, +, a, b) actually worked... (no multiple input containers like map).
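
If mapreduce did accept multiple collections like map does, the operation discussed here could be written as follows (hypothetical; it did not work at the time of this comment):

# hypothetical, assuming a multi-argument mapreduce
dotu(x, y) = mapreduce(*, +, x, y)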

@ranocha (Member, Author) commented Jun 20, 2018

Thanks for your comments. I've opened this PR to provide an opportunity to discuss such a function - sometimes it is easier to discuss an existing implementation and possible extensions/variations. Judging from the comments in JuliaLang/LinearAlgebra.jl#496 and #8300, there seems to be some interest in a function such as dotu. I see several possibilities:

  1. No such function is added to Base/LinearAlgebra. It could live somewhere else, or people interested in it implement it on their own.
  2. A function as implemented in this PR is added to LinearAlgebra. It could be named dotu, productsum, sumproduct, or something else.
  3. Some general multilinear algebra structures and tensor contractions are added, including the functionality of dotu.
  4. mapreduce is extended to allow multiple iterables as arguments. As @andyferris wrote, it would be really great to have this functionality (a possible workaround is sketched below this list). I asked a similar question a few days ago (https://discourse.julialang.org/t/method-of-mapreduce-with-multiple-arguments/11388).
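
As a workaround with the single-iterator mapreduce that exists today, one could reduce over a zipped iterator; this is just a sketch and sumproduct is a placeholder name:

# sketch of a workaround using a single zipped iterator
sumproduct(x, y) = mapreduce(t -> t[1] * t[2], +, zip(x, y))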

@StefanKarpinski (Member)

As an aside, it would be great if mapreduce(*, +, a, b) actually worked... (no multiple input containers like map).

That would be lovely! 👍

@oschulz (Contributor) commented Nov 8, 2019

I came across this issue yesterday in the context of JuliaStats/StatsBase.jl#534 and JuliaStats/StatsBase.jl#518. The issue there is that for things like a weighted sum, the recursion in dot gets in the way, for example when summing vector-valued elements. A dotu in LinearAlgebra would be very helpful, I believe, since there would likely be optimized CPU and GPU implementations, etc. - packages like StatsBase could just call it and not worry about the internals.
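
As a small illustration of the issue (a sketch, not StatsBase code):

x = [[1.0, 2.0], [3.0, 4.0]]            # vector-valued observations
w = [0.5, 0.5]                          # weights
sum(wi * xi for (wi, xi) in zip(w, x))  # == [2.0, 3.0]; what dotu(w, x) would return
# the recursive dot does not give this; it tries to take dot products of the elements themselves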

@nalimilan (Member)

@StefanKarpinski Now that mapreduce allows multiple arguments, would it be acceptable to make mapreduce(*, +, a, b) call optimized BLAS methods when possible? AFAICT that's OK since the documentation for reduce says that the associativity is not guaranteed. But that would allow floating point results to differ slightly depending on the array type. Also BLAS may use tricks that go beyond associativity (FMA...).
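
For reference, a rough sketch of what calling the optimized BLAS routine could look like for complex strided vectors (BLAS.dotu is the unconjugated BLAS-level dot product):

using LinearAlgebra

x = rand(ComplexF64, 10^6); y = rand(ComplexF64, 10^6)
BLAS.dotu(length(x), x, 1, y, 1)  # unconjugated BLAS dot product (zdotu)
mapreduce(*, +, x, y)             # generic pairwise reduction; may differ in the last bits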

@oschulz (Contributor) commented Nov 8, 2019

Now that mapreduce allows multiple arguments, would it be acceptable to make mapreduce(*, +, a, b) call optimized BLAS methods when possible?

I would argue against that - currently, mapreduce enjoys the benefit of increased precision due to pairwise summation. After all the work that went into that (#199, #4039, ...), it would be sad to give it up again. There's also not necessarily a performance benefit - if one has to use 64-bit instead of 32-bit floats to make the sum stable, one may lose more performance than is gained (depending on the application).
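
A small illustration of the precision point (just a sketch; the exact numbers depend on the machine):

x = fill(1.0f-4, 10_000_000)
sum(x)       # pairwise reduction used for arrays; stays close to the exact value 1000
foldl(+, x)  # strictly left-to-right accumulation in Float32; noticeably less accurate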

Labels: linear algebra · 8 participants