Mathematical notation for elementwise multiplication


Discussion Overview

The discussion revolves around the notation for elementwise multiplication of vectors and matrices, particularly in the context of mathematical representation and clarity in communication. Participants explore various ways to express this operation mathematically, including the use of LaTeX code.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant inquires about an established mathematical notation for elementwise multiplication of vectors, referencing MATLAB's syntax.
  • Another participant suggests that multiplication between vectors should be understood as componentwise, raising the question of how to express this clearly in writing.
  • A participant proposes writing column vectors as \boldsymbol{a}=(a_i) and \boldsymbol{b}=(b_i), and suggests that elementwise multiplication could be denoted \boldsymbol{c}=(a_i b_i), asking whether this notation is unambiguous.
  • Some participants express skepticism about the necessity of a specific notation for elementwise multiplication, arguing that it may undermine the definitions of vectors and matrices.
  • One participant discusses the practical application of elementwise multiplication in analyzing data series, providing an example involving economic outputs and growth rates.
  • Another participant introduces the concept of the direct sum in vector spaces and its notation, which leads to further clarification on its meaning.
  • There is a mention of the Hadamard product as a potential related concept, although its established status is questioned.
  • Several participants emphasize that while inner products have accepted notation, the notation for elementwise multiplication is not standard and may require explicit clarification in communication.

Areas of Agreement / Disagreement

Participants express differing views on the necessity and clarity of notation for elementwise multiplication. While some agree on the need for clear communication, others argue that the lack of established notation reflects its limited use in mathematical contexts. The discussion remains unresolved regarding the best approach to denote this operation.

Contextual Notes

Participants note the potential for misinterpretation in the proposed notations and highlight the dependence on context and definitions when discussing vector operations. The discussion also touches on the distinction between elementwise multiplication and inner products, indicating a need for careful language in mathematical writing.

Mårten
Hi,

I wonder if anyone knows of a mathematically established way of writing elementwise multiplication between two vectors? In MATLAB you can write A .* B to indicate that you want to multiply the vectors A and B elementwise. In my case, I have two column vectors, A and B, and I want to multiply them elementwise and get the result in a vector C. A Latex code for this, if it exists, would also be appreciated.

/Mårten
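For reference, one symbol that is sometimes used for the elementwise (Hadamard) product is a circled dot, typed \odot in LaTeX math mode. It is not universally standard, so it still needs to be defined in the surrounding text; a minimal sketch (\boldsymbol requires the amsmath package, \odot is available in plain math mode):

\boldsymbol{c} = \boldsymbol{a} \odot \boldsymbol{b}, \qquad c_i = a_i b_i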
 
I would just make a note up front that multiplication between vectors should be taken to be componentwise.
 
What do you mean by "up front"? That I should explain this in the surrounding text? Okay, that's a possibility, but shouldn't there be a way to express it mathematically?

I was figuring that if you can denote a matrix as A=(a_{ij}), which I've seen in several texts, then it ought to be possible to write a column vector as \boldsymbol{a}=(a_i) and another column vector as \boldsymbol{b}=(b_i). And then elementwise multiplication could possibly be written as a new column vector \boldsymbol{c}=(a_i b_i)?

Is that an unambiguous way to express what I want, or could it be misinterpreted?

Or what do others here think? Some more suggestions?

EDIT: 1) I suppose the same should apply for matrices, so for any given pair of i,j, the elementwise multiplication of the matrices (a_{ij}) and (b_{ij}) ought to be denoted as (a_{ij}b_{ij}) (spelled out below).
2) As I've understood it, (a_{ij}) denotes a matrix, and a_{ij} denotes an individual element in that matrix. What distinguishes these two ways of writing is the parentheses in the former case. Please correct me if I'm wrong.
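Spelled out entrywise, the proposed convention would read (just restating the proposal above as displayed formulas):

\boldsymbol{c}=(a_i b_i), \qquad C=(a_{ij}b_{ij}),

where c_i = a_i b_i and c_{ij} = a_{ij}b_{ij}; the parentheses assemble the whole vector or matrix from the indexed entries, while a_i and a_{ij} alone refer to a single entry.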
 
Isn't there someone who can confirm that what I'm saying above is correct? Or are there any other ways to denote elementwise multiplication?
 
There simply isn't enough use for "elementwise multiplication" of vectors for it to have a specific notation. That, together with "elementwise multiplication" of matrices, would pretty much negate the whole point of defining vectors and matrices.
 
Mårten said:
write a column vector as \boldsymbol{a}=(a_i) and another column vector as \boldsymbol{b}=(b_i). And then elementwise multiplication could possibly be written as a new column vector \boldsymbol{c}=(a_i b_i)?

Is that an unambiguous way to express what I want, or could it be misinterpreted?
Sure, this is unambiguous. Since this is heavily basis-dependent, it is not a usual thing to do. If you want to be fancy: given a finite-dimensional vector space V over F and a fixed basis (e1,..,en), we have
V=\bigoplus_{i=1}^n \mathbb{F}e_i

For every j, define the "projection"

P_j:\bigoplus_{i=1}^n \mathbb{F}e_i\to \mathbb{F}
(\lambda_1e_1,..,\lambda_ne_n)\mapsto \lambda_j.

Then your product of the vectors a and b is the vector c which satisfies

P_i(c)=P_i(a)P_i(b).

Of course, this is just what you said.

Or let T_i be the linear map whose matrix (w.r.t. this basis) has all entries zero, except for entry at row i, column i which has a 1. So it acts as

T_i(a)=T_i(a_1,..,a_n)=(0,..0,a_i,0,...,0).

Then c is the vector

c=\sum_{i=1}^n \left<T_i(a),T_i(b)\right>e_i.

where <..,..> denotes the inner product.
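A quick numerical sketch of this construction (Python with NumPy; the helper name is just for illustration), checking that it reproduces the componentwise product:

import numpy as np

def componentwise_via_projections(a, b):
    # T_i keeps only the i-th component; <.,.> is the standard inner product,
    # so <T_i(a), T_i(b)> = a_i * b_i, and we reassemble c = sum_i a_i b_i e_i.
    n = len(a)
    e = np.eye(n)              # standard basis e_1, ..., e_n (as rows)
    c = np.zeros(n)
    for i in range(n):
        T_i_a = e[i] * a       # (0, ..., 0, a_i, 0, ..., 0)
        T_i_b = e[i] * b
        c += np.dot(T_i_a, T_i_b) * e[i]
    return c

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
print(componentwise_via_projections(a, b))   # [ 4. 10. 18.]
print(a * b)                                  # NumPy's own elementwise product, same result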
 
HallsofIvy said:
That, together with "elementwise multiplication" of matrices, would pretty much negate the whole point of defining vectors and matrices.
Hm... I'm still a beginner in linear algebra. What would you say is the whole point of defining vectors and matrices, then?

I find it pretty common, when you deal with different data series, that you want to do elementwise multiplication. For instance, you could have a vector describing the economic output of different industries and another vector describing the growth rates of those industries. To get the new output after the growth, you multiply the two vectors elementwise.
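For instance, with some hypothetical numbers (a NumPy sketch, where * is already elementwise):

import numpy as np

output = np.array([120.0, 80.0, 45.0])   # current output, one entry per industry (hypothetical values)
growth = np.array([1.02, 1.05, 0.98])    # growth factor per industry (hypothetical values)

# Elementwise product: the new output of each industry, kept separate
# (nothing is summed, so this is not an inner product).
new_output = output * growth
print(new_output)   # about [122.4, 84.0, 44.1]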

Landau said:
Sure, this is unambiguous. Since this is heavily basis-dependent, it is not a usual thing to do. If you want to be fancy: given a finite-dimensional vector space V over F and a fixed basis (e1,..,en), we have
V=\bigoplus_{i=1}^n \mathbb{F}e_i
I haven't seen that plus symbol before, what does it mean?

Anyhow, thanks for your replies, both of you! :smile:
 
Hi Mårten, the symbol means "direct sum". The sum of two subspaces, with underlying sets U and V, is another vector space defined as

U + V = \left \{ u + v : u \in U, v \in V \right \}.

If the intersection U \cap V = \left \{ \textbf{0} \right \}, that is, if U and V have no nonzero vectors in common, then the sum is called the direct sum and can be written U \oplus V. For subspaces with underlying sets V_1, V_2, etc.,

\bigoplus_{i=1}^n V_i := V_1 + V_2 + \dots + V_n
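For a concrete instance, with the standard basis of \mathbb{R}^3 (just a sketch):

\mathbb{R}^3 = \mathbb{R}e_1 \oplus \mathbb{R}e_2 \oplus \mathbb{R}e_3, \qquad (x,y,z) = xe_1 + ye_2 + ze_3,

and each vector decomposes into these pieces in exactly one way, which is what the "direct" in direct sum guarantees.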
 
Mårten said:
Hm... I'm still a beginner in linear algebra. What would you say is the whole point of defining vectors and matrices, then?
Basically, linear combinations. If we define "elementwise" multiplication without the usual addition of those products, we lose the intermingling of the different parts.

I find it pretty common, when you deal with different data series, that you want to do elementwise multiplication. For instance, you could have a vector describing the economic output of different industries and another vector describing the growth rates of those industries. To get the new output after the growth, you multiply the two vectors elementwise.
But then you add those values, so what you're doing is an "inner product", not just elementwise multiplication.
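A short illustration of that distinction (a NumPy sketch with hypothetical values; * is elementwise, np.dot also sums the products):

import numpy as np
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
print(a * b)         # elementwise product: [ 4. 10. 18.]
print(np.dot(a, b))  # inner product: 32.0, the sum of those entries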

I haven't seen that plus symbol before, what does it mean?

Anyhow, thanks for your replies, both of you! :smile:
 
Rasalhague said:
Hi Mårten, the symbol means "direct sum". The sum of two subspaces, with underlying sets U and V, is another vector space defined as

U + V = \left \{ u + v : u \in U, v \in V \right \}.

If the intersection U \cap V = \left \{ \textbf{0} \right \}, that is, if U and V have no nonzero vectors in common, then the sum is called the direct sum and can be written U \oplus V. For subspaces with underlying sets V_1, V_2, etc.,

\bigoplus_{i=1}^n V_i := V_1 + V_2 + \dots + V_n
Okay, I think I understand now, sort of.

HallsofIvy said:
But then you add those values, so what you're doing is an "inner product", not just elementwise multiplication.
No, I'm actually not adding those values; I'm looking at them separately, as I'm interested in the individual output of each separate industry, not the sum of the output from all the industries. I would lose information if I summed them up. If I'm not misunderstanding you...
 
Mårten said:
No, I'm actually not adding those values; I'm looking at them separately, as I'm interested in the individual output of each separate industry, not the sum of the output from all the industries. I would lose information if I summed them up. If I'm not misunderstanding you...

So here's the deal: the inner product is very common in mathematical work (and there's accepted notation for it), and what you describe is not (and so there's no established notation). That's not an unusual situation at all; many mathematical articles start out by defining some notation that's useful for the task at hand but not standard. So if you want to do componentwise multiplication, just say up front that when you write a<whatever>b, that's what you mean. As far as I know, there isn't any notation for this that everyone will recognize without an explanation on your part.
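For instance, one way to do that in a LaTeX document (a sketch; the \odot symbol and the \ew macro name are just one possible choice, not a standard):

\documentclass{article}
\usepackage{amsmath}
% Macro for the chosen elementwise-product symbol, so it can be changed in one place.
\newcommand{\ew}{\odot}
\begin{document}
\emph{Notation.} For column vectors $\boldsymbol{a}=(a_i)$ and $\boldsymbol{b}=(b_i)$ of the
same length, we write $\boldsymbol{a} \ew \boldsymbol{b}$ for their elementwise (componentwise)
product, i.e.\ the vector whose $i$-th entry is $a_i b_i$.
\end{document}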
 
Okay, I will do something like that.

Thanks all for the replies! :smile:
 
