Understanding the Properties of Dot Product: Is it Truly Associative?

Discussion Overview

The discussion revolves around the properties of the dot product, specifically questioning its associativity. Participants explore mathematical representations and implications of the dot product in relation to vector operations, including the outer product and matrix multiplication. The scope includes theoretical and mathematical reasoning.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant cites a source stating that the dot product is not associative but suggests a relationship involving matrix multiplication.
  • Another participant clarifies the distinction between the dot product (a scalar) and the outer product (a matrix), emphasizing their different properties.
  • A participant presents specific vectors and calculations to argue that the two expressions involving the dot product and outer product are not generally equal.
  • Further clarification is provided regarding the matrix representation of the outer product.
  • One participant acknowledges a misunderstanding regarding the representation of vectors as row vectors, which contributed to their confusion.
  • A more abstract discussion is introduced about the dual space of finite-dimensional vector spaces and how it relates to the dot product and linear transformations.

Areas of Agreement / Disagreement

Participants initially differ on how to read the Wikipedia identity: the original poster argues, via a component calculation, that the two expressions are not equal, while others show the identity does hold once the outer product is treated as a matrix rather than a scalar. The disagreement is resolved when the original poster recognizes that the error came from treating the vectors as row vectors.

Contextual Notes

Participants rely on specific definitions and representations of vectors and operations, which may affect their conclusions. The discussion includes assumptions about the nature of the vectors involved and their dimensionality.

Apteronotus
If you look up dot product on Wikipedia (http://en.wikipedia.org/wiki/Dot_product), under 'Properties' it states the following:

"The dot product is not associative, however with the help of the matrix-multiplication one can derive:
[tex] \left(\vec{a} \cdot \vec{b}\right) \vec{c} = \left(\vec{c}\vec{b}^{T}\right)\vec{a}[/tex]"​

I simply don't see how this can be true for any vector [tex]\vec{c}[/tex]. Is it?

Thanks in advance,
 
Hi Apteronotus! :smile:

[itex]\vec{c}^{T}\vec{b}[/itex] is a scalar, [itex]\vec{c}\cdot\vec{b}[/itex], but [itex]\vec{c}\vec{b}^{T}[/itex] is a matrix.

[tex]\left(\vec{c}\vec{b}^{T}\right)\vec{a} = \sum_{i}\sum_{j}\left(\vec{c}\vec{b}^{T}\right)_{ij} a_{j}\vec{e}_{i} = \sum_{i}\sum_{j} c_{i} b_{j} a_{j}\vec{e}_{i} = \left(\sum_{j} b_{j} a_{j}\right)\sum_{i} c_{i}\vec{e}_{i} = \left(\vec{b}\cdot\vec{a}\right)\vec{c}[/tex] :smile:
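The sum manipulation above can be checked numerically. This is a minimal sketch with arbitrarily chosen example vectors (the specific values are illustrative, not from the thread), confirming that [itex]\left(\vec{c}\vec{b}^{T}\right)\vec{a} = \left(\vec{b}\cdot\vec{a}\right)\vec{c}[/itex]:

```python
# Verify (c b^T) a = (b . a) c for some example 3-vectors.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def outer(u, v):
    # u v^T: the matrix with entries (u v^T)_ij = u_i * v_j
    return [[ui * vj for vj in v] for ui in u]

def matvec(M, v):
    return [dot(row, v) for row in M]

a = [1.0, 2.0, 3.0]
b = [4.0, -1.0, 0.5]
c = [-2.0, 0.0, 5.0]

lhs = matvec(outer(c, b), a)        # (c b^T) a
rhs = [dot(b, a) * ci for ci in c]  # (b . a) c
assert all(abs(x - y) < 1e-12 for x, y in zip(lhs, rhs))
```

The key point the derivation makes is visible in `matvec(outer(c, b), a)`: the i-th component is [itex]\sum_j c_i b_j a_j[/itex], which factors into the scalar [itex]\vec{b}\cdot\vec{a}[/itex] times [itex]c_i[/itex].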
 
[itex]\vec{c}^{T}\vec{b}[/itex], the dot product, is often called the "inner product" and is a scalar, while [itex]\vec{b}\vec{c}^{T}[/itex] is called the "outer product" and is a matrix.
 
Thank you both for your replies. But I still think there may be a problem.
Consider three vectors [tex]\vec{a}, \vec{b}[/tex] and [tex]\vec{c}[/tex], where

[tex]\vec{a}=\left[a_{1}, a_{2}, a_{3}\right][/tex]
[tex]\vec{b}=\left[b_{1}, b_{2}, b_{3}\right][/tex]

and
[tex]\vec{c}=\left[c_{1}, c_{2}, c_{3}\right][/tex]

then
[tex] \left(\vec{a}\cdot\vec{b}\right)\vec{c}=\left(a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\right)\left[c_{1}, c_{2}, c_{3}\right][/tex]

and

[tex] \left(\vec{c}\vec{b}^{T}\right)\vec{a}=\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)\left[a_{1}, a_{2}, a_{3}\right][/tex]

Now these vectors are not necessarily equal. Consider the first element of each.
It is clear that in general,

[tex] \left(a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\right)c_{1}\neq\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)a_{1}[/tex]
 
Apteronotus said:
[tex] \left(\vec{c}\vec{b}^{T}\right)\vec{a}=\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)\left[a_{1}, a_{2}, a_{3}\right][/tex]

But [itex]\vec{c}\vec{b}^{T}[/itex] is not [itex]\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)[/itex] …

it's a matrix.
 
Specifically, it is the matrix
[tex] cb^T = \begin{bmatrix} b_1 c_1 & b_2 c_1 & b_3 c_1 \\ b_1 c_2 & b_2 c_2 & b_3 c_2 \\ b_1 c_3 & b_2 c_3 & b_3 c_3 \end{bmatrix}.[/tex]
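A short sketch, with arbitrarily chosen example vectors, of building this matrix entry by entry. Note that row [itex]i[/itex] of [itex]cb^T[/itex] is just [itex]c_i[/itex] times the row vector [itex]b^T[/itex], which is why the matrix has rank one:

```python
# Build the outer product c b^T explicitly: entry (i, j) is c_i * b_j.
b = [1, 2, 3]
c = [4, 5, 6]

cbT = [[ci * bj for bj in b] for ci in c]
# Each row is c_i times [b_1, b_2, b_3]:
assert cbT == [[4, 8, 12], [5, 10, 15], [6, 12, 18]]
```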
 
Yes! Thank you both very much.
My error was in taking the vectors as row vectors.
Thanks again,
 
This is more abstract and more advanced than the "inner product" but if you are using the "outer product", you may want to think about this.

The "dual" of a finite dimensional vector space, V (the space of linear functionals from V to the base field), is isomorphic to V, with a "natural" isomorphism: given a basis [itex]\{e_1, e_2, \cdot\cdot\cdot, e_n\}[/itex], map each basis vector [itex]e_i[/itex] to the functional [itex]f_{e_i}[/itex] that maps [itex]e_i[/itex] to 1 and all other [itex]e_j[/itex] to 0. Then extend it to the entire space by "linearity": if [itex]v= a_1e_1+ a_2e_2+ \cdot\cdot\cdot+ a_ie_i+ \cdot\cdot\cdot+ a_ne_n[/itex], then [itex]f_{e_i}(v)= a_1 f_{e_i}(e_1)+ a_2f_{e_i}(e_2)+\cdot\cdot\cdot+ a_if_{e_i}(e_i)+ \cdot\cdot\cdot+ a_nf_{e_i}(e_n)[/itex][itex]= a_1(0)+ a_2(0)+ \cdot\cdot\cdot+ a_i(1)+ \cdot\cdot\cdot+ a_n(0)= a_i[/itex].

Since that is an isomorphism, given any vector u, the isomorphism maps it to a functional [itex]f_u[/itex]. Given any two vectors, u and v, the functional [itex]f_u[/itex] takes v to the real number [itex]u\cdot v[/itex], their dot product as defined in that particular basis. On the other hand, [itex]u f_v[/itex] can be interpreted as a linear transformation that maps each vector, w, into the vector [itex](f_v(w))u[/itex], a numeric multiple of u. If we agree to write vectors as column matrices, say
[tex]v= \left[\begin{array}{c}a_1 \\ a_2\\ \cdot \\ \cdot \\ \cdot \\ a_n\end{array}\right][/tex]
and functionals in the dual space as row matrices, say
[tex]f_u= \left[\begin{array}{ccccc}b_1 & b_2 & \cdot\cdot\cdot & b_n\end{array}\right][/tex]
Then the operation of the functional, [itex]f_u[/itex] on v is the matrix product
[tex]\left[\begin{array}{ccccc}b_1 & b_2 & \cdot\cdot\cdot & b_n\end{array}\right]\left[\begin{array}{c}a_1 \\ a_2\\ \cdot \\ \cdot \\ \cdot \\ a_n\end{array}\right][/tex]
while the linear transformation corresponding to [itex]v f_u[/itex] is given by the matrix product

[tex]\left[\begin{array}{c}a_1 \\ a_2\\ \cdot \\ \cdot \\ \cdot \\ a_n\end{array}\right]\left[\begin{array}{ccccc}b_1 & b_2 & \cdot\cdot\cdot & b_n\end{array}\right][/tex]
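The contrast between these two matrix products can be sketched numerically. This is a minimal example with arbitrarily chosen vectors (the values are illustrative only): the row-times-column product [itex]f_u(v)[/itex] is a scalar, while the column-times-row product [itex]v f_u[/itex] is a rank-one matrix that sends any w to [itex]f_u(w)[/itex] times v:

```python
# Row-times-column gives a scalar; column-times-row gives a linear map.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a = [1, 2, 3]   # a column vector v
b = [4, 5, 6]   # a functional f_u, written as a row vector

scalar = dot(b, a)                            # [b1 b2 b3][a1 a2 a3]^T
matrix = [[ai * bj for bj in b] for ai in a]  # v f_u, an n x n matrix

w = [7, 8, 9]
lhs = [dot(row, w) for row in matrix]   # (v f_u) applied to w
rhs = [dot(b, w) * ai for ai in a]      # f_u(w) times v
assert lhs == rhs and scalar == 32
```

This is the same distinction made earlier in the thread: the inner product collapses to a number, while the outer product acts as a matrix on other vectors.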
 