Understanding the Properties of Dot Product: Is it Truly Associative?

Apteronotus
If you look up dot product on Wikipedia (http://en.wikipedia.org/wiki/Dot_product), under 'Properties' it states the following:

"The dot product is not associative, however with the help of the matrix-multiplication one can derive:
\left(\vec{a} \cdot \vec{b}\right) \vec{c} = \left(\vec{c}\,\vec{b}^{T}\right)\vec{a}"

I simply don't see how this can be true for any vector \vec{c}. Is it?

Thanks in advance,
 
Hi Apteronotus! :smile:

\vec{c}^{\,T}\vec{b} is a scalar, \vec{c}\cdot\vec{b}, but \vec{c}\,\vec{b}^{T} is a matrix.

\left(\vec{c}\,\vec{b}^{T}\right)\vec{a} = \sum_i \sum_j \left(\vec{c}\,\vec{b}^{T}\right)_{ij} a_j \vec{e}_i
= \sum_i \sum_j c_i b_j a_j \vec{e}_i
= \left(\sum_j b_j a_j\right) \sum_i c_i \vec{e}_i = \left(\vec{b}\cdot\vec{a}\right)\vec{c} :smile:
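The index computation above can be checked numerically; here is a quick sketch, assuming NumPy is available (the particular numbers are arbitrary):

```python
import numpy as np

# Column vectors, shapes (3, 1), matching the thread's a, b, c
a = np.array([[1.0], [2.0], [3.0]])
b = np.array([[4.0], [5.0], [6.0]])
c = np.array([[7.0], [8.0], [9.0]])

# a . b as the matrix product a^T b -- a 1x1 matrix, i.e. a scalar
dot_ab = (a.T @ b).item()

lhs = dot_ab * c         # (a . b) c
rhs = (c @ b.T) @ a      # (c b^T) a, where c b^T is a 3x3 matrix

print(np.allclose(lhs, rhs))  # True: the two sides agree
```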
 
\vec{c}^{\,T}\vec{b}, the dot product, is often called the "inner product" and is a scalar, while \vec{b}\,\vec{c}^{\,T} is called the "outer product" and is a matrix.
 
Thank you both for your replies. But I still think there may be a problem.
Consider three vectors \vec{a}, \vec{b} and \vec{c}, where

\vec{a}=\left[a_{1}, a_{2}, a_{3}\right]
\vec{b}=\left[b_{1}, b_{2}, b_{3}\right]

and
\vec{c}=\left[c_{1}, c_{2}, c_{3}\right]

then
\left(\vec{a}\cdot\vec{b}\right)\vec{c}=\left(a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\right)\left[c_{1}, c_{2}, c_{3}\right]

and

\left(\vec{c}\vec{b}^{T}\right)\vec{a}=\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)\left[a_{1}, a_{2}, a_{3}\right]

Now these vectors are not necessarily equal. Consider the first element of each.
It is clear that in general,

\left(a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\right)c_{1}\neq\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)a_{1}
 
Apteronotus said:
\left(\vec{c}\vec{b}^{T}\right)\vec{a}=\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)\left[a_{1}, a_{2}, a_{3}\right]

But \vec{c}\,\vec{b}^{T} is not \left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right) …

it's a matrix.
 
Specifically, it is the matrix
cb^T = \begin{bmatrix}
b_1 c_1 &amp; b_2 c_1 &amp; b_3 c_1 \\
b_1 c_2 &amp; b_2 c_2 &amp; b_3 c_2 \\
b_1 c_3 &amp; b_2 c_3 &amp; b_3 c_3
\end{bmatrix}.
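As a concrete check, this matrix is what NumPy's outer product produces; a small sketch with arbitrary numbers:

```python
import numpy as np

b = np.array([4.0, 5.0, 6.0])
c = np.array([7.0, 8.0, 9.0])

# Outer product c b^T: entry (i, j) is c_i * b_j, matching the matrix shown
M = np.outer(c, b)
print(M[0, 2] == c[0] * b[2])  # True

# In contrast, the inner product c^T b is a single scalar
s = np.inner(c, b)
print(s)  # 122.0
```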
 
Yes! Thank you both very much.
My error was in taking the vectors as row vectors.
Thanks again,
 
This is more abstract and more advanced than the "inner product", but if you are using the "outer product" you may want to think about this.

The "dual" of a finite-dimensional vector space V (the space of linear functionals from V to the base field) is isomorphic to V, with a "natural" isomorphism: given a basis \{e_1, e_2, \cdots, e_n\}, map each basis vector e_i to the functional f_{e_i} that maps e_i to 1 and all other e_j to 0. Then extend it to the entire space by "linearity": if v = a_1e_1 + a_2e_2 + \cdots + a_ie_i + \cdots + a_ne_n, then f_{e_i}(v) = a_1 f_{e_i}(e_1) + a_2 f_{e_i}(e_2) + \cdots + a_i f_{e_i}(e_i) + \cdots + a_n f_{e_i}(e_n) = a_1(0) + a_2(0) + \cdots + a_i(1) + \cdots + a_n(0) = a_i.

Since that is an isomorphism, it maps any vector u to a functional f_u. Given any two vectors u and v, the functional f_u takes v to the real number u \cdot v, their dot product as defined in that particular basis. On the other hand, u f_v can be interpreted as a linear transformation that maps each vector w into the vector (f_v(w))u, a numeric multiple of u. If we agree to write vectors as column matrices, say
v= \left[\begin{array}{c}a_1 \\ a_2 \\ \vdots \\ a_n\end{array}\right]
and functionals in the dual space as row matrices, say
f_u= \left[\begin{array}{cccc}b_1 &amp; b_2 &amp; \cdots &amp; b_n\end{array}\right]
then the operation of the functional f_u on v is the matrix product
\left[\begin{array}{cccc}b_1 &amp; b_2 &amp; \cdots &amp; b_n\end{array}\right]\left[\begin{array}{c}a_1 \\ a_2 \\ \vdots \\ a_n\end{array}\right]
while the linear transformation corresponding to v f_u is given by the matrix product

\left[\begin{array}{c}a_1 \\ a_2 \\ \vdots \\ a_n\end{array}\right]\left[\begin{array}{cccc}b_1 &amp; b_2 &amp; \cdots &amp; b_n\end{array}\right]
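The row-times-column versus column-times-row distinction can be sketched numerically, assuming NumPy (the components here are arbitrary):

```python
import numpy as np

# v as a column matrix, f_u as a row matrix, following the convention above
v = np.array([[1.0], [2.0], [3.0]])    # components a_1 .. a_n
f_u = np.array([[4.0, 5.0, 6.0]])      # components b_1 .. b_n

# Row times column: the functional applied to v -- a 1x1 matrix (a scalar)
scalar = f_u @ v
print(scalar.shape)  # (1, 1)

# Column times row: the rank-one linear transformation v f_u
T = v @ f_u
print(T.shape)       # (3, 3)

# Applied to any vector w, it returns f_u(w) times v
w = np.array([[7.0], [8.0], [9.0]])
print(np.allclose(T @ w, (f_u @ w).item() * v))  # True
```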
 