# Dot Product properties

1. Dec 11, 2008

### Apteronotus

If you look up dot product on Wikipedia (http://en.wikipedia.org/wiki/Dot_product), under 'Properties' it states the following:

"The dot product is not associative, however with the help of the matrix-multiplication one can derive:
$$\left(\vec{a} \cdot \vec{b}\right) \vec{c} = \left(\vec{c}\vec{b}^{T}\right)\vec{a}$$"

I simply don't see how this can be true for any vector $$\vec{c}$$. Is it?

2. Dec 11, 2008

### tiny-tim

Hi Apteronotus!

$\vec{c}^{T}\vec{b}$ is a scalar, $\vec{c}\cdot\vec{b}$, but $\vec{c}\vec{b}^{T}$ is a matrix.

$$\left(\vec{c}\vec{b}^{T}\right)\vec{a} = \sum_{i}\sum_{j}\left(\vec{c}\vec{b}^{T}\right)_{ij}a_{j}\vec{e}_{i} = \sum_{i}\sum_{j}c_{i}b_{j}a_{j}\vec{e}_{i} = \left(\sum_{j}b_{j}a_{j}\right)\sum_{i}c_{i}\vec{e}_{i} = \left(\vec{b}\cdot\vec{a}\right)\vec{c}$$
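The index derivation above is easy to confirm numerically. A minimal sketch in plain Python (the helper names `dot`, `outer`, `matvec` and the sample vectors are my own, not from the thread):

```python
# Numerical check that (c b^T) a = (a . b) c, using plain Python lists.

def dot(u, v):
    """Dot product of two vectors given as lists."""
    return sum(ui * vi for ui, vi in zip(u, v))

def outer(u, v):
    """Outer product u v^T: the matrix with entries u[i] * v[j]."""
    return [[ui * vj for vj in v] for ui in u]

def matvec(M, x):
    """Matrix-vector product M x."""
    return [dot(row, x) for row in M]

a = [1.0, 2.0, 3.0]
b = [4.0, -1.0, 0.5]
c = [2.0, 0.0, -3.0]

lhs = [dot(a, b) * ci for ci in c]   # (a . b) c, a scalar times c
rhs = matvec(outer(c, b), a)         # (c b^T) a, a matrix times a

print(lhs)  # [7.0, 0.0, -10.5]
print(rhs)  # [7.0, 0.0, -10.5]
```

Both sides agree, as the summation argument predicts.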

3. Dec 11, 2008

### HallsofIvy

$\vec{c}^{T}\vec{b}$, the dot product, is often called the "inner product" and is a scalar, while $\vec{b}\vec{c}^{T}$ is called the "outer product" and is a matrix.
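The shape difference is the whole point: the same two vectors give a number one way and a matrix the other. A small illustration (helper names and sample values are mine):

```python
# Inner product c^T b: a single number. Outer product b c^T: an n x n matrix.

def inner(u, v):
    """Inner product u^T v: a scalar."""
    return sum(ui * vi for ui, vi in zip(u, v))

def outer(u, v):
    """Outer product u v^T: a matrix with entries u[i] * v[j]."""
    return [[ui * vj for vj in v] for ui in u]

b = [1, 2, 3]
c = [4, 5, 6]

print(inner(c, b))   # 32 -- a scalar
print(outer(b, c))   # [[4, 5, 6], [8, 10, 12], [12, 15, 18]] -- a 3x3 matrix
```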

4. Dec 22, 2008

### Apteronotus

Thank you both for your replies. But I still think there may be a problem.
Consider three vectors $$\vec{a}, \vec{b}$$ and $$\vec{c}$$, where

$$\vec{a}=\left[a_{1}, a_{2}, a_{3}\right]$$
$$\vec{b}=\left[b_{1}, b_{2}, b_{3}\right]$$

and
$$\vec{c}=\left[c_{1}, c_{2}, c_{3}\right]$$

then
$$\left(\vec{a}\cdot\vec{b}\right)\vec{c}=\left(a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\right)\left[c_{1}, c_{2}, c_{3}\right]$$

and

$$\left(\vec{c}\vec{b}^{T}\right)\vec{a}=\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)\left[a_{1}, a_{2}, a_{3}\right]$$

Now these vectors are not necessarily equal. Consider the first element of each.
It is clear that in general,

$$\left(a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}\right)c_{1}\neq\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)a_{1}$$

5. Dec 22, 2008

### tiny-tim

But $\vec{c}\vec{b}^{T}$ is not $\left(c_{1}b_{1}+c_{2}b_{2}+c_{3}b_{3}\right)$ …

it's a matrix.

6. Dec 23, 2008

Specifically, it is the matrix
$$cb^T = \begin{bmatrix} b_1 c_1 & b_2 c_1 & b_3 c_1 \\ b_1 c_2 & b_2 c_2 & b_3 c_2 \\ b_1 c_3 & b_2 c_3 & b_3 c_3 \end{bmatrix}.$$
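That matrix can be built entry by entry: $(cb^T)_{ij} = c_i b_j$, so row $i$ is just $c_i$ times the row vector $b^T$. A quick sketch with made-up sample values:

```python
# Building the matrix c b^T displayed above: entry (i, j) is c[i] * b[j],
# so every row is a multiple of b^T (the matrix has rank one).

b = [1, 2, 3]
c = [10, 20, 30]

cbT = [[ci * bj for bj in b] for ci in c]
print(cbT)
# [[10, 20, 30], [20, 40, 60], [30, 60, 90]]

# Row i really is c[i] times b.
for i, row in enumerate(cbT):
    assert row == [c[i] * bj for bj in b]
```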

7. Dec 25, 2008

### Apteronotus

Yes! Thank you both very much.
My error was in taking the vectors as row vectors.
Thanks again,

8. Dec 27, 2008

### HallsofIvy

This is more abstract and more advanced than the "inner product", but if you are using the "outer product", you may want to think about the following.

The "dual" of a finite-dimensional vector space, V (the space of linear functionals from V to the base field), is isomorphic to V, with a "natural" isomorphism: given a basis $\{e_1, e_2, \cdot\cdot\cdot, e_n\}$, map each basis vector $e_i$ to the functional $f_{e_i}$ that maps $e_i$ to 1 and every other $e_j$ to 0. Then extend it to the entire space by "linearity": if $v= a_1e_1+ a_2e_2+ \cdot\cdot\cdot+ a_ie_i+ \cdot\cdot\cdot+ a_ne_n$, then $f_{e_i}(v)= a_1 f_{e_i}(e_1)+ a_2f_{e_i}(e_2)+\cdot\cdot\cdot+ a_if_{e_i}(e_i)+ \cdot\cdot\cdot+ a_nf_{e_i}(e_n)= a_1(0)+ a_2(0)+ \cdot\cdot\cdot+ a_i(1)+ \cdot\cdot\cdot+ a_n(0)= a_i$.

Since that is an isomorphism, given any vector $u$, the isomorphism maps it to a functional $f_u$. Given any two vectors, $u$ and $v$, the functional $f_u$ takes $v$ to the real number $u\cdot v$, their dot product as defined in that particular basis. On the other hand, $v f_u$ can be interpreted as a linear transformation that maps each vector, $w$, into the vector $(f_u(w))v$, a numeric multiple of $v$. If we agree to write vectors as column matrices, say
$$v= \left[\begin{array}{c}a_1 \\ a_2\\ \cdot \\ \cdot \\ \cdot \\ a_n\end{array}\right]$$
and functionals in the dual space as row matrices, say
$$f_u= \left[\begin{array}{ccccc}b_1 & b_2 & \cdot\cdot\cdot & b_n\end{array}\right]$$
Then the operation of the functional $f_u$ on $v$ is the matrix product
$$\left[\begin{array}{ccccc}b_1 & b_2 & \cdot\cdot\cdot & b_n\end{array}\right]\left[\begin{array}{c}a_1 \\ a_2\\ \cdot \\ \cdot \\ \cdot \\ a_n\end{array}\right]$$
while the linear transformation corresponding to $v f_u$ is given by the matrix product

$$\left[\begin{array}{c}a_1 \\ a_2\\ \cdot \\ \cdot \\ \cdot \\ a_n\end{array}\right]\left[\begin{array}{ccccc}b_1 & b_2 & \cdot\cdot\cdot & b_n\end{array}\right]$$
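The column-times-row product above is a rank-one matrix, and it acts on a vector $w$ exactly as described: first apply the functional $f_u$ to get the number $f_u(w)$, then scale $v$ by it. A small sketch under those conventions (the variable names and sample values are mine):

```python
# The rank-one matrix v f_u (column v times row f_u) acts on w by
# computing the scalar f_u(w) and scaling v by it: (v f_u) w = (f_u(w)) v.

def matvec(M, x):
    """Matrix-vector product M x."""
    return [sum(mij * xj for mij, xj in zip(row, x)) for row in M]

v = [1, 2, 3]      # column vector, entries a_1 ... a_n
f_u = [4, 5, 6]    # functional, written as a row vector b_1 ... b_n
w = [1, 0, 2]

vfu = [[vi * fj for fj in f_u] for vi in v]        # the matrix v f_u

lhs = matvec(vfu, w)                               # (v f_u) w
scalar = sum(fj * wj for fj, wj in zip(f_u, w))    # f_u(w), a number
rhs = [scalar * vi for vi in v]                    # (f_u(w)) v

print(lhs, rhs)  # both [16, 32, 48]
```

This is the same structure as the $\vec{c}\vec{b}^{T}$ matrix earlier in the thread: the identity $(\vec{a}\cdot\vec{b})\vec{c} = (\vec{c}\vec{b}^{T})\vec{a}$ is just this scale-by-a-functional picture written out.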
