# Dot and Cross Product from Rotation Matrix

1. Oct 7, 2012

### Hetware

I'm just learning this LaTeX formatting, so it's not ideal.

I was trying to explore the geometrical significance of the cross product when I happened upon an interesting observation. I've seen things like this before, but never had time to really examine them.

I define two vectors:

$\pmb{A}=A^x\hat{\pmb{i}}+A^y\hat{\pmb{j}}=A\left(\cos(\alpha)\hat{\pmb{i}}+\sin(\alpha)\hat{\pmb{j}}\right)$

$\pmb{B}=B^x\hat{\pmb{i}}+B^y\hat{\pmb{j}}=B\left(\cos(\beta)\hat{\pmb{i}}+\sin(\beta)\hat{\pmb{j}}\right)$

Express a new basis with the x axis aligned with the first vector:

$\hat{\pmb{i}}=\cos(\alpha)\hat{\pmb{i}}'-\sin(\alpha)\hat{\pmb{j}}'$

$\hat{\pmb{j}}=\sin(\alpha)\hat{\pmb{i}}'+\cos(\alpha)\hat{\pmb{j}}'$

Write the second vector in terms of the new basis and fiddle with it some:

$\pmb{B}=B^x\left(\cos(\alpha)\hat{\pmb{i}}'-\sin(\alpha)\hat{\pmb{j}}'\right)+B^y\left(\sin(\alpha)\hat{\pmb{i}}'+\cos(\alpha)\hat{\pmb{j}}'\right)$

$=B\left(\cos(\beta)\left(\cos(\alpha)\hat{\pmb{i}}'-\sin(\alpha)\hat{\pmb{j}}'\right)+\sin(\beta)\left(\sin(\alpha)\hat{\pmb{i}}'+\cos(\alpha)\hat{\pmb{j}}'\right)\right)$

$=B\left(\cos(\beta)\cos(\alpha)\hat{\pmb{i}}'-\cos(\beta)\sin(\alpha)\hat{\pmb{j}}'+\sin(\beta)\sin(\alpha)\hat{\pmb{i}}'+\sin(\beta)\cos(\alpha)\hat{\pmb{j}}'\right)$

$=B\left[(\cos(\beta)\cos(\alpha)+\sin(\beta)\sin(\alpha))\hat{\pmb{i}}'+(-\cos(\beta)\sin(\alpha)+\sin(\beta)\cos(\alpha))\hat{\pmb{j}}'\right]$

$=B\left[\cos(\beta-\alpha)\hat{\pmb{i}}'+\sin(\beta-\alpha)\hat{\pmb{j}}'\right]$

$=B^{x'}\hat{\pmb{i}}'+B^{y'}\hat{\pmb{j}}'$

Notice that the $B^{y'}$ component is just the magnitude of the cross product divided by the magnitude of $\pmb{A}$ (and the $B^{x'}$ component is the dot product divided by the magnitude of $\pmb{A}$).

Using my own definition of a "complete product", see that the first component is the dot product and the second component is the cross product:

$\pmb{AB}=AB\left[(\cos(\beta)\cos(\alpha)+\sin(\beta)\sin(\alpha))\hat{\pmb{i}}'+(-\cos(\beta)\sin(\alpha)+\sin(\beta)\cos(\alpha))\hat{\pmb{j}}'\right]$

$\pmb{AB}=\left(A^xB^x+A^yB^y\right)\hat{\pmb{i}}'+\left(A^xB^y-A^yB^x\right)\hat{\pmb{j}}'$
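
As a quick sanity check (my own sketch with arbitrary sample magnitudes and angles, not part of the original post), the two components of this "complete product" can be compared numerically against the usual dot product and the $z$-component of the cross product:

```python
import math

A_mag, alpha = 2.0, 0.3    # |A| and its angle
B_mag, beta = 3.0, 1.1     # |B| and its angle

Ax, Ay = A_mag * math.cos(alpha), A_mag * math.sin(alpha)
Bx, By = B_mag * math.cos(beta), B_mag * math.sin(beta)

# Components of the "complete product" AB in the primed basis
i_comp = A_mag * B_mag * math.cos(beta - alpha)
j_comp = A_mag * B_mag * math.sin(beta - alpha)

dot = Ax * Bx + Ay * By        # A . B
cross_z = Ax * By - Ay * Bx    # z-component of A x B

print(abs(i_comp - dot) < 1e-12)      # True
print(abs(j_comp - cross_z) < 1e-12)  # True
```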

I believe this works in 3 dimensions. Has anybody seen a development of this line of reasoning regarding vector products?

2. Oct 7, 2012

### Muphrid

You can hat vectors with `\hat`--e.g. `\hat u` -> $\hat u$.

You've discovered a product called the geometric product--essentially, at least. The geometric product comes from an extension of vector algebra called geometric algebra, but it is easy enough to explain on its own. You have an orthonormal basis $i, j$. The geometric product is defined as follows:

$$ii = jj = 1, \quad ij = -ji$$

That's it. This captures both aspects of the dot product (symmetry when the vectors are the same) and the cross product (antisymmetry when the vectors are different).

Now, you might be asking, what is $ij$ then? We call it a bivector, and we say that it represents a plane, just as a vector represents a line. The associativity of the geometric product makes it easy to work with such things.

In geometric algebra, we also work with a "wedge" product instead of the cross product because the cross product doesn't exist in dimensions other than 3 or 7. The geometric product can then be seen as the sum of the dot and wedge products.

$ab = a \cdot b + a \wedge b$ for vectors $a, b$.
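
The defining relations $ii = jj = 1$, $ij = -ji$ are enough to compute with. As an illustrative sketch (my own, not from the post), a 2D multivector can be stored as a 4-tuple of its scalar, $e_1$, $e_2$, and $e_1 e_2$ parts, and the full geometric product written out by hand:

```python
def gp(a, b):
    """Geometric product of 2D multivectors stored as
    (scalar, e1 part, e2 part, e1e2 part), using e1 e1 = e2 e2 = 1
    and e1 e2 = -e2 e1."""
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return (s1*s2 + x1*x2 + y1*y2 - b1*b2,   # scalar
            s1*x2 + x1*s2 - y1*b2 + b1*y2,   # e1
            s1*y2 + y1*s2 + x1*b2 - b1*x2,   # e2
            s1*b2 + b1*s2 + x1*y2 - y1*x2)   # e1e2 (bivector)

e1 = (0.0, 1.0, 0.0, 0.0)
e2 = (0.0, 0.0, 1.0, 0.0)
print(gp(e1, e1))   # (1.0, 0.0, 0.0, 0.0): e1 e1 = 1
print(gp(e1, e2))   # (0.0, 0.0, 0.0, 1.0): e1 e2 = the unit bivector

# For two vectors, the scalar part is the dot product and the
# bivector part is the wedge product.
a = (0.0, 2.0, 1.0, 0.0)
b = (0.0, -1.0, 3.0, 0.0)
ab = gp(a, b)
print(ab[0])   # 1.0 = 2*(-1) + 1*3   (a . b)
print(ab[3])   # 7.0 = 2*3 - 1*(-1)   (a ^ b coefficient)
```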

What you wrote would be written in terms of geometric algebra as the following:

$$B = B A A^{-1} = (B \cdot A) A^{-1} + (B \wedge A) A^{-1}$$

Geometric algebra makes it reasonable to define the inverse of a vector. $A^{-1} = A/(A\cdot A) = A/A^2$, for instance. By the associativity of the geometric product, we can always multiply a vector by $A A^{-1} = 1$ in Euclidean space and then group by associativity to do as you have done. This is valid in any number of dimensions, too. Two vectors always define a plane, and vectors in a plane can always be decomposed into sines and cosines with respect to some reference direction.
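
The decomposition $B = (B \cdot A) A^{-1} + (B \wedge A) A^{-1}$ can be checked numerically in 2D. The following sketch (my own, with arbitrary sample components) works out each piece explicitly; the two pieces are the familiar projection and rejection of $B$ with respect to $A$:

```python
# Sample vectors A = (1, 1), B = (-1, 3), chosen so all arithmetic is exact.
Ax, Ay = 1.0, 1.0
Bx, By = -1.0, 3.0
A2 = Ax*Ax + Ay*Ay            # A . A, so A^{-1} = A / A2

s = Bx*Ax + By*Ay             # B . A (a scalar)
w = Bx*Ay - By*Ax             # B ^ A (coefficient of the unit bivector)

# (B . A) A^{-1} is the projection of B onto A:
p_par = (s * Ax / A2, s * Ay / A2)
# (B ^ A) A^{-1}: the bivector w e1e2 times (Ax e1 + Ay e2)/A2
# gives w (Ay e1 - Ax e2) / A2, the rejection of B from A:
p_perp = (w * Ay / A2, -w * Ax / A2)

B_rebuilt = (p_par[0] + p_perp[0], p_par[1] + p_perp[1])
print(B_rebuilt)   # (-1.0, 3.0), recovering B
```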

Geometric algebra gives great insights into ideas of vector algebra that usually have to be presented without proof. It's a difficult subject, though, and not something I recommend for the faint of heart. Even without geometric algebra, rest assured you've made a valid observation.

3. Oct 8, 2012

### Hetware

I'm way done thinking for the night, but has this anything to do with quaternions? $ii=jj=1$, $ij=-ji$ looks familiar. Good night.

4. Oct 8, 2012

### chiro

With regard to quaternions, you have $i^2 = j^2 = k^2 = -1$, and if you let the scalar parts be zero, the quaternion product gives the cross product as the vector part of the result (and minus the dot product as the scalar part).
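
A quick check of this (my own sketch, not from the post): multiplying two pure quaternions with the Hamilton product yields $-(a \cdot b)$ in the scalar slot and $a \times b$ in the vector slot:

```python
def qmul(p, q):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 + y1*w2 + z1*x2 - x1*z2,
            w1*z2 + z1*w2 + x1*y2 - y1*x2)

a = (0.0, 1.0, 2.0, 3.0)   # pure quaternion carrying the vector (1, 2, 3)
b = (0.0, 4.0, 5.0, 6.0)   # pure quaternion carrying the vector (4, 5, 6)
w, x, y, z = qmul(a, b)
print(w)          # -32.0 = -(1*4 + 2*5 + 3*6), minus the dot product
print((x, y, z))  # (-3.0, 6.0, -3.0), the cross product (1,2,3) x (4,5,6)
```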

5. Oct 8, 2012

### Muphrid

Indeed, geometric algebra has an intimate connection to complex numbers and quaternions.

Let me use $e_x, e_y, e_z$ as the basis vectors for the rest of this post. It will make it easier for me to draw connections between objects in geometric algebra and the complex number $i$ or the quaternionic imaginaries $i,j,k$ this way.

Consider the 2d plane as a real vector space. In geometric algebra, we can form the object $e_x e_y$. Observe that $(e_x e_y)^2 = e_x e_y e_x e_y = -e_x e_y e_y e_x = -1$. Thus, this unit bivector acts like an imaginary unit, and it's not uncommon to call this object $i$, just like the complex number counterpart. This is a big thing, for it means that all of complex analysis can be rewritten in terms of the real(-valued) geometric algebra and its associated calculus. For instance, take a function $f(x) = u(x) + i v(x)$ and find its vector derivative:

$$\nabla f(x) = \left( e_x \frac{\partial}{\partial x} + e_y \frac{\partial}{\partial y} \right) (u + iv) = e_x \left(\frac{\partial u}{\partial x} - \frac{\partial v}{\partial y} \right) + e_y \left( \frac{\partial u}{\partial y} + \frac{\partial v}{\partial x} \right)$$

This may look familiar--it's the Cauchy-Riemann condition for complex differentiability if we set $\nabla f = 0$. We also know this is an integrability condition, and this is just the tip of the iceberg as far as connecting complex analysis to vector analysis. A further investigation would show that the Cauchy Integral Theorem is just an extension of Stokes' Theorem, for example.
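
A small numeric illustration (mine, not from the post): for the holomorphic function $f(z) = z^2$, i.e. $u = x^2 - y^2$ and $v = 2xy$, both components of $\nabla f$ vanish, as the Cauchy-Riemann equations require:

```python
def u(x, y): return x*x - y*y   # Re z^2
def v(x, y): return 2.0*x*y     # Im z^2

h = 1e-6
x0, y0 = 0.7, -1.3   # an arbitrary sample point

def dx(f): return (f(x0 + h, y0) - f(x0 - h, y0)) / (2*h)
def dy(f): return (f(x0, y0 + h) - f(x0, y0 - h)) / (2*h)

ex_part = dx(u) - dy(v)   # coefficient of e_x in grad f
ey_part = dy(u) + dx(v)   # coefficient of e_y in grad f
print(abs(ex_part) < 1e-6, abs(ey_part) < 1e-6)   # True True
```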

But you asked about quaternions, and let me get closer to that. The identification of $i = e_x e_y$ allows us to use "complex" exponentials to do rotations. I submit that a vector $a$ goes to its rotated counterpart $\underline R(a)$ by

$$\underline R(a) = \exp(-i \theta/2) a \exp(i \theta/2)$$

where $\exp(-i \theta/2) = \cos (\theta/2) - i \sin(\theta/2)$ is called a rotor. This sandwiching is not strictly necessary in 2d--it can be simplified to $a \exp(i \theta)$ as you'd expect--but it is necessary in 3d. In 3d, there's more than just one bivector which squares to -1. $e_y e_z, e_z e_x$ also act as "imaginary" units. To complete the connection to quaternions, we set

\begin{align*} i &\equiv -e_y e_z \\ j &\equiv -e_z e_x \\ k &\equiv -e_x e_y \end{align*}

We can check that these definitions are consistent with the usual ones for the quaternion algebra:

$$ij = (-e_y e_z) (-e_z e_x) = e_y e_x = - e_x e_y = k$$

The other definitions of quaternion multiplication check as well. In geometric algebra, we often just work directly with bivectors like $e_z e_x$ without giving them special names ($i, j, k$). Another advantage of GA is that we can use this "complex" exponential form (that is, using "rotors") with spherical or cylindrical or otherwise arbitrary basis vectors to do rotations. Additionally, the rotor form of performing rotations makes obvious how to extend this formalism to higher dimensions.
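
These sign checks are easy to automate. Here's a compact sketch (my own; the bitmask encoding of basis blades is a standard trick, not something from this thread) that multiplies signed basis blades of the 3D geometric algebra and confirms the quaternion table:

```python
def gp_blade(sa, ma, sb, mb):
    """Geometric product of two signed basis blades of 3D Euclidean GA.
    A blade is (sign, bitmask) with bit 0 = e_x, bit 1 = e_y, bit 2 = e_z;
    e.g. (-1, 0b110) is -e_y e_z."""
    a, swaps = ma >> 1, 0
    while a:                  # count transpositions needed to sort the factors
        swaps += bin(a & mb).count('1')
        a >>= 1
    sign = sa * sb * (-1 if swaps % 2 else 1)
    return sign, ma ^ mb      # repeated vectors cancel since e_i e_i = 1

i = (-1, 0b110)   # -e_y e_z
j = (+1, 0b101)   # -e_z e_x = +e_x e_z in canonical order
k = (-1, 0b011)   # -e_x e_y

print(gp_blade(*i, *j) == k)        # True:  ij = k
print(gp_blade(*i, *i) == (-1, 0))  # True:  i^2 = -1
print(gp_blade(*j, *k) == i)        # True:  jk = i
```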

6. Oct 8, 2012

### Hetware

Wow! This is like going home after years of being away and asking about the girlfriend who was the reason you left, only to find out she's still single and talks about you all the time.

I know this stuff is important. It's used in differential geometry, and is pragmatically valuable in 3D graphics. It also has its own intellectual appeal, but I can't get into it now.

If I recall correctly, the story goes something like: once upon a time there was a high school teacher named Grassmann ... And there was an Irishman on a bridge, or something like that.

7. Oct 8, 2012

### D H

Staff Emeritus
Rhetorical question: Why do you think we use $\hat \imath$, $\hat \jmath$, and $\hat k$ as unit vectors?

Historically, the dot product and cross product as used by physicists on vectors in ℝ3 are much more closely associated with the quaternions than with the concept of vector spaces in general, bivectors, and the wedge product. That close association is why one still sees $\hat \imath$, $\hat \jmath$, and $\hat k$ as the canonical unit vectors. That our 3-dimensional vectors are just one kind of vector, that the cross product is closely aligned with the wedge product: those were after-the-fact clean-ups.

One last thing on the cross product: it's pretty much unique to ℝ3. The wedge product of two vectors is a bivector rather than a vector, and the only ℝn where one can pretend that a bivector is a vector in that space is ℝ3. In ℝn a bivector has n(n-1)/2 independent elements, and n(n-1)/2 = n only for n=3.

Another generalization of the cross product is to use the matrix form of the cross product to define the product of n-1 vectors in ℝn: for example, the product of 3 vectors in ℝ4, or 4 vectors in ℝ5. Yet another generalization is to go back to how the cross product was originally defined, which was in terms of the quaternion product. How about octonions, sedenions, and even larger generalizations of the complex numbers? With the octonions there is a seven dimensional cross product, kind of: it's plagued with problems and isn't particularly useful. A 15 dimensional cross product based on the sedenion product just doesn't work. So there's a nice three dimensional cross product, a sorta/kinda seven dimensional cross product, and that's it.
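
The counting argument is easy to verify by brute force (my own throwaway check):

```python
# A bivector in R^n has n*(n-1)/2 independent components; it can be
# reinterpreted as a vector of the same space only when that count equals n.
matches = [n for n in range(2, 1000) if n * (n - 1) // 2 == n]
print(matches)   # [3]
```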

8. Oct 8, 2012

### Hetware

I use them because they are common in physics books, form a basis, and make it easier for me to do 3D graphics. I actually use:

U0={0,0,0} (The unit null vector - yes that's a joke),
U1={1,0,0},
U2={0,1,0},
U3={0,0,1}.

I seem to recall some discussion along those lines. I believe it was Gibbs who gave us vectors as we know them in physics.

I have read that the curl of a vector field can be written as $\frac{\partial u_i}{\partial x_k}-\frac{\partial u_k}{\partial x_i}$, but I have never gone very far with that. I believe that is true of all cross products; in other words, they can be written as antisymmetric second-order tensors. I assume that is very similar to what the wedge product is.
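
A quick finite-difference check of that identity (my own sketch, with an arbitrary sample field and point): the entries of the antisymmetric tensor $T_{ik} = \partial u_i/\partial x_k - \partial u_k/\partial x_i$ do reproduce the curl components.

```python
# Sample field u = (y^2, z^2, x^2), whose curl is (-2z, -2x, -2y).
def u(x, y, z):
    return (y*y, z*z, x*x)

h = 1e-6
p = (1.0, 2.0, 3.0)   # an arbitrary sample point

def partial(i, k):
    """du_i/dx_k at p, by central difference."""
    q1, q2 = list(p), list(p)
    q1[k] += h
    q2[k] -= h
    return (u(*q1)[i] - u(*q2)[i]) / (2 * h)

# The curl components are entries of T_ik = du_i/dx_k - du_k/dx_i:
curl = (partial(2, 1) - partial(1, 2),    # (curl u)_x
        partial(0, 2) - partial(2, 0),    # (curl u)_y
        partial(1, 0) - partial(0, 1))    # (curl u)_z
print(curl)   # close to (-6.0, -2.0, -4.0) at p = (1, 2, 3)
```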

This is what I believe to be (one of) the oldest formulations of the axioms of a vector space. It's the first one I learned:

Two vectors $\pmb{a}$ and $\pmb{b}$ uniquely determine a vector $\pmb{a} + \pmb{b}$ as their sum. A number $\lambda$ and a vector $\pmb{a}$ uniquely define a vector $\lambda\pmb{a}$, which is called "$\lambda$ times $\pmb{a}$" (multiplication). These operations are subject to the following laws:-
($\alpha$) Addition-
(1) $\pmb{a} + \pmb{b} = \pmb{b} + \pmb{a}$ (Commutative Law).
(2) $(\pmb{a} + \pmb{b}) + \pmb{c} = \pmb{a} + (\pmb{b} + \pmb{c})$ (Associative Law).
(3) If $\pmb{a}$ and $\pmb{c}$ are two vectors, then there is one and only one value of $\pmb{x}$ for which the equation $\pmb{a} + \pmb{x} = \pmb{c}$ holds. It is called the difference between $\pmb{c}$ and $\pmb{a}$ and signifies $\pmb{c}-\pmb{a}$ (Possibility of Subtraction).

($\beta$) Multiplication-
(1) $(\lambda + \mu )\pmb{a} = (\lambda \pmb{a}) + (\mu \pmb{a})$ (First Distributive Law).
(2) $\lambda (\mu \pmb{a}) = (\lambda \mu )\pmb{a}$ (Associative Law).
(3) $1\pmb{a} = \pmb{a}$.
(4) $\lambda (\pmb{a} + \pmb{b}) = (\lambda \pmb{a}) + (\lambda \pmb{b})$ (Second Distributive Law).

For rational multipliers $λ$, $μ$, the laws ($\beta$) follow from addition if multiplication by such factors be defined from addition. In accordance with the principle of continuity we shall also make use of them for arbitrary real numbers, but we purposely formulate them as separate axioms because they cannot be derived in general form from the axioms of addition by logical reasoning alone. By refraining from reducing multiplication to addition we are enabled through these axioms to banish continuity, which is so difficult to fix precisely, from the logical structure of geometry. The law ($\beta$) 4 comprises the theorems of similarity.

($\gamma$) The "Axiom of Dimensionality", which occupies the next place in the system, will be formulated later.

http://store.doverpublications.com/0486602672.html

But, as I say, I need to avoid this topic for now. I appreciate the feedback. It was helpful.

Last edited: Oct 8, 2012