Dot and Cross Product from Rotation Matrix

In summary, the conversation discusses the discovery of a product called the geometric product, which captures aspects of both the dot product and the cross product. This product comes from an extension of vector algebra called geometric algebra and makes it easy to work with bivectors (oriented planes) in any number of dimensions. The conversation also explores the connection between geometric algebra and complex numbers and quaternions, and how it allows rotations to be written with "complex" exponentials.
  • #1
Hetware
I'm just learning this LaTeX formatting, so it's not ideal.

I was trying to explore the geometrical significance of the cross product when I happened upon an interesting observation. I've seen things like this before, but never had time to really examine them.

I define two vectors:

[itex]\pmb{A}=A^x\pmb{\hat{i}}+A^y\pmb{\hat{j}}=A\left(\cos (\alpha )\pmb{\hat{i}}+\sin (\alpha )\pmb{\hat{j}}\right)[/itex]

[itex]\pmb{B}=B^x\pmb{\hat{i}}+B^y\pmb{\hat{j}}=B\left(\cos (\beta )\pmb{\hat{i}}+\sin (\beta )\pmb{\hat{j}}\right)[/itex]

Express a new basis with the x-axis aligned with the first vector:

[itex]\pmb{\hat{i}}=\cos (\alpha )\pmb{\hat{i}'}-\sin (\alpha )\pmb{\hat{j}'}[/itex]

[itex]\pmb{\hat{j}}=\sin (\alpha )\pmb{\hat{i}'}+\cos (\alpha )\pmb{\hat{j}'}[/itex]

Write the second vector in terms of the new basis and fiddle with it some:

[itex]\pmb{B}=B^x\left(\cos (\alpha )\pmb{\hat{i}'}-\sin (\alpha )\pmb{\hat{j}'}\right)+B^y\left(\sin (\alpha )\pmb{\hat{i}'}+\cos (\alpha )\pmb{\hat{j}'}\right)[/itex]

[itex]=B\left(\cos (\beta )\left(\cos (\alpha )\pmb{\hat{i}'}-\sin (\alpha )\pmb{\hat{j}'}\right)+\sin (\beta )\left(\sin (\alpha )\pmb{\hat{i}'}+\cos (\alpha )\pmb{\hat{j}'}\right)\right)[/itex]


[itex]=B\left(\cos (\beta )\cos (\alpha )\pmb{\hat{i}'}-\cos (\beta )\sin (\alpha )\pmb{\hat{j}'}+\sin (\beta )\sin (\alpha )\pmb{\hat{i}'}+\sin (\beta )\cos (\alpha )\pmb{\hat{j}'}\right)[/itex]


[itex]=B\left[(\cos (\beta )\cos (\alpha )+\sin (\beta )\sin (\alpha ))\pmb{\hat{i}'}+(-\cos (\beta )\sin (\alpha )+\sin (\beta )\cos (\alpha ))\pmb{\hat{j}'}\right][/itex]


[itex]=B\left[\cos (\beta -\alpha )\pmb{\hat{i}'}+\sin (\beta -\alpha )\pmb{\hat{j}'}\right][/itex]

[itex]=B^{x'}\pmb{\hat{i}'}+B^{y'}\pmb{\hat{j}'}[/itex]

Notice that the [itex]B^{x'}[/itex] component is just the dot product [itex]\pmb{A}\cdot \pmb{B}[/itex] divided by the magnitude of [itex]\pmb{A}[/itex], and the [itex]B^{y'}[/itex] component is the magnitude of the cross product divided by the magnitude of [itex]\pmb{A}[/itex].

Using my own definition of a "complete product", the first component is the dot product and the second is the cross product:

[itex]\pmb{AB}=AB\left[(\cos (\beta )\cos (\alpha )+\sin (\beta )\sin (\alpha ))\pmb{\hat{i}'}+(-\cos (\beta )\sin (\alpha )+\sin (\beta )\cos (\alpha ))\pmb{\hat{j}'}\right][/itex]

[itex]\pmb{AB}=\left(A^xB^x+A^yB^y\right)\pmb{\hat{i}'}+\left(A^xB^y-A^yB^x\right)\pmb{\hat{j}'}[/itex]
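As a quick numerical sanity check (my own sketch, not part of the original post), the two components of this "complete product" can be compared against [itex]AB\cos (\beta -\alpha )[/itex] and [itex]AB\sin (\beta -\alpha )[/itex]:

```python
import math

def complete_product(A, B):
    # Components of the "complete product" for two vectors in the plane
    ax, ay = A
    bx, by = B
    dot = ax * bx + ay * by     # i' component: the dot product
    cross = ax * by - ay * bx   # j' component: the signed cross-product magnitude
    return dot, cross

alpha, beta = 0.4, 1.1
A_mag, B_mag = 2.0, 3.0
A = (A_mag * math.cos(alpha), A_mag * math.sin(alpha))
B = (B_mag * math.cos(beta), B_mag * math.sin(beta))

dot, cross = complete_product(A, B)
# Compare against AB cos(beta - alpha) and AB sin(beta - alpha)
assert math.isclose(dot, A_mag * B_mag * math.cos(beta - alpha))
assert math.isclose(cross, A_mag * B_mag * math.sin(beta - alpha))
```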

I believe this works in 3 dimensions. Has anybody seen a development of this line of reasoning regarding vector products?
 
  • #2
You can hat vectors with \hat--e.g. \hat u -> [itex]\hat u[/itex].

You've discovered a product called the geometric product--essentially, at least. The geometric product comes from an extension of vector algebra called geometric algebra, but it is easy enough to explain on its own. Given an orthonormal basis [itex]i, j[/itex], the geometric product is defined like so:

[tex]ii = jj = 1, \quad ij = -ji[/tex]

That's it. This captures both the aspects of the dot product (symmetry when the vectors are the same) and the cross product (antisymmetry when the vectors are different).

Now, you might be asking, what is [itex]ij[/itex] then? We call it a bivector, and we say that it represents a plane, just as a vector represents a line. The associativity of the geometric product makes it easy to work with such things.

In geometric algebra, we also work with a "wedge" product instead of the cross product because the cross product doesn't exist in dimensions other than 3 or 7. The geometric product can then be seen as the sum of the dot and wedge products.

[itex]ab = a \cdot b + a \wedge b[/itex] for vectors [itex]a, b[/itex].
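As a concrete sketch of these rules (my own representation, not code from the thread): a 2d multivector can be stored as a 4-tuple (scalar, [itex]e_x[/itex] coefficient, [itex]e_y[/itex] coefficient, [itex]e_x e_y[/itex] coefficient), with the product table derived from [itex]ii = jj = 1[/itex], [itex]ij = -ji[/itex]:

```python
def gp(a, b):
    # Geometric product of two 2d multivectors (scalar, e_x, e_y, e_x e_y)
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return (
        s1*s2 + x1*x2 + y1*y2 - b1*b2,   # scalar (the bivector squares to -1)
        s1*x2 + x1*s2 - y1*b2 + b1*y2,   # e_x
        s1*y2 + y1*s2 + x1*b2 - b1*x2,   # e_y
        s1*b2 + b1*s2 + x1*y2 - y1*x2,   # e_x e_y (bivector)
    )

i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)
assert gp(i, i) == (1.0, 0.0, 0.0, 0.0)    # ii = 1
assert gp(i, j) == (0.0, 0.0, 0.0, 1.0)    # ij is the unit bivector
assert gp(j, i) == (0.0, 0.0, 0.0, -1.0)   # ji = -ij

# For pure vectors a, b: ab = a . b (scalar part) + a ^ b (bivector part)
a = (0.0, 2.0, 1.0, 0.0)
b = (0.0, -1.0, 3.0, 0.0)
s, _, _, bv = gp(a, b)
assert (s, bv) == (2*-1 + 1*3, 2*3 - 1*-1)
```

The product is associative, which is what makes chains of vectors and bivectors easy to manipulate.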

What you wrote would be written in terms of geometric algebra as the following:

[tex]B = B A A^{-1} = (B \cdot A) A^{-1} + (B \wedge A) A^{-1}[/tex]

Geometric algebra makes it reasonable to define the inverse of a vector. [itex]A^{-1} = A/(A\cdot A) = A/A^2[/itex], for instance. By the associativity of the geometric product, we can always multiply a vector by [itex]A A^{-1} = 1[/itex] in Euclidean space and then group by associativity to do as you have done. This is valid in any number of dimensions, too. Two vectors always define a plane, and vectors in a plane can always be decomposed into sines and cosines with respect to some reference direction.
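Here is a small numeric check of that decomposition (my own sketch, assuming the formulas above): [itex]A^{-1} = A/(A\cdot A)[/itex], and the bivector-times-vector product follows from [itex](e_x e_y)e_x = -e_y[/itex], [itex](e_x e_y)e_y = e_x[/itex].

```python
import math

A = (2.0, 1.0)
B = (-1.0, 3.0)

A2 = A[0]**2 + A[1]**2          # A . A
dot = B[0]*A[0] + B[1]*A[1]     # B . A (scalar)
wedge = B[0]*A[1] - B[1]*A[0]   # B ^ A (bivector coefficient)

# (B . A) A^{-1}: a scalar times the vector A / A^2
part1 = (dot * A[0] / A2, dot * A[1] / A2)
# (B ^ A) A^{-1}: the unit bivector acting on A rotates it by -90 degrees
part2 = (wedge * A[1] / A2, -wedge * A[0] / A2)

recovered = (part1[0] + part2[0], part1[1] + part2[1])
assert math.isclose(recovered[0], B[0])   # B = (B.A)A^{-1} + (B^A)A^{-1}
assert math.isclose(recovered[1], B[1])
```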

Geometric algebra gives great insights into ideas of vector algebra that usually have to be presented without proof. It's a difficult subject, though, and not something I recommend for the faint of heart. Even without geometric algebra, rest assured you've made a valid observation.
 
  • #3
I'm way done thinking for the night, but has this anything to do with quaternions? ii=jj=1,ij=−ji looks familiar. Good night.
 
  • #4
With regard to quaternions, you have i^2 = j^2 = k^2 = -1. If you multiply two quaternions whose scalar parts are zero, the vector part of the product is the cross product of the vector parts (and the scalar part is minus their dot product), so the cross product falls out of quaternion multiplication as a special case.
 
  • #5
Indeed, geometric algebra has an intimate connection to complex numbers and quaternions.

Let me use [itex]e_x, e_y, e_z[/itex] as the basis vectors for the rest of this post. It will make it easier for me to draw connections between objects in geometric algebra and the complex number [itex]i[/itex] or the quaternionic imaginaries [itex]i,j,k[/itex] this way.

Consider the 2d plane as a real vector space. In geometric algebra, we can form the object [itex]e_x e_y[/itex]. Observe that [itex](e_x e_y)^2 = e_x e_y e_x e_y = -e_x e_y e_y e_x = -1[/itex]. Thus, this unit bivector acts like an imaginary unit, and it's not uncommon to call this object [itex]i[/itex], just like the complex number counterpart. This is a big thing, for it means that all of complex analysis can be rewritten in terms of the real(-valued) geometric algebra and its associated calculus. For instance, take a function [itex]f(x) = u(x) + i v(x)[/itex] and find its vector derivative:

[tex]\nabla f(x) = \left( e_x \frac{\partial}{\partial x} + e_y \frac{\partial}{\partial y} \right) (u + iv) = e_x \left(\frac{\partial u}{\partial x} - \frac{\partial v}{\partial y} \right) + e_y \left( \frac{\partial u}{\partial y} + \frac{\partial v}{\partial x} \right)[/tex]

This may look familiar--it's the Cauchy-Riemann condition for complex differentiability if we set [itex]\nabla f = 0[/itex]. We also know this is an integrability condition, and this is just the tip of the iceberg as far as connecting complex analysis to vector analysis. A further investigation would show that the Cauchy Integral Theorem is just an extension of Stokes' Theorem, for example.
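As a numerical illustration (my own sketch, not from the post): for the analytic function [itex]f(z) = z^2[/itex], i.e. [itex]u = x^2 - y^2[/itex], [itex]v = 2xy[/itex], both components of the vector derivative above vanish, recovering the Cauchy-Riemann equations.

```python
def u(x, y): return x*x - y*y
def v(x, y): return 2*x*y

def partial(f, x, y, wrt, h=1e-6):
    # Central finite difference of f with respect to x or y
    if wrt == 'x':
        return (f(x + h, y) - f(x - h, y)) / (2*h)
    return (f(x, y + h) - f(x, y - h)) / (2*h)

x0, y0 = 1.3, -0.7
ex_part = partial(u, x0, y0, 'x') - partial(v, x0, y0, 'y')   # du/dx - dv/dy
ey_part = partial(u, x0, y0, 'y') + partial(v, x0, y0, 'x')   # du/dy + dv/dx
assert abs(ex_part) < 1e-6 and abs(ey_part) < 1e-6
```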

But you asked about quaternions, and let me get closer to that. The identification of [itex]i = e_x e_y[/itex] allows us to use "complex" exponentials to do rotations. I submit that a vector [itex]a[/itex] goes to its rotated counterpart [itex]\underline R(a)[/itex] by

[tex]\underline R(a) = \exp(-i \theta/2) a \exp(i \theta/2)[/tex]

where [itex]\exp(-i \theta/2) = \cos (\theta/2) - i \sin(\theta/2)[/itex] is called a rotor. This sandwiching is not strictly necessary in 2d--it can be simplified to [itex]a \exp(i \theta)[/itex] as you'd expect--but it is necessary in 3d. In 3d, there's more than just one bivector which squares to -1. [itex]e_y e_z, e_z e_x[/itex] also act as "imaginary" units. To complete the connection to quaternions, we set

[tex]\begin{align*}
i &\equiv -e_y e_z \\
j &\equiv -e_z e_x \\
k &\equiv -e_x e_y
\end{align*}[/tex]

We can check that these definitions are consistent with the usual ones for the quaternion algebra:

[tex]ij = (-e_y e_z) (-e_z e_x) = e_y e_x = - e_x e_y = k[/tex]

The other definitions of quaternion multiplication check as well. In geometric algebra, we often just work directly with bivectors like [itex]e_z e_x[/itex] without giving them special names ([itex]i, j, k[/itex]). Another advantage of GA is that we can use this "complex" exponential form (that is, using "rotors") with spherical or cylindrical or otherwise arbitrary basis vectors to do rotations. Additionally, the rotor form of performing rotations makes obvious how to extend this formalism to higher dimensions.
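The quaternion sandwich [itex]q v \bar q[/itex] is the quaternion analogue of the rotor formula above. A minimal sketch (my own code, assuming the standard Hamilton product convention):

```python
import math

def qmul(p, q):
    # Hamilton product of quaternions stored as (w, x, y, z)
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def rotate(v, axis, theta):
    # Rotate vector v about a unit axis by theta via q v q*
    half = theta / 2
    s = math.sin(half)
    q = (math.cos(half), s*axis[0], s*axis[1], s*axis[2])
    q_conj = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = qmul(qmul(q, (0.0, *v)), q_conj)
    return (x, y, z)

# Rotating e_x by 90 degrees about e_z should give e_y.
x, y, z = rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2)
assert abs(x) < 1e-12 and abs(y - 1.0) < 1e-12 and abs(z) < 1e-12
```

Note the half-angle in the rotor: the sandwich applies it twice, giving a rotation by the full angle.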
 
  • #6
Wow! This is like going home after years of being away and asking about the girlfriend who was the reason you left, only to find out she's still single and talks about you all the time.

I know this stuff is important. It's used in differential geometry, and is pragmatically valuable in 3D graphics. It also has its own intellectual appeal, but I can't get into it now.

If I recall correctly, the story goes something like: once upon a time there was a high school teacher named Grassmann ... And there was an Irishman on a bridge, or something like that.
 
  • #7
Hetware said:
I'm way done thinking for the night, but has this anything to do with quaternions? ii=jj=1,ij=−ji looks familiar. Good night.
Rhetorical question: Why do you think we use [itex]\hat \imath[/itex], [itex]\hat \jmath[/itex], and [itex]\hat k[/itex] as unit vectors?

Historically, the dot product and cross product as used by physicists on vectors in ℝ³ are much more closely associated with the quaternions than with the concept of vector spaces in general, bivectors, and the wedge product. That close association is why one still sees [itex]\hat \imath[/itex], [itex]\hat \jmath[/itex], and [itex]\hat k[/itex] as the canonical unit vectors. That our 3-dimensional vectors are just one kind of vector, and that the cross product is closely aligned with the wedge product: those were after-the-fact clean-ups.

One last thing on the cross product: It's pretty much unique to ℝ3. The wedge product of two vectors is a bivector rather than a vector. The only ℝn where one can pretend that a bivector is a vector in that space is ℝ3. In any other dimension, a bivector has n*(n-1)/2 independent elements, and n*(n-1)/2=n only for n=3. Another generalization of the cross product is to use the matrix form of the cross product to define the product of n-1 vectors in ℝn. For example, the product of 3 vectors in ℝ4, 4 vectors in ℝ5. Yet another generalization is to go back to how the cross product was originally defined, which was in terms of the quaternion product. How about octonions, sedenions, and even larger generalizations of the complex numbers? With the octonions, there is a seven dimensional cross product, kind of. It's plagued with problems and isn't particularly useful. A 15 dimensional cross product based on the sedenion product just doesn't work. So there's a nice three dimensional cross product, a sorta/kinda seven dimensional cross product, and that's it.
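The dimension-count argument above is easy to verify directly (a trivial check of my own):

```python
# A bivector in R^n has n(n-1)/2 independent components; that count
# equals n (so bivectors can masquerade as vectors) only for n = 3.
matches = [n for n in range(2, 20) if n * (n - 1) // 2 == n]
assert matches == [3]
```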
 
  • #8
D H said:
Rhetorical question: Why do you think we use [itex]\hat \imath[/itex], [itex]\hat \jmath[/itex], and [itex]\hat k[/itex] as unit vectors?

I use them because they are common in physics books, form a basis, and make it easier for me to do 3D graphics. I actually use:

U0={0,0,0} (The unit null vector - yes that's a joke),
U1={1,0,0},
U2={0,2,0},
U3={0,0,3}.

D H said:
Historically, the dot product and cross product as used by physicists on vectors in ℝ³ are much more closely associated with the quaternions than with the concept of vector spaces in general, bivectors, and the wedge product. That close association is why one still sees [itex]\hat \imath[/itex], [itex]\hat \jmath[/itex], and [itex]\hat k[/itex] as the canonical unit vectors. That our 3-dimensional vectors are just one kind of vector, and that the cross product is closely aligned with the wedge product: those were after-the-fact clean-ups.

I seem to recall some discussion along those lines. I believe it was Gibbs who gave us vectors as we know them in physics.

D H said:
One last thing on the cross product: It's pretty much unique to ℝ3. The wedge product of two vectors is a bivector rather than a vector. The only ℝn where one can pretend that a bivector is a vector in that space is ℝ3. In any other dimension, a bivector has n*(n-1)/2 independent elements, and n*(n-1)/2=n only for n=3. Another generalization of the cross product is to use the matrix form of the cross product to define the product of n-1 vectors in ℝn. For example, the product of 3 vectors in ℝ4, 4 vectors in ℝ5. Yet another generalization is to go back to how the cross product was originally defined, which was in terms of the quaternion product. How about octonions, sedenions, and even larger generalizations of the complex numbers? With the octonions, there is a seven dimensional cross product, kind of. It's plagued with problems and isn't particularly useful. A 15 dimensional cross product based on the sedenion product just doesn't work. So there's a nice three dimensional cross product, a sorta/kinda seven dimensional cross product, and that's it.

I have read that the components of the curl of a vector field can be written as [itex]\frac{\partial u_i}{\partial x_k}-\frac{\partial u_k}{\partial x_i}[/itex], but I have never gone very far with that. I believe that is true of all cross products: they can be written as antisymmetric second-order tensors. I assume that is very similar to what the wedge product is.
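A numerical check of that claim (my own sketch): the antisymmetric tensor [itex]T_{ik} = \frac{\partial u_i}{\partial x_k}-\frac{\partial u_k}{\partial x_i}[/itex] carries exactly the components of the curl, e.g. [itex](\nabla\times\pmb{u})_z = T_{21}[/itex]. For the field [itex]\pmb{u} = (-y, x, 0)[/itex] the curl is [itex](0, 0, 2)[/itex]:

```python
def u(p):
    # Sample vector field u = (-y, x, 0), a rigid rotation about z
    x, y, z = p
    return (-y, x, 0.0)

def partial(f, p, i, k, h=1e-6):
    # d(u_i)/d(x_k) by a central finite difference
    hi = list(p); hi[k] += h
    lo = list(p); lo[k] -= h
    return (f(hi)[i] - f(lo)[i]) / (2*h)

p = (0.3, -1.2, 0.5)
T = [[partial(u, p, i, k) - partial(u, p, k, i) for k in range(3)]
     for i in range(3)]
curl = (T[2][1], T[0][2], T[1][0])   # (T_32, T_13, T_21)
assert all(abs(c - e) < 1e-6 for c, e in zip(curl, (0.0, 0.0, 2.0)))
```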

This is what I believe to be (one of) the oldest formulation of the axioms of a vector space. It's the first one I learned:

Two vectors [itex]\pmb{a}[/itex] and [itex]\pmb{b}[/itex] uniquely determine a vector [itex]\pmb{a} + \pmb{b}[/itex] as their sum. A number [itex]\lambda[/itex] and a vector [itex]\pmb{a}[/itex] uniquely define a vector [itex]\lambda\pmb{a}[/itex], which is called "[itex]\lambda[/itex] times [itex]\pmb{a}[/itex]" (multiplication). These operations are subject to the following laws:-
([itex]\alpha[/itex]) Addition-
(1) [itex]\pmb{a} + \pmb{b} = \pmb{b} + \pmb{a}[/itex] (Commutative Law).
(2) [itex](\pmb{a} + \pmb{b}) + \pmb{c} = \pmb{a} + (\pmb{b} + \pmb{c})[/itex] (Associative Law).
(3) If [itex]\pmb{a}[/itex] and [itex]\pmb{c}[/itex] are two vectors, then there is one and only one value of [itex]\pmb{x}[/itex] for which the equation [itex]\pmb{a}+\pmb{x}=\pmb{c}[/itex] holds. It is called the difference between [itex]\pmb{c}[/itex] and [itex]\pmb{a}[/itex] and signifies [itex]\pmb{c}-\pmb{a}[/itex] (Possibility of Subtraction).

([itex]\beta[/itex]) Multiplication-
(1) [itex](\lambda + \mu )\pmb{a} = (\lambda \pmb{a}) + (\mu \pmb{a})[/itex] (First Distributive Law).
(2) [itex]\lambda (\mu \pmb{a}) = (\lambda \mu )\pmb{a}[/itex] (Associative Law).
(3) [itex]1\pmb{a} = \pmb{a}[/itex].
(4) [itex]\lambda (\pmb{a} + \pmb{b}) = (\lambda \pmb{a}) + (\lambda \pmb{b})[/itex] (Second Distributive Law).

For rational multipliers [itex]\lambda[/itex], [itex]\mu[/itex], the laws ([itex]\beta[/itex]) follow from addition if multiplication by such factors be defined from addition. In accordance with the principle of continuity we shall also make use of them for arbitrary real numbers, but we purposely formulate them as separate axioms because they cannot be derived in general form from the axioms of addition by logical reasoning alone. By refraining from reducing multiplication to addition we are enabled through these axioms to banish continuity, which is so difficult to fix precisely, from the logical structure of geometry. The law ([itex]\beta[/itex]) 4 comprises the theorems of similarity.

([itex]\gamma[/itex]) The "Axiom of Dimensionality", which occupies the next place in the system, will be formulated later.

http://store.doverpublications.com/0486602672.html

But, as I say, I need to avoid this topic for now. I appreciate the feedback. It was helpful.
 

What is the dot product of a rotation matrix?

The dot product is a mathematical operation that produces a scalar from two vectors. In the context of rotation matrices, it is used to find the projection of one vector onto another, which in turn gives the angle between the two vectors.

How is the dot product related to rotation matrices?

The dot product is used with rotation matrices to determine the cosine of the angle between two vectors, from which the rotation angle can be calculated. It also appears inside matrix multiplication itself: each entry of a product of rotation matrices is the dot product of a row of one matrix with a column of the other.

What is the cross product of a rotation matrix?

The cross product is a mathematical operation that produces a new vector perpendicular to both of the original vectors. In the context of rotation matrices, it can be used to determine the axis of rotation.

How do you use the cross product to find the axis of rotation?

To find the axis of rotation using the cross product, pick two vectors [itex]v[/itex] and [itex]w[/itex] that are not parallel to the axis. The differences [itex]v - Rv[/itex] and [itex]w - Rw[/itex] are both perpendicular to the axis, so their cross product points along the axis of rotation. It can then be normalized to obtain the unit vector representing the axis.

What is the relationship between the dot product and cross product in rotation matrices?

The dot product and cross product are both used in rotation matrices to calculate different aspects of the rotation. The dot product is used to calculate the rotation angle, while the cross product is used to determine the axis of rotation. Both operations are necessary for fully defining a rotation matrix and understanding the resulting rotation in 3D space.
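The answers above can be sketched in a few lines (my own helper names, and a hedged sketch rather than a definitive implementation): the angle comes from the trace of [itex]R[/itex], and the axis from the antisymmetric part of [itex]R[/itex], which encodes [itex]\sin\theta[/itex] times the axis. Note this extraction degenerates when [itex]\theta = 0[/itex] or [itex]\pi[/itex], where the antisymmetric part vanishes.

```python
import math

def angle_axis(R):
    # trace(R) = 1 + 2 cos(theta) for a 3d rotation matrix
    tr = R[0][0] + R[1][1] + R[2][2]
    theta = math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))
    # R - R^T = 2 sin(theta) [a]_x, so these differences are along the axis
    ax = (R[2][1] - R[1][2], R[0][2] - R[2][0], R[1][0] - R[0][1])
    n = math.sqrt(sum(c*c for c in ax))
    return theta, tuple(c / n for c in ax)

# Rotation by 90 degrees about the z axis
Rz = [[0.0, -1.0, 0.0],
      [1.0,  0.0, 0.0],
      [0.0,  0.0, 1.0]]
theta, axis = angle_axis(Rz)
assert abs(theta - math.pi / 2) < 1e-12
assert axis == (0.0, 0.0, 1.0)
```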
