Reconciling differential forms' inner product of wedge with geometric algebra dot

  • #1
My differential forms book (Flanders, the Dover edition) defines an inner product on wedge products of vectors that themselves have a defined inner product, and uses it to define the Hodge dual. That wedge inner product was defined as a determinant of inner products.
I don't actually have that book on me right now, but I believe it was like so:

[tex]
\langle (a_1 \wedge \cdots \wedge a_k), (b_1 \wedge \cdots \wedge b_k) \rangle
= \lvert \langle a_i, b_j \rangle \rvert
[/tex]

This is a slightly awkward definition, since you have to have the wedge product explicitly factored into vectors to calculate, it only works for like grades, and it requires that both elements be simple (i.e. both sides must be blades, not necessarily sums of multivectors of the same grade). In its favour, it doesn't assume a Euclidean metric like GA.
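For concreteness, that determinant-of-inner-products definition is easy to compute numerically. A small sketch (assuming the standard Euclidean inner product; the function name is mine):

```python
import numpy as np

def wedge_inner(A, B):
    """<a_1^...^a_k, b_1^...^b_k> = det of the k x k matrix <a_i, b_j>,
    with the factor vectors given as the rows of A and B."""
    return np.linalg.det(A @ B.T)   # (A @ B.T)[i, j] = <a_i, b_j>

# e1^e2 against itself: the Gram matrix is the 2x2 identity
a = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
print(wedge_inner(a, a))   # → 1.0
```

Swapping two rows of A flips the sign of the determinant, matching the antisymmetry of the wedge product itself.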

Anyways, I was trying to reconcile this with the GA definition. For the GA generalized dot product (the lowest-grade selection of the geometric product), I believe the equivalent explicit expansion in terms of a determinant would be:

[tex]
(a_1 \wedge \cdots \wedge a_k) \cdot (b_1 \wedge \cdots \wedge b_k) =
(-1)^{k(k-1)/2}\lvert a_i \cdot b_j \rvert
[/tex]

I got this by repeated expansion of

[tex]
(a_1 \wedge \cdots \wedge a_k) \cdot (b_1 \wedge \cdots \wedge b_k)
=
a_1 \cdot (a_2 \cdot ( \cdots ( a_k \cdot (b_1 \wedge \cdots \wedge b_k))))
[/tex]

To be honest, I found that I had to make a ``compensating error'' to adjust the sign to what I expected it to be (i.e. to agree with explicit manual expansion for k = 1, 2, 3, 4). I don't see what my error is offhand, but I was wondering if anybody has seen a determinant expansion of the GA dot product of like-grade blades like this in any text to compare with.
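As a sanity check on the sign, here's a rough Python sketch (everything ad hoc, a brute-force check rather than a derivation): it builds a tiny Euclidean Clifford algebra with multivectors as dicts from sorted index tuples (basis blades) to coefficients, and compares the grade-0 part of the geometric product of two k-blades against [tex](-1)^{k(k-1)/2}\lvert a_i \cdot b_j \rvert[/tex]:

```python
import numpy as np

def gp_blades(a, b):
    """Geometric product of two basis blades (sorted index tuples), e_i^2 = +1."""
    res, sign = list(a), 1
    for x in b:
        i = len(res)
        res.append(x)
        while i > 0 and res[i - 1] > res[i]:       # bubble x into sorted place,
            res[i - 1], res[i] = res[i], res[i - 1]  # each swap flips the sign
            sign = -sign
            i -= 1
        if i > 0 and res[i - 1] == res[i]:         # e_x e_x = 1: drop the pair
            del res[i - 1 : i + 1]
    return tuple(res), sign

def gp(A, B):
    """Geometric product of two multivectors (dicts: blade tuple -> coeff)."""
    out = {}
    for ka, va in A.items():
        for kb, vb in B.items():
            k, s = gp_blades(ka, kb)
            out[k] = out.get(k, 0.0) + s * va * vb
    return out

def grade(A, g):
    return {k: v for k, v in A.items() if len(k) == g}

def vec(v):
    return {(i,): float(c) for i, c in enumerate(v)}

def wedge_all(vs):
    """Outer product of a list of vectors, built by grade projection."""
    out, g = {(): 1.0}, 0
    for v in vs:
        out = grade(gp(out, vec(v)), g + 1)
        g += 1
    return out

rng = np.random.default_rng(0)
n, k = 5, 3
A = rng.standard_normal((k, n))
B = rng.standard_normal((k, n))
# GA dot of two k-blades = grade-0 part of their geometric product
lhs = grade(gp(wedge_all(A), wedge_all(B)), 0).get((), 0.0)
rhs = (-1) ** (k * (k - 1) // 2) * np.linalg.det(A @ B.T)
assert np.isclose(lhs, rhs)
```

Random trials like this aren't a proof, but they agree with the conjectured sign for the k values I tried.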
 

Answers and Replies

  • #2
I think this would be more appropriate under "Tensor Analysis and Differential Geometry" and would probably get better answers there.

I'm moving it there.
 
  • #3
All right. I thought of posting the question there, but the wedge product has its own applications independent of the differential forms subject where it is typically found (example: solution of linear systems), and there wasn't really anything "differential" in my question, so I chose the algebra forum ;)
 
  • #4
...was wondering if anybody had seen a determinant expansion of the GA dot product of like grade blades like this in any text to compare with.

In "GA for Computer Science" by Dorst et al., the scalar product is defined
in terms of a determinant:
[tex]
A*B := \left|\begin{array}{cccc}
a_1\cdot b_k & a_1\cdot b_{k-1} & \ldots & a_1\cdot b_1 \\
a_2\cdot b_k & a_2\cdot b_{k-1} & \ldots & a_2\cdot b_1 \\
\vdots & \vdots & \ddots & \vdots \\
a_k\cdot b_k & a_k\cdot b_{k-1} & \ldots & a_k\cdot b_1 \\
\end{array}
\right|.
[/tex]
 
  • #5
Thanks. That's consistent with what I thought it would be (has a reversion built into it that changes the sign relative to the differential forms definition).
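For what it's worth, the reversed column order alone accounts for that sign: reversing k columns takes k(k-1)/2 transpositions, so the Dorst-style determinant and the straight one differ by exactly the [tex](-1)^{k(k-1)/2}[/tex] factor. A throwaway numpy check:

```python
import numpy as np

rng = np.random.default_rng(0)
for k in range(1, 7):
    M = rng.standard_normal((k, k))
    flipped = np.linalg.det(M[:, ::-1])               # columns in reversed order
    straight = (-1) ** (k * (k - 1) // 2) * np.linalg.det(M)
    assert np.isclose(flipped, straight)
```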

This is the second time you refer to that book. Perhaps it's time I buy that one too;)

It's been a worthwhile exercise to go through this calculation, though (I did it slowly and carefully in LaTeX instead of on paper, to avoid my previous sign mixups). I haven't gotten as far as I wanted, but even just considering the dot product of a bivector with a blade, I now see a few additional relationships that complement the vector results that I didn't realize before.

1) Dot product of bivectors:

[tex]
(a_1 \wedge a_2) \cdot (b_1 \wedge b_2)
=
-
\begin{vmatrix}
a_1 \cdot b_1 & a_1 \cdot b_2 \\
a_2 \cdot b_1 & a_2 \cdot b_2 \\
\end{vmatrix}
=
-\lvert{a_i \cdot b_j}\rvert
[/tex]

2) Dot product of bivector with (grade>2) blade :

[tex]
(a_1 \wedge a_2) \cdot (b_1 \wedge b_2 \wedge \cdots \wedge b_k)
=
\sum_{u<v} (-1)^{u+v-3}
(a_1 \wedge a_2) \cdot (b_u \wedge b_v)
(b_1 \wedge \cdots \check{b_u} \cdots \check{b_v} \cdots \wedge b_k)
[/tex]

Proof of this last one (for k = 3) was one of the exercises in New Foundations for Classical Mechanics. I'd previously only been able to do that exercise by expanding both sides into a mess of products of dot products and comparing the LHS and RHS. Looking at the general case is a much better approach.
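Here's a numerical spot check of identity (2) for k = 3, using only the nested contraction rules [tex]a \cdot (u \wedge v) = (a \cdot u) v - (a \cdot v) u[/tex] and its trivector analogue (function names are mine; identity (1) supplies the bivector-bivector dot on the RHS):

```python
import numpy as np

def dot_biv_biv(a1, a2, b1, b2):
    # (a1^a2).(b1^b2) = (a2.b1)(a1.b2) - (a2.b2)(a1.b1) = -det <a_i.b_j>
    return (a2 @ b1) * (a1 @ b2) - (a2 @ b2) * (a1 @ b1)

def lhs(a1, a2, b):
    # (a1^a2).(b1^b2^b3) = a1.(a2.(b1^b2^b3)), expanded down to a vector
    b1, b2, b3 = b
    # a2.(b1^b2^b3) as a list of (coefficient, u, v) bivector terms
    terms = [((a2 @ b1), b2, b3), (-(a2 @ b2), b1, b3), ((a2 @ b3), b1, b2)]
    out = np.zeros_like(b1)
    for c, u, v in terms:
        out += c * ((a1 @ u) * v - (a1 @ v) * u)   # a1.(u^v)
    return out

def rhs(a1, a2, b):
    # sum over u < v of (-1)^(u+v-3) (a1^a2).(b_u^b_v) times the remaining b_w
    b1, b2, b3 = b
    return (dot_biv_biv(a1, a2, b1, b2) * b3
            - dot_biv_biv(a1, a2, b1, b3) * b2
            + dot_biv_biv(a1, a2, b2, b3) * b1)

rng = np.random.default_rng(1)
a1, a2 = rng.standard_normal((2, 5))
b = rng.standard_normal((3, 5))
assert np.allclose(lhs(a1, a2, b), rhs(a1, a2, b))
```

Both sides are grade-1 (vectors), so the comparison is componentwise; random vectors in R^5 exercise the general multilinear case rather than just a basis example.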
 
  • #6
This is the second time you refer to that book. Perhaps it's time I buy that one too;)

Do it! I can recommend the book unreservedly. It leads you by the nose through most of the basics and takes quite a different perspective from Hestenes and the Cambridge group. All three books are a must for someone like you. But don't take my word for it; go to :
http://www.lob.de/cgi-bin/work/framesetneu?flag=new&frame=yes&id=3d5fa0ff2dedd

The GA Bible is "Clifford Algebra to Geometric Calculus" by Hestenes and Sobczyk (1984, reprinted 2002), but it is tough going and quite expensive. GA for Computer Science is almost half the price of the other books, is an excellent introduction, and has very good chapters on versors and conformal geometry.
 
  • #7
Okay, I'm sold. Sounds like this one would have been much better for learning from.

I've had to figure out a lot of stuff on my own using the Hestenes and Cambridge books. I'm learning the subject well that way, but they are definitely not good teaching books in many ways.

I think that a lot of the GA subject would make sense to be introduced as early as high school level:

- I always found the cross product objectionable since there should be a strict [tex]\mathbb{R}^2[/tex] formulation to complement projection.
- Making kids memorize Cramer's rule becomes totally pointless when one can use the wedge product much more naturally to solve linear systems.
- Complex numbers become much more natural when viewed as vectors with a unit vector factored out (no need to present this in terms of bizarre looking multiplication rules).
- and on and on.

BUT the presentation has to be dumbed down a lot and spelled out a lot better!
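To illustrate the Cramer's rule point above (a toy sketch with made-up numbers): wedging both sides of [tex]x a_1 + y a_2 = c[/tex] kills one unknown at a time, and in [tex]\mathbb{R}^2[/tex] the wedge of two vectors has a single component, which is exactly the Cramer determinant:

```python
import numpy as np

def wedge2(u, v):
    # u ^ v in R^2, expressed on the single basis bivector e1^e2: a scalar
    return u[0] * v[1] - u[1] * v[0]

# solve x*a1 + y*a2 = c: wedge with a2 to kill a2, with a1 to kill a1
a1, a2, c = np.array([2.0, 1.0]), np.array([1.0, 3.0]), np.array([4.0, 7.0])
x = wedge2(c, a2) / wedge2(a1, a2)   # x (a1^a2) = c ^ a2
y = wedge2(a1, c) / wedge2(a1, a2)   # y (a1^a2) = a1 ^ c
print(x, y)   # → 1.0 2.0
assert np.allclose(x * a1 + y * a2, c)
```

No rule to memorize: the determinants fall out of the antisymmetry of the wedge, and the same manipulation generalizes to n equations with the n-vector wedge.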
 
