Factoring a wedge product into vectors? (geometric algebra)

SUMMARY

The discussion focuses on factoring a wedge product (blade) into a set of vectors using geometric algebra. A specific example is provided with a grade 2 blade, demonstrating the projection approach to find a basis for the hypervolume spanned by the wedge product. The algorithm for factoring a blade, as outlined in "Geometric Algebra for Computer Science" by Dorst et al., involves steps such as determining the norm, representing the blade in a basis, and using projection techniques to derive the factors. The conversation also highlights the comparison between projection products in geometric algebra and traditional matrix algebra, noting the numerical robustness of Singular Value Decomposition (SVD) for orthonormalization.

PREREQUISITES
  • Understanding of geometric algebra concepts, particularly wedge products and blades.
  • Familiarity with projection operators in linear algebra.
  • Knowledge of algorithms for orthonormalization, including Gram-Schmidt and SVD.
  • Basic understanding of norms and vector spaces.
NEXT STEPS
  • Study the algorithm for factoring blades as detailed in "Geometric Algebra for Computer Science" by Dorst et al.
  • Learn about Singular Value Decomposition (SVD) and its applications in orthonormalization.
  • Explore the differences between Euclidean and non-Euclidean metrics in geometric algebra.
  • Investigate the relationship between projection matrices in traditional linear algebra and geometric algebra.
USEFUL FOR

This discussion is beneficial for mathematicians, computer scientists, and engineers working with geometric algebra, particularly those involved in graphics applications and numerical methods for orthonormalization.

Peeter
Anybody know a good approach to factor a wedge product (blade) into a set of vectors? Loosely, I'd describe the problem as finding a basis for the hypervolume that the wedge product "spans".

As an example to illustrate the question, taking a grade 2 blade, suppose one had something like:

$$A = (e_1 + e_2 + e_3) \wedge ( e_1 + 2 e_4 ) = 2 e_1 \wedge e_4 + e_2 \wedge e_1 + 2 e_2 \wedge e_4 + e_3 \wedge e_1 + 2 e_3 \wedge e_4$$

The problem is to work backwards to find some pair of vectors that would generate the same wedge product.

One approach that I believe would work, but doesn't seem too efficient, is to utilize the wedge product as a projector onto the space:

$$\text{Proj}_A(x) = \frac{1}{A} (A \cdot x) = A^\dagger \frac{1}{A^\dagger A}(A \cdot x)$$

(the dot above is the geometric algebra dot product.)

If you applied this to each of the basis vectors of the space, you would essentially have computed the coordinate columns of the matrix of the projection operator; these columns span the blade's subspace, and are thus factors of the original wedge product. A column reduction down to the rank would then provide the desired factorization (up to an overall scale factor).
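To make that concrete for the grade-2 example above, here is a minimal numpy sketch (Euclidean metric assumed; the antisymmetric-matrix encoding of the bivector and of the contraction is my own bookkeeping, not anything from the thread):

```python
import numpy as np

# The blade from the example: A = (e1 + e2 + e3) ^ (e1 + 2 e4).
# Store its coordinates in the antisymmetric matrix B, where
# B[i, j] is the coefficient of e_i ^ e_j.  The contraction x . A
# then corresponds to B.T @ x, and since B is antisymmetric its
# columns span the same plane as a and b.
a = np.array([1.0, 1.0, 1.0, 0.0])   # e1 + e2 + e3
b = np.array([1.0, 0.0, 0.0, 2.0])   # e1 + 2 e4
B = np.outer(a, b) - np.outer(b, a)

# "Apply the projection to the basis vectors": each column B @ e_i
# lies in the plane of the blade.  Column-reduce (here by
# Gram-Schmidt) to extract independent, normalized factors.
factors = []
for i in range(4):
    c = B[:, i].copy()
    for f in factors:
        c -= (c @ f) * f
    n = np.linalg.norm(c)
    if n > 1e-12:
        factors.append(c / n)
f1, f2 = factors

# The factors reproduce the blade up to an overall scale.
W = np.outer(f1, f2) - np.outer(f2, f1)
idx = np.unravel_index(np.argmax(np.abs(W)), W.shape)
scale = B[idx] / W[idx]
assert np.allclose(scale * W, B)
```

Any two independent columns would do; Gram-Schmidt just yields orthonormal ones, so the original pair of vectors is recovered only up to a change of basis within the plane, as expected.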

As an aside, I found it pretty interesting how similar the projection product is to the projection matrix from traditional matrix algebra, where, provided the matrix A has full column rank, a projection onto its columns can be written:

$$\text{Proj}_A(x) = \left( A \frac{1}{A^\text{T} A} A^\text{T} \right) x$$

A further comparison: if, in the above, the columns are orthonormal, this reduces to:

$$\text{Proj}_A(x) = \left( A A^\text{T} \right) x$$

Similarly, for a unit magnitude blade, the inverse term drops out, leaving:

$$\text{Proj}_A(x) = A \left(A^\dagger \cdot x\right)$$
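A quick numerical check of the matrix-algebra side of this comparison (the matrix shapes and names here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))    # a tall matrix, full column rank
x = rng.standard_normal(5)

# General form: projection onto the column space of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T
p = P @ x

# Orthonormalize the columns (thin QR); the inverse factor drops
# out and the projector collapses to Q Q^T.
Q, _ = np.linalg.qr(A)
assert np.allclose(Q @ Q.T @ x, p)

# Sanity check: projecting twice changes nothing.
assert np.allclose(P @ p, p)
```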
 
Algorithm for factoring a blade (from Dorst et al., GA4CS p. 535):
1. Input: a non-zero blade $B$ of grade $k$.
2. Determine the norm $s = \|B\|$.
3. Represent the blade in a basis and find the basis blade $E$ in this representation with the largest coordinate; you now have a list of $k$ basis vectors $e_i$ that span $E$.
4. Scale the input blade to produce $B_c = B / s$.
5. For all but one of the basis vectors $e_i$ of $E$:
   a) Project $e_i$ onto $B_c$: $f_i = (e_i \cdot B_c) B_c^{-1}$.
   b) Normalize $f_i$ and add it to the list of factors.
   c) Update $B_c \leftarrow f_i^{-1} \cdot B_c$.
6. Obtain the last factor $f_k = B_c$ and normalize it.
7. Output: the list of factors $f_i$ and the scale $s$.
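For the grade-2 case, those steps can be sketched numerically. This is a sketch under my own matrix encoding of a bivector ($x \cdot B$ corresponds to `B.T @ x`, Euclidean metric assumed); a general-grade version would need a real multivector library:

```python
import numpy as np

def factor_bivector(B):
    """Factor a simple grade-2 blade, given as its antisymmetric
    coordinate matrix B[i, j] = coefficient of e_i ^ e_j, into a
    scale s and orthonormal factors f1, f2 with s * (f1 ^ f2) = B.
    Contraction bookkeeping: x . B corresponds to B.T @ x."""
    # Step 2: the norm, ||B||^2 = sum over i < j of B_ij^2.
    s = np.sqrt(0.5 * np.sum(B * B))
    # Step 4: scale the input down to a unit blade.
    Bc = B / s
    # Step 3: pick the basis blade e_i ^ e_j with the largest
    # coordinate.
    i, _j = np.unravel_index(np.argmax(np.abs(Bc)), Bc.shape)
    # Steps 5a/5b: e_i . Bc already lies in the plane of the blade
    # (it is the projection turned a quarter turn within the
    # plane), so after normalization it serves as the first factor.
    f1 = Bc[i].copy()          # row i of Bc is e_i . Bc
    f1 /= np.linalg.norm(f1)
    # Steps 5c and 6: contracting the unit factor out of the unit
    # blade (f^{-1} = f for a unit vector) leaves the last factor,
    # already normalized.
    f2 = Bc.T @ f1             # f1 . Bc
    return s, f1, f2

# The worked example above: (e1 + e2 + e3) ^ (e1 + 2 e4).
a = np.array([1.0, 1.0, 1.0, 0.0])
b = np.array([1.0, 0.0, 0.0, 2.0])
B = np.outer(a, b) - np.outer(b, a)
s, f1, f2 = factor_bivector(B)
assert np.allclose(s * (np.outer(f1, f2) - np.outer(f2, f1)), B)
```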

Yes, it's that book again!
 
Actually, I have the book now ;) This question occurred to me as soon as I started reading the meet/join chapter, so I jumped the gun posting it. Later I found this algorithm in the appendix, and should have replied to my own post saying so. Sorry you spent time typing it up!

Note that the central idea in that algorithm is the same: use the projection. But they use a Gram-Schmidt-like procedure, with repeated reduction in grade, to orthonormalize, instead of doing it en masse with something like SVD (which would also give you an orthonormal basis for the dual of the blade, and wouldn't have the numerical problems of Gram-Schmidt). Since they only go up to 5 dimensions for their graphics applications, the numerical issues probably don't hit that hard. I'm not sure how you'd apply SVD to orthonormalize with respect to a non-Euclidean metric either.
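For what it's worth, here is what the SVD route might look like for a bivector in a Euclidean metric (again using the antisymmetric coordinate-matrix encoding, which is my own bookkeeping): the matrix of a simple bivector has rank 2 with a pair of equal singular values, the top two left singular vectors give orthonormal factors in one shot, and the remaining singular vectors give an orthonormal basis for the orthogonal complement.

```python
import numpy as np

# Same example blade: (e1 + e2 + e3) ^ (e1 + 2 e4), encoded as the
# antisymmetric matrix of e_i ^ e_j coefficients.
a = np.array([1.0, 1.0, 1.0, 0.0])
b = np.array([1.0, 0.0, 0.0, 2.0])
B = np.outer(a, b) - np.outer(b, a)

U, sigma, Vt = np.linalg.svd(B)
assert np.isclose(sigma[0], sigma[1])    # paired singular values
f1, f2 = U[:, 0], U[:, 1]                # orthonormal basis of the plane
# U[:, 2:] is an orthonormal basis for the orthogonal complement.

# Recover scale and sign by matching one coordinate of the wedge.
W = np.outer(f1, f2) - np.outer(f2, f1)
idx = np.unravel_index(np.argmax(np.abs(W)), W.shape)
scale = B[idx] / W[idx]
assert np.allclose(scale * W, B)
assert np.isclose(abs(scale), sigma[0])  # scale is the blade norm
```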

ps. Yes, that is a good book! By far the best introduction I've seen. Thanks for the recommendation. The notational differences compared to the Hestenes and GAFP books take some getting used to though.
 
Peeter said:
I'm not sure how you'd apply SVD to orthonormalize with respect to a non-Euclidean metric either.

ps. Yes, that is a good book! By far the best introduction I've seen. Thanks for the recommendation. The notational differences compared to the Hestenes and GAFP books take some getting used to though.

You would probably always use a Euclidean metric, because the outer product factorization should not depend on the metric. I vaguely recall seeing this somewhere else in the book (probably in dealing with the meet and the join). Glad you like the book; I'm sure it will be a valuable asset.
 
His algorithm included a normalization, and that can use an alternate metric (i.e., they use a Minkowski metric for their conformal modelling).
 
