Moore-Penrose inverse expressed as a geometric product?


Discussion Overview

The discussion revolves around the expression of the Moore-Penrose inverse using geometric products in the context of geometric algebra and linear transformations. Participants explore the relationship between geometric algebra concepts and traditional matrix inversion methods, particularly focusing on the implications for projections and generalized inverses.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • One participant suggests that the inverse of a linear transformation can be expressed as a geometric product, using the adjoint and pseudoscalar multiplication, rather than traditional matrix inversion.
  • Another participant notes that the geometric product vector inverse coincides with the Moore-Penrose generalized inverse of an N×1 matrix, emphasizing its connection to projection onto the matrix image.
  • A participant expresses curiosity about further exploring this connection and wonders if others have previously addressed it.
  • One participant proposes using singular value decomposition (SVD) of a pseudoinverse to gain intuition about the geometric aspects of the inverse, mentioning the benefits of SVD in providing insights into rank and orthonormal bases.
  • A later reply reflects on the power of SVD, highlighting its ability to encapsulate multiple linear algebra concepts, including rank and pseudoinverses.

Areas of Agreement / Disagreement

Participants express interest in the connections between geometric algebra and linear algebra, but there is no consensus on whether the Moore-Penrose inverse can be definitively expressed using geometric products, and the discussion remains exploratory.

Contextual Notes

Participants acknowledge that the exploration of these concepts may depend on specific definitions and assumptions related to geometric algebra and linear transformations, and some steps in the reasoning may be unresolved.

Who May Find This Useful

This discussion may be of interest to those studying geometric algebra, linear algebra, or anyone exploring the connections between different mathematical frameworks in the context of linear transformations and inverses.

Peeter
I'm getting far enough into my geometric algebra books now that I'm at linear transformations, including the result showing how the inverse of a linear transformation can be expressed directly as a geometric product, using the adjoint and pseudoscalar multiplication, instead of using matrix inversion.

Really, I think it amounts to the same thing, since to calculate the adjoint of the linear transformation you'll have to pick a basis and compute the reciprocal frame vectors to find the components of the adjoint transformation (at least that's the way "Geometric Algebra for Physicists" outlines it).

Anyhow, the idea is interesting, and makes me wonder if it can be carried further. In particular, I observe that the geometric product vector inverse is consistent with the Moore-Penrose "generalized inverse" of an N×1 matrix. Since that inverse is intrinsically related to projection onto the matrix image, and we can express subspace projection so naturally with the geometric product (i.e. dot product of blades), I'd guess that the Moore-Penrose inverse could also be expressed using the geometric product, and that this may highlight some interesting structural features of the generalized inverse that are hard to see in matrix form.
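The N×1 observation above is easy to check numerically: the Moore-Penrose pseudoinverse of a nonzero column vector v is vᵀ/(v·v), which is exactly the geometric-algebra vector inverse v⁻¹ = v/v² written in matrix form, and v v⁺ is the projection onto span(v). A minimal numpy sketch (the vector values are arbitrary):

```python
import numpy as np

# An arbitrary nonzero vector, treated as an N x 1 matrix.
v = np.array([[3.0], [4.0], [12.0]])

# Moore-Penrose pseudoinverse computed by numpy; shape (1, N).
pinv = np.linalg.pinv(v)

# Geometric-algebra vector inverse v / v^2, transposed to match shapes.
ga_inverse = v.T / float(v.T @ v)

assert np.allclose(pinv, ga_inverse)

# v @ pinv is the orthogonal projection onto span(v), i.e. the matrix image.
P = v @ pinv
assert np.allclose(P @ v, v)   # P fixes the image of v
assert np.allclose(P @ P, P)   # idempotent, as a projection should be
```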

Since I'm a newbie to the subject, I'd also guess that somebody else has already done this (it's not in my books, though). Any pointers to where to look? (If not, it would probably be fun to try to calculate.)
 
Have you tried the singular value decomposition of a pseudoinverse? Then you will have rotation, scaling, and rotation again... that will give you the intuition.
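The "rotation, scaling, rotation again" picture can be sketched directly: with A = UΣVᵀ, the pseudoinverse is A⁺ = VΣ⁺Uᵀ, where Σ⁺ inverts only the nonzero singular values. A small numpy sketch (the matrix is arbitrary and deliberately rank-deficient):

```python
import numpy as np

# Rank-deficient 3x2 matrix (second column is twice the first).
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Invert only singular values above a tolerance; zero the rest.
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_inv = np.where(s > tol, 1.0 / np.where(s > tol, s, 1.0), 0.0)

# A^+ = V Sigma^+ U^T : rotate, rescale the nonzero directions, rotate back.
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

assert np.allclose(A_pinv, np.linalg.pinv(A))
assert np.allclose(A @ A_pinv @ A, A)   # first Moore-Penrose condition
```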
 
Thanks for the hint. I'll look at that (it's mentioned later in the linear algebra chapter, but I hadn't worked through an example or the details and didn't make the connection).
 
Took a look at some info on SVD. Wow, that's a pretty powerful construct. It gives you rank, an orthonormal basis for both the kernel and the image, and the inverse or pseudoinverse... all in one shot.
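The "all in one shot" point is easy to demonstrate from a single decomposition: the number of nonzero singular values is the rank, the corresponding columns of U span the image, and the remaining rows of Vᵀ span the kernel. A sketch with an arbitrary small matrix chosen to have a nontrivial kernel:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])   # 2x3, rank 2, one-dimensional kernel

U, s, Vt = np.linalg.svd(A)       # full SVD: U is 2x2, Vt is 3x3
tol = max(A.shape) * np.finfo(float).eps * s.max()
rank = int(np.sum(s > tol))

image_basis = U[:, :rank]         # orthonormal basis for the image of A
kernel_basis = Vt[rank:].T        # orthonormal basis for the kernel of A

assert rank == 2
assert np.allclose(A @ kernel_basis, 0.0)                      # maps to zero
assert np.allclose(image_basis.T @ image_basis, np.eye(rank))  # orthonormal
```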
 
