Proving the Orthogonal Projection Formula for Vector Subspaces

  • Context: Undergrad
  • Thread starter: member 428835
  • Tags: Orthogonal Projections
SUMMARY

The orthogonal projection of a vector ##v## onto the subspace spanned by unit vectors ##e_1,...,e_n## is defined as $$\sum_j\langle e_j,v \rangle e_j$$. This formula can be proven using the properties of projectors, specifically that ##P^2 = P## and ##P = P^*##. By expressing the vector ##\mathbf v## as a linear combination of the orthonormal basis vectors and applying the inner product, one can derive the coefficients that confirm the projection formula. The discussion emphasizes the importance of orthogonality in establishing linear independence within the context of vector spaces.

PREREQUISITES
  • Understanding of vector spaces and subspaces
  • Familiarity with inner product spaces
  • Knowledge of orthonormal bases and their properties
  • Basic linear algebra concepts, including linear independence
NEXT STEPS
  • Study the properties of projectors in linear algebra
  • Learn about the Gram-Schmidt process for orthonormalization
  • Explore applications of orthogonal projections in machine learning
  • Investigate the implications of linear independence in higher-dimensional spaces
USEFUL FOR

Mathematicians, physics students, and anyone studying linear algebra, particularly those interested in vector projections and their applications in various fields.

member 428835
Hi PF!

I've been reading and it appears that the orthogonal projection of a vector ##v## to the subspace spanned by ##e_1,...,e_n## is given by $$\sum_j\langle e_j,v \rangle e_j$$ (##e_j## are unit vectors, so ignore the usual inner product denominator for simplicity) but there is never a proof in the texts. It's always given by definition or I see "trivial" next to the proof. Surely this is something we prove, right?
 
StoneTemplePython
It really depends on what you're trying to do here, as there are several flavors of this. Suppose we have vectors in ##\mathbb C^m## with ##m \geq n## and use the standard inner product.

You have some vector ##\mathbf u## and you project it onto the subspace spanned by these ##n## mutually orthonormal vectors. You can do this explicitly with ##\mathbf v = P\mathbf u##, where ##P## is a projector -- i.e. ##P^2 = P## and ##P = P^*##.
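As a quick numerical sketch of the projector properties above (the matrix sizes and random vectors here are illustrative, not from the thread), one can build ##P = EE^*## from a matrix ##E## whose columns are orthonormal and check ##P^2 = P## and ##P = P^*## directly:

```python
import numpy as np

# Sketch: build a projector onto the span of n orthonormal vectors in C^m.
rng = np.random.default_rng(0)
m, n = 5, 2
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
E, _ = np.linalg.qr(A)          # QR gives orthonormal columns spanning col(A)

P = E @ E.conj().T              # projector onto span(e_1, ..., e_n)

# Verify the defining properties: P^2 = P (idempotent) and P = P* (Hermitian)
assert np.allclose(P @ P, P)
assert np.allclose(P, P.conj().T)
```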

So we have a generating set for this subspace and use it to write ##\mathbf v## as a linear combination of these vectors. (Note: ignoring the zero vector, if you have an inner product, orthogonality implies linear independence -- why?)

so
##\mathbf v = \sum_{j=1}^n \alpha_j \mathbf e_j##

Multiply on the left by ##\mathbf e_k^*## (conjugate transpose) to see

##\mathbf e_k^*\mathbf v = \mathbf e_k^*\sum_{j=1}^n \alpha_j \mathbf e_j = \sum_{j=1}^n \alpha_j \mathbf e_k^*\mathbf e_j = \alpha_k \mathbf e_k^*\mathbf e_k = \alpha_k##

since orthonormality gives ##\mathbf e_k^*\mathbf e_j = 0## for ##j \neq k## and ##\mathbf e_k^*\mathbf e_k = 1##.

Now rewrite this in general inner product notation -- ##\alpha_k = \langle \mathbf e_k, \mathbf v \rangle## -- and you get the same result.
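The derivation above can be confirmed numerically: the coefficients ##\alpha_k = \mathbf e_k^*\mathbf u## should reassemble exactly the projected vector ##P\mathbf u = \sum_j \langle \mathbf e_j, \mathbf u\rangle \mathbf e_j##. A minimal sketch (dimensions and random data are illustrative assumptions):

```python
import numpy as np

# Sketch: check that the projection coefficients are the inner products
# <e_k, u>, i.e. that P u = sum_j <e_j, u> e_j.
rng = np.random.default_rng(1)
m, n = 6, 3
Q, _ = np.linalg.qr(rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n)))
u = rng.standard_normal(m) + 1j * rng.standard_normal(m)

P = Q @ Q.conj().T               # projector onto the span of Q's columns
v = P @ u                        # the orthogonal projection of u

alphas = Q.conj().T @ u          # alpha_k = e_k^* u  (the inner products)
assert np.allclose(Q @ alphas, v)  # sum_j alpha_j e_j reproduces P u
```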
 
StoneTemplePython said:
(Note: ignoring the zero vector: if you have an inner product, orthogonality implies linear independence, why?)
This is simple to show: take ##c_1v_1+...+c_nv_n = 0## and take the inner product with some ##v_j##. Orthogonality kills every term except ##c_j\langle v_j, v_j\rangle##, and since ##v_j \neq 0## this forces ##c_j = 0## for each ##j##, hence linear independence.
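The same fact can be seen through the Gram matrix: for nonzero mutually orthogonal vectors it is diagonal with positive diagonal, hence invertible, so the only solution of ##c_1v_1+...+c_nv_n = 0## is ##c = 0##. A small illustrative check (the vectors below are arbitrary examples):

```python
import numpy as np

# Sketch: columns of V are orthogonal and nonzero; the Gram matrix
# G[i, j] = <v_i, v_j> is then diagonal with positive diagonal, hence
# invertible, forcing every coefficient c_j in V c = 0 to vanish.
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])        # columns orthogonal, none zero
G = V.conj().T @ V
assert np.allclose(G, np.diag(np.diag(G)))   # orthogonality: G is diagonal
assert np.all(np.diag(G).real > 0)           # nonzero vectors: <v_j, v_j> > 0
```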

And thanks! Good to see it.
 
