Proving the Orthogonal Projection Formula for Vector Subspaces

The orthogonal projection of a vector v onto a subspace spanned by orthonormal vectors e_1,...,e_n is the linear combination of those vectors weighted by their inner products with v, namely $$\sum_j\langle e_j,v \rangle e_j$$. The discussion emphasizes the need for a proof of this formula, since texts often state it without one. It points out that when a vector u is projected onto a subspace spanned by n mutually orthonormal vectors, the projection can be computed explicitly as Pu, where the projector P satisfies P^2 = P and P = P*. The relationship between orthogonality and linear independence is also noted: for nonzero vectors, orthogonality with respect to an inner product implies linear independence. Overall, the conversation seeks to clarify the mathematical foundations of the orthogonal projection formula.
member 428835
Hi PF!

I've been reading, and it appears that the orthogonal projection of a vector ##v## onto the subspace spanned by ##e_1,...,e_n## is given by $$\sum_j\langle e_j,v \rangle e_j$$ (the ##e_j## are orthonormal, so ignore the usual inner-product denominator for simplicity), but there is never a proof in the texts. It's always given by definition, or I see "trivial" next to the proof. Surely this is something we prove, right?
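
Not a proof, but here is a quick numerical sanity check of the formula (a minimal NumPy sketch; the orthonormal ##e_j## are manufactured with a QR factorization, which is just one convenient choice for the example): it forms ##\sum_j\langle e_j,v \rangle e_j## and confirms the residual is orthogonal to every ##e_j##.

```python
import numpy as np

rng = np.random.default_rng(0)

m, n = 6, 3                                        # ambient dimension, subspace dimension
E, _ = np.linalg.qr(rng.standard_normal((m, n)))   # columns e_1,...,e_n are orthonormal

v = rng.standard_normal(m)

# the formula from the question: sum_j <e_j, v> e_j
proj = sum(np.dot(E[:, j], v) * E[:, j] for j in range(n))

# the residual v - proj should be orthogonal to every e_j (up to round-off),
# which is what "orthogonal projection" means
print(np.allclose(E.T @ (v - proj), 0.0))   # True
```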
 
It really depends on what you're trying to do here, as there are lots of flavors of this. Suppose we have vectors in ##\mathbb C^m## with ##m \geq n## and use the standard inner product.

You have some vector ##\mathbf u## and you project it onto the subspace spanned by these ##n## mutually orthonormal vectors. You can do this explicitly with ##\mathbf v = P\mathbf u##, where ##P## is a projector -- i.e. ##P^2 = P## and ##P = P^*##.
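
For concreteness, here is a small NumPy sketch of this setup (assuming the projector is built as ##P = EE^*##, where the columns of ##E## are the orthonormal vectors -- one standard construction, shown here only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

m, n = 5, 2
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
E, _ = np.linalg.qr(A)            # columns of E are mutually orthonormal in C^m

P = E @ E.conj().T                # projector onto the column span of E

print(np.allclose(P @ P, P))      # P^2 = P  (idempotent)
print(np.allclose(P, P.conj().T)) # P = P*   (Hermitian)

u = rng.standard_normal(m) + 1j * rng.standard_normal(m)
v = P @ u                         # v is the projection of u onto the subspace
```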

So we have a generating set for this subspace and use it to write ##\mathbf v## as a linear combination of these vectors. (Note, ignoring the zero vector: if you have an inner product, orthogonality implies linear independence -- why?)

So

##\mathbf v = \sum_{j=1}^n \alpha_j \mathbf e_j##

Multiply on the left by ##\mathbf e_k^*## (conjugate transpose) to see

##\mathbf e_k^*\mathbf v = \mathbf e_k^*\sum_{j=1}^n \alpha_j \mathbf e_j = \sum_{j=1}^n \alpha_j \mathbf e_k^*\mathbf e_j = \alpha_k \mathbf e_k^*\mathbf e_k = \alpha_k##

since orthonormality gives ##\mathbf e_k^*\mathbf e_j = 0## whenever ##j \neq k## and ##\mathbf e_k^*\mathbf e_k = 1##.

Now rewrite this with general inner product notation, i.e. ##\alpha_k = \langle \mathbf e_k, \mathbf v \rangle## -- same result.
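
Tying this back to the original question: with ##\alpha_k = \langle \mathbf e_k, \mathbf u\rangle##, one standard way to see that ##\sum_j \langle \mathbf e_j, \mathbf u\rangle \mathbf e_j## is the orthogonal projection of a general ##\mathbf u## (not only of a ##\mathbf v## already lying in the subspace) is to check that the residual is orthogonal to each basis vector:

$$\left\langle \mathbf e_k,\ \mathbf u - \sum_{j=1}^n \langle \mathbf e_j, \mathbf u\rangle \mathbf e_j \right\rangle = \langle \mathbf e_k, \mathbf u\rangle - \sum_{j=1}^n \langle \mathbf e_j, \mathbf u\rangle\,\langle \mathbf e_k, \mathbf e_j\rangle = \langle \mathbf e_k, \mathbf u\rangle - \langle \mathbf e_k, \mathbf u\rangle = 0,$$

using ##\langle \mathbf e_k, \mathbf e_j\rangle = \delta_{kj}## and linearity in the second argument. A residual orthogonal to every ##\mathbf e_k## is orthogonal to the whole subspace, which is exactly the defining property of the orthogonal projection.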
 
StoneTemplePython said:
(Note: ignoring the zero vector: if you have an inner product, orthogonality implies linear independence, why?)
This is simple to show: take ##c_1v_1+...+c_nv_n = 0## and take the inner product with some ##v_j##. Orthogonality kills every term except ##c_j\langle v_j, v_j\rangle = 0##, and since ##v_j## is nonzero this forces ##c_j = 0##. This holds for every ##j##, hence linear independence.

And thanks! Good to see it.
 
