Find matrix corresponding to Linear Transformation (Orth. Projection)

nlykkei
A linear operator L: R^n → R^n is called a projection if L^2 = L. A projection L is an orthogonal projection if ker L is orthogonal to L(R^n).

I've shown that the only invertible projection is the identity map I_{R^n}, by composing the identity L^2 = L with L^(-1).

Question: Now suppose that L is an orthogonal projection whose image L(R^n) has rank 1. Show there exists a unit vector v such that L is given by the matrix vv^T. Conclude that L is the orthogonal projection onto S = Span(v).

I know I can write the orthogonal projection P_S: R^n → S as uu^T using an orthonormal basis {u} for S. And since v is a unit vector, {v} is an orthonormal basis for S, so P_S = vv^T. So the only thing left to show is that such a unit vector v exists with the stated property?
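As a quick numerical sanity check (a sketch with a hypothetical unit vector v in R^3, using numpy), the matrix vv^T does exactly what P_S should: it fixes Span(v), annihilates vectors orthogonal to v, and is idempotent:

```python
import numpy as np

# Hypothetical example: v is a unit vector in R^3, P = v v^T.
v = np.array([1.0, 2.0, 2.0])
v = v / np.linalg.norm(v)      # normalize so {v} is an orthonormal basis for S
P = np.outer(v, v)             # the matrix v v^T

# P fixes v (and hence all of S = Span(v)) ...
assert np.allclose(P @ v, v)
# ... and kills any vector orthogonal to v
j = np.array([2.0, -1.0, 0.0])  # j . v = 0 by construction
assert np.allclose(P @ j, np.zeros(3))
# P is a projection: P^2 = P
assert np.allclose(P @ P, P)
```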

http://math.stackexchange.com/quest...r-v-such-that-l-is-defined-by-the-matrix-v-vt
 
nlykkei said:
A linear operator L:Rn→Rn is called a projection if L^2=L. A projection L is an orthogonal projection if ker L is orthogonal to L(Rn).


If L has rank 1, then L(R^n) is one-dimensional, so it's spanned by a single vector w. Let v = w/|w|. Isn't that a candidate for the vector you want?
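In code, the suggestion looks like this (with a hypothetical spanning vector w; note that the candidate matrix vv^T does not depend on which spanning vector of the line we normalize, even one with a negative scale factor):

```python
import numpy as np

# Sketch: if L(R^n) = Span(w), set v = w/|w|.
w = np.array([3.0, 0.0, 4.0])        # hypothetical spanning vector of the image
v = w / np.linalg.norm(w)            # the candidate unit vector
P = np.outer(v, v)                   # the candidate matrix v v^T

# A different spanning vector of the same line yields the same matrix.
w2 = -2.5 * w
v2 = w2 / np.linalg.norm(w2)
assert np.allclose(np.outer(v2, v2), P)
```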
 
Indeed it is. But how do I show that every vector in R^n is mapped to the same thing by L and by vv^T?

I know I can write any vector h = i + j, where i lies in Span(v) and j lies in the orthogonal complement of Span(v).

Then L(h) = L(i) + L(j). Now i = cv, and since v lies in the image, v = L(u) for some u, so L(v) = L(L(u)) = L(u) = v. Thus v is mapped to itself, and hence L(i) = c L(v) = cv = i.

Thus L(h) = i + L(j). But how do I show that L(j) = 0? Then I would have shown that every vector is mapped to its orthogonal projection onto Span(v)!
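The decomposition being used here can be illustrated numerically (with a hypothetical v and h; P = vv^T plays the role the thread is trying to establish for L):

```python
import numpy as np

# Illustration of the decomposition h = i + j with i in Span(v), j orthogonal to v.
v = np.array([0.0, 1.0, 0.0])        # hypothetical unit vector
h = np.array([3.0, 5.0, -2.0])       # hypothetical test vector

i = (v @ h) * v                      # component of h along v: (v . h) v
j = h - i                            # remainder, orthogonal to v
assert np.isclose(v @ j, 0.0)        # j really lies in the orthogonal complement

# The orthogonal projection onto Span(v) keeps i and annihilates j.
P = np.outer(v, v)
assert np.allclose(P @ h, i)
```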

Please tell me if you know; I have given this a lot of thought.
 
How about using the rank-nullity theorem to determine dim(ker L)?
 
dim(ker L) = n - 1, since dim(L(R^n)) = 1. But this doesn't show that L is given by vv^T for some vector v in L(R^n)?
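A small rank-nullity sanity check (with a hypothetical rank-1 orthogonal projection in R^4):

```python
import numpy as np

# Rank-nullity: dim(ker L) = n - rank(L) for a rank-1 projection in R^4.
n = 4
v = np.ones(n) / np.sqrt(n)          # hypothetical unit vector
L = np.outer(v, v)                   # rank-1 orthogonal projection

rank = np.linalg.matrix_rank(L)
nullity = n - rank                   # dimension of ker L
assert rank == 1
assert nullity == n - 1
```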

In the notation from above, how can I use this to conclude L(j) = 0?

Also, I know ker L is a subset of the orthogonal complement of Span(v). But how do I show the reverse inclusion?
 
I was thinking that since the orthogonal complement of Span(v) is (n-1)-dimensional, and ker L is an (n-1)-dimensional subspace of it, they have to be the same.
 
Ahh, yeah: we have that ker L is a subset of S := orthogonal complement of Span(v), and dim(ker L) = dim(S) implies ker L = S?

So u in S implies L(u) = 0, and therefore every w in R^n is mapped by L to the orthogonal projection of w onto Span(v)!

And the matrix for this transformation is indeed vv^T (using the well-known theorem from my first post).
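Putting the whole argument through a final numerical check (a sketch with a random hypothetical unit vector in R^5): L = vv^T sends every w to (v . w) v, the orthogonal projection of w onto Span(v), and w - L(w) is orthogonal to v:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5)
v /= np.linalg.norm(v)               # hypothetical unit vector
L = np.outer(v, v)                   # the claimed matrix of the projection

for _ in range(3):
    w = rng.standard_normal(5)
    # L w equals the orthogonal projection (v . w) v of w onto Span(v) ...
    assert np.allclose(L @ w, (v @ w) * v)
    # ... and the residual w - L w lies in the orthogonal complement of Span(v)
    assert np.isclose(v @ (w - L @ w), 0.0)
```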

Thanks so much!
 
nlykkei said:
Ahh, yeah: we have that ker L is a subset of S := orthogonal complement of Span(v), and dim(ker L) = dim(S) implies ker L = S?
Right. The idea is that if there were a u in S that is not a linear combination of the n-1 linearly independent vectors spanning ker L, then S would be at least n-dimensional. Since we have also proved that S is (n-1)-dimensional, that's a contradiction, so every u in S is a linear combination of vectors in ker L. This implies that S is a subset of ker L.

The problem statement tells us that ker L is a subset of S, so now (assuming that we know that ker L is a subspace) we know that S=ker L.
 