Find matrix corresponding to Linear Transformation (Orthogonal Projection)


Homework Help Overview

The discussion revolves around the properties of linear operators, specifically orthogonal projections in the context of linear algebra. The original poster explores the conditions under which a linear operator defined on R^n can be represented by a matrix of the form vv^T, where v is a unit vector. The focus is on establishing the existence of such a vector when the rank of the projection is 1.

Discussion Character

  • Conceptual clarification, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the definition of orthogonal projections and the implications of rank 1 on the image of the operator. Questions arise about the existence of a unit vector v and how to demonstrate that L maps vectors in R^n consistently with the projection defined by vv^T. The use of the rank-nullity theorem and the relationship between the kernel and the orthogonal complement of Span(v) are also explored.

Discussion Status

Participants are actively engaging with the problem, raising questions about the relationships between different vector spaces and the properties of the linear operator. Some suggest using the rank-nullity theorem to analyze the dimensions of the kernel and the image, while others propose examining specific vectors to illustrate the projection properties. There is a productive exchange of ideas, but no consensus has been reached on the final argument structure.

Contextual Notes

Constraints include the requirement to show the existence of a unit vector v and the implications of the rank of the projection. Participants are also considering the dimensional relationships between the kernel and the orthogonal complement of Span(v), which are critical to the discussion.

nlykkei
A linear operator L: R^n → R^n is called a projection if L^2 = L. A projection L is an orthogonal projection if ker L is orthogonal to the image L(R^n).

I've shown that the only invertible projection is the identity map I_{R^n}, by composing the identity L^2 = L with L^{-1}.

Question: Now suppose that L is an orthogonal projection whose image L(R^n) has dimension 1 (i.e., L has rank 1). Show that there exists a unit vector v such that L is given by the matrix vv^T. Conclude that L is the orthogonal projection onto S = Span(v).

I know I can write the orthogonal projection P_S: R^n → S as uu^T using an orthonormal basis {u} for S. And since v is a unit vector, {v} is an orthonormal basis for S, so P_S = vv^T. So the only thing left to show is that there exists such a vector v with the stated property?

http://math.stackexchange.com/quest...r-v-such-that-l-is-defined-by-the-matrix-v-vt
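As a quick numerical sanity check, outside the proof itself, the defining properties can be verified for a matrix of the form vv^T; the vectors below are arbitrary choices for illustration:

```python
import numpy as np

# An arbitrary nonzero vector in R^3, normalized to a unit vector v
v = np.array([1.0, 2.0, 2.0])
v /= np.linalg.norm(v)

P = np.outer(v, v)  # the matrix v v^T

# P is a projection: P^2 = P
print(np.allclose(P @ P, P))  # True

# P is an orthogonal projection: for any w, the image part P w is
# orthogonal to the kernel part (I - P) w.
w = np.array([3.0, -1.0, 4.0])
print(np.isclose((P @ w) @ ((np.eye(3) - P) @ w), 0.0))  # True
```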
 
nlykkei said:
A linear operator L: R^n → R^n is called a projection if L^2 = L. A projection L is an orthogonal projection if ker L is orthogonal to the image L(R^n).

I've shown that the only invertible projection is the identity map I_{R^n}, by composing the identity L^2 = L with L^{-1}.

Question: Now suppose that L is an orthogonal projection whose image L(R^n) has dimension 1 (i.e., L has rank 1). Show that there exists a unit vector v such that L is given by the matrix vv^T. Conclude that L is the orthogonal projection onto S = Span(v).

I know I can write the orthogonal projection P_S: R^n → S as uu^T using an orthonormal basis {u} for S. And since v is a unit vector, {v} is an orthonormal basis for S, so P_S = vv^T. So the only thing left to show is that there exists such a vector v with the stated property?

http://math.stackexchange.com/quest...r-v-such-that-l-is-defined-by-the-matrix-v-vt

If L has rank 1, then L(R^n) is one-dimensional, so it is spanned by a single vector w. Let v = w/|w|. Isn't that a candidate for the vector you want?
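This construction is easy to try out numerically; the vector w below is a made-up stand-in for one spanning the one-dimensional image L(R^n):

```python
import numpy as np

# Hypothetical spanning vector w of the one-dimensional image L(R^n)
w = np.array([2.0, -2.0, 1.0])
v = w / np.linalg.norm(w)  # the candidate unit vector

print(np.isclose(np.linalg.norm(v), 1.0))  # True: v is a unit vector
# span(v) = span(w), since v is a scalar multiple of w; in particular
# w is fixed by the projection v v^T.
print(np.allclose(np.outer(v, v) @ w, w))  # True
```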
 
Indeed it is. But how do I show that L and vv^T map every vector in R^n to the same image?

I know I can write a vector h = i + j, where i lies in Span(v) and j lies in the orthogonal complement of Span(v).

Then L(h) = L(i) + L(j). Now i = cv, and if u is a vector with L(u) = v, then v = L(u) = L(L(u)) = L(v), so v is mapped to itself; hence L(i) = cL(v) = cv = i.

Thus L(h) = i + L(j). But how do I show that L(j) = 0? Then I would have shown that every vector is mapped to its orthogonal projection onto Span(v)!

If you know, please tell me; I have given this a lot of thought.
 
How about using the rank-nullity theorem to determine dim(ker L)?
 
dim(ker L) = n - 1, since dim(L(R^n)) = 1. But this doesn't show that L is given by vv^T for some vector v in L(R^n)?

In the notation from above, how can I use this to conclude L(j) = 0 ?

Also, I know ker L is a subset of the orthogonal complement of span(v). But how do I show the opposite inclusion?
 
I was thinking that since the orthogonal complement of span(v) is (n-1)-dimensional, and ker L is an (n-1)-dimensional subspace of that, they have to be the same.
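Spelling the dimension count out:

```latex
\ker L \subseteq \operatorname{span}(v)^{\perp},
\qquad
\dim(\ker L) = n - \operatorname{rank} L = n - 1
  = \dim\bigl(\operatorname{span}(v)^{\perp}\bigr),
```

and a subspace whose dimension equals that of the ambient space must be the whole space, so ker L = span(v)^⊥.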
 
Ah, yes: ker L is a subset of S := the orthogonal complement of span(v), and dim(ker L) = dim(S) implies ker L = S?

So u in S implies L(u) = 0, and therefore every w in R^n is mapped to the orthogonal projection of w onto span(v)!

And the matrix of this transformation is indeed vv^T (by a well-known theorem).

Thanks so much !
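The finished claim, that vv^T sends every vector w to its orthogonal projection (v·w)v onto span(v), can be illustrated with a small randomized check (the seed and dimension below are arbitrary):

```python
import numpy as np

# Build P = v v^T for an arbitrary unit vector v in R^4 and check that
# every sampled w is sent to its orthogonal projection (v . w) v.
rng = np.random.default_rng(0)
v = rng.standard_normal(4)
v /= np.linalg.norm(v)
P = np.outer(v, v)

for _ in range(5):
    w = rng.standard_normal(4)
    proj = np.dot(v, w) * v  # the orthogonal projection of w onto span(v)
    assert np.allclose(P @ w, proj)
print("P w equals (v . w) v for all sampled w")
```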
 
nlykkei said:
Ah, yes: ker L is a subset of S := the orthogonal complement of span(v), and dim(ker L) = dim(S) implies ker L = S?
Right. The idea is that if there's a u in S that isn't a linear combination of n-1 linearly independent vectors from ker L, then S is at least n-dimensional. So if we have also proved that S is (n-1)-dimensional, we have a contradiction, and can conclude that every u in S is a linear combination of vectors in ker L. This implies that S is a subset of ker L.

The problem statement tells us that ker L is a subset of S, so now (assuming that we know that ker L is a subspace) we know that S=ker L.
 
