Symmetric and idempotent matrix = Projection matrix

In summary: yes. A symmetric matrix with ##A^2=A## has eigenvalues 0 or 1 and mutually orthogonal eigenvectors, and ##T(\vec{x})=A\vec{x}## is the orthogonal projection onto the span of the eigenvectors with eigenvalue 1.
  • #1
pyroknife

Homework Statement


Consider a symmetric n x n matrix ##A## with ##A^2=A##. Is the linear transformation ##T(\vec{x})=A\vec{x}## necessarily the orthogonal projection onto a subspace of ##R^n##?

Homework Equations


Symmetric matrix means ##A=A^T##

An orthogonal projection matrix is given by
##P=A(A^TA)^{-1}A^T## (1)

The Attempt at a Solution



We are given that ##A## is symmetric and idempotent. My procedure is to see if A satisfies equation (1).

Plugging ##A=A^2## into (1), we get
##A^2((A^2)^T A^2)^{-1}(A^2)^T##
 
  • #2
I feel like I accomplished nothing with my solution procedure.
 
  • #3
Just attempted it again:
P is an orthogonal projection matrix IFF it is symmetric and idempotent.
Let ##A## be an orthogonal projection matrix. Then ##A## can be written as
##A=B(B^TB)^{-1}B^T##
for a matrix ##B## whose column vectors form a basis for the column space of ##A##.

##A^T=(B(B^TB)^{-1}B^T)^T=B((B^TB)^{-1})^TB^T=B((B^TB)^T)^{-1}B^T=B(B^TB)^{-1}B^T=A##
##A^2=(B(B^TB)^{-1}B^T)(B(B^TB)^{-1}B^T)=B(B^TB)^{-1}(B^TB)(B^TB)^{-1}B^T=B(B^TB)^{-1}B^T=A##
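As a sanity check on this algebra, here is a minimal numerical sketch (numpy and the particular choice of ##B## are my own illustration, not part of the original argument):

```python
import numpy as np

# An arbitrary 4x2 matrix whose columns are linearly independent,
# so they span a 2-dimensional subspace of R^4.
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0],
              [2.0, 1.0]])

# Orthogonal projection onto the column space of B.
A = B @ np.linalg.inv(B.T @ B) @ B.T

# Symmetric and idempotent, up to floating-point error.
assert np.allclose(A, A.T)
assert np.allclose(A @ A, A)
```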
 
  • #4
pyroknife said:

Homework Statement


Consider a symmetric n x n matrix ##A## with ##A^2=A##. Is the linear transformation ##T(\vec{x})=A\vec{x}## necessarily the orthogonal projection onto a subspace of ##R^n##?

Homework Equations


Symmetric matrix means ##A=A^T##

An orthogonal projection matrix is given by
##P=A(A^TA)^{-1}A^T## (1)

The Attempt at a Solution



We are given that ##A## is symmetric and idempotent. My procedure is to see if A satisfies equation (1).

Plugging ##A=A^2## into (1), we get
##A^2((A^2)^T A^2)^{-1}(A^2)^T##

If ##A## is a symmetric matrix, what do you know about its eigenvectors? What does ##A^2=A## tell you about eigenvalues?
 
  • #5
Dick said:
If ##A## is a symmetric matrix, what do you know about its eigenvectors? What does ##A^2=A## tell you about eigenvalues?
There are ##n## eigenvectors, and they are mutually orthogonal to one another. The only possible eigenvalues of an idempotent matrix are either 0 or 1: if ##Av=\lambda v##, then ##\lambda v=Av=A^2v=\lambda^2 v##, so ##\lambda^2=\lambda##.

I am not really understanding how to make the connection between eigentheory and orthogonal projections. Does the fact that the eigenvectors are mutually orthogonal to one another indicate that this linear transformation is an orthogonal projection? If so, where do the eigenvalues come into play?
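Both of these facts are easy to check numerically; a minimal sketch (the matrix here is an arbitrary projection I construct purely for illustration):

```python
import numpy as np

# Build a symmetric idempotent matrix as Q D Q^T with D = diag(1, 1, 0),
# where Q is a random orthogonal matrix obtained from a QR decomposition.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 1.0, 0.0]) @ Q.T

# eigh handles symmetric matrices and returns an orthonormal
# set of eigenvectors (the columns of V).
vals, V = np.linalg.eigh(A)
print(np.round(vals, 10))               # eigenvalues: 0 and 1 only
print(np.allclose(V.T @ V, np.eye(3)))  # True: eigenvectors are orthonormal
```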
 
  • #6
pyroknife said:
There are ##n## eigenvectors, and they are mutually orthogonal to one another. The only possible eigenvalues of an idempotent matrix are either 0 or 1: if ##Av=\lambda v##, then ##\lambda v=Av=A^2v=\lambda^2 v##, so ##\lambda^2=\lambda##.

I am not really understanding how to make the connection between eigentheory and orthogonal projections. Does the fact that the eigenvectors are mutually orthogonal to one another indicate that this linear transformation is an orthogonal projection? If so, where do the eigenvalues come into play?

So if you use the eigenvectors as a basis what does the matrix of A look like? What's the definition of orthogonal projection you are using?
 
  • #7
Dick said:
So if you use the eigenvectors as a basis what does the matrix of A look like? What's the definition of orthogonal projection you are using?
Would the eigenvectors be used as a basis for the image of A? If so, then the columns of A are just the eigenvectors.

I am using this definition for orthogonal projection:
Let ##V## be a subspace of ##R^n## and ##\vec{x}## be a vector in ##R^n##.
##\vec{x}## can be decomposed into a component perpendicular to ##V## and a component parallel to ##V##. The orthogonal projection of ##\vec{x}## onto ##V## is ##T(\vec{x})=A\vec{x}=## the parallel component of ##\vec{x}##.
 
  • #8
pyroknife said:
Would the eigenvectors be used as a basis for the image of A? If so, then the columns of A are just the eigenvectors.

I am using this definition for orthogonal projection:
Let ##V## be a subspace of ##R^n## and ##\vec{x}## be a vector in ##R^n##.
##\vec{x}## can be decomposed into a component perpendicular to ##V## and a component parallel to ##V##. The orthogonal projection of ##\vec{x}## onto ##V## is ##T(\vec{x})=A\vec{x}=## the parallel component of ##\vec{x}##.

I would think it should be getting kind of obvious by now. Yes, the span of some of the eigenvectors is going to be the image of A. Which ones? You have eigenvectors with eigenvalue zero and eigenvectors with eigenvalue one.
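In the same numerical spirit as above, one can check which eigenvectors span the image (again my own illustrative construction, not part of the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q @ np.diag([1.0, 1.0, 0.0, 0.0]) @ Q.T  # rank-2 orthogonal projection

vals, V = np.linalg.eigh(A)
ones = V[:, np.isclose(vals, 1.0)]   # eigenvectors with eigenvalue 1
zeros = V[:, np.isclose(vals, 0.0)]  # eigenvectors with eigenvalue 0

# A fixes the eigenvalue-1 eigenvectors and annihilates the others,
# so the image of A is exactly the span of the eigenvalue-1 eigenvectors.
assert np.allclose(A @ ones, ones)
assert np.allclose(A @ zeros, 0.0)
```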
 
  • #9
If there are ##n## eigenvectors and A is an n x n matrix, doesn't that mean all of the eigenvectors make up the image of A?
 
  • #10
pyroknife said:
If there are ##n## eigenvectors and A is an n x n matrix, doesn't that mean all of the eigenvectors make up the image of A?

No! If ##v## is an eigenvector with eigenvalue zero, then ##Av=0##. Can ##v## be in the image of ##A## if you have a basis of eigenvectors?
 
  • #11
Dick said:
No! If ##v## is an eigenvector with eigenvalue zero, then ##Av=0##. Can ##v## be in the image of ##A## if you have a basis of eigenvectors?
ahh I see.
The eigenvector corresponding to eigenvalue zero can't be in the image of A.

Thus the span of the eigenvectors corresponding to an eigenvalue 1 will give the image of A.
 
  • #12
pyroknife said:
ahh I see.
The eigenvector corresponding to eigenvalue zero can't be in the image of A.

Thus the span of the eigenvectors corresponding to an eigenvalue 1 will give the image of A.

Ok, so finish it. Any vector ##w## can be written as ##w=a_1 v_1+a_2 v_2+...+a_n v_n## where the ##v_i## are the basis eigenvectors and all have eigenvalues zero or one. If you apply ##A## to that, which part is the parallel part and which part is the perpendicular part? This really shouldn't be all that challenging knowing what you know.
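A numerical rendering of this decomposition (the eigenbasis and eigenvalues below are an arbitrary choice of mine for illustration):

```python
import numpy as np

# Orthonormal eigenbasis (the columns of Q) with eigenvalues 1, 1, 0.
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 1.0, 0.0]) @ Q.T

w = rng.standard_normal(3)
a = Q.T @ w                                # coefficients of w in the eigenbasis
w_par = a[0] * Q[:, 0] + a[1] * Q[:, 1]    # eigenvalue-1 part
w_perp = a[2] * Q[:, 2]                    # eigenvalue-0 part

assert np.allclose(A @ w, w_par)           # A keeps the parallel part
assert np.allclose(w - A @ w, w_perp)      # ...and kills the perpendicular part
assert np.isclose(w_par @ w_perp, 0.0)     # the two parts are orthogonal
```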
 
  • #13
Dick said:
Ok, so finish it. Any vector ##w## can be written as ##w=a_1 v_1+a_2 v_2+...+a_n v_n## where the ##v_i## are the basis eigenvectors and all have eigenvalues zero or one. If you apply ##A## to that, which part is the parallel part and which part is the perpendicular part? This really shouldn't be all that challenging knowing what you know.
So any vector ##w## in ##R^n## can be written as ##w=a_1 v_1+a_2 v_2+...+a_n v_n##, and
##Aw=a_1 Av_1+a_2 Av_2+...+a_n Av_n##

The parallel parts are those with ##a_i Av_i\neq 0##,
and the perpendicular parts are those with ##a_i Av_i=0##.
Correct?
 
  • #14
pyroknife said:
So any vector ##w## in ##R^n## can be written as ##w=a_1 v_1+a_2 v_2+...+a_n v_n##, and
##Aw=a_1 Av_1+a_2 Av_2+...+a_n Av_n##

The parallel parts are those with ##a_i Av_i\neq 0##,
and the perpendicular parts are those with ##a_i Av_i=0##.
Correct?

If you had left the ##a_i##'s out I'd be tempted to agree. Look, isn't it pretty clear that the ##v##'s with eigenvalue one span the parallel part and the ##v##'s with eigenvalue zero span the perpendicular part? Please say yes.
 
  • #15
Dick said:
If you had left the ##a_i##'s out I'd be tempted to agree. Look, isn't it pretty clear that the ##v##'s with eigenvalue one span the parallel part and the ##v##'s with eigenvalue zero span the perpendicular part? Please say yes.
Yes, because the ##v##'s with eigenvalue one dotted with each column of ##A## would be nonzero.
And the ##v##'s with eigenvalue zero dotted with each column of ##A## would be zero, so they span the perpendicular part.
 
  • #16
pyroknife said:
Yes, because the ##v##'s with eigenvalue one dotted with each column of ##A## would be nonzero.
And the ##v##'s with eigenvalue zero dotted with each column of ##A## would be zero, so they span the perpendicular part.

Ok, so can you figure out how this fits in with your definition of orthogonal projection?
 
  • #17
Dick said:
Ok, so can you figure out how this fits in with your definition of orthogonal projection?
I think so.

So according to the definition in post #7, ##A\vec{x}## should yield the parallel component of ##\vec{x}## for any ##\vec{x}## in ##R^n##.
We have previously shown that the image of A is spanned by the eigenvectors corresponding to an eigenvalue of 1.
Thus ##A\vec{x}## will yield only the parallel part.
 
  • #18
pyroknife said:
I think so.

So according to the definition in post #7, ##A\vec{x}## should yield the parallel component of ##\vec{x}## for any ##\vec{x}## in ##R^n##.
We have previously shown that the image of A is spanned by the eigenvectors corresponding to an eigenvalue of zero.
Thus ##A\vec{x}## will yield only the parallel part.

You are doing a really good job of convincing me that, in spite of making some good statements and saying that you understand it, you really don't. What's wrong with the statement "We have previously shown that the image of A is spanned by the eigenvectors corresponding to an eigenvalue of zero"? Are you writing from a phone while doing something else?
 
  • #19
Dick said:
You are doing a really good job of convincing me that, in spite of making some good statements and saying that you understand it, you really don't. What's wrong with the statement "We have previously shown that the image of A is spanned by the eigenvectors corresponding to an eigenvalue of zero"?
Yes, I realized I made a mistake and edited it right as I posted this. Eigenvalue of one*
 
  • #20
pyroknife said:
Yes, I realized I made a mistake and edited it right as I posted this. Eigenvalue of one*

Alright. Sure, I do that too. Now, to reassure me, can you write a summary of why an idempotent symmetric matrix is a projection operator?
 
  • #21
Dick said:
Alright. Sure, I do that too. Now, to reassure me, can you write a summary of why an idempotent symmetric matrix is a projection operator?
Yes, before I do that, I just want to clarify something. In linear algebra, when we say projection, do we typically refer to orthogonal projection?
This website http://mathworld.wolfram.com/ProjectionMatrix.html makes a distinction between an orthogonal projection and a projection, but in other sources, it seems like when "projection" is used, they mean orthogonal projection.
 
  • #22
pyroknife said:
Yes, before I do that, I just want to clarify something. In linear algebra, when we say projection, do we typically refer to orthogonal projection?
This website http://mathworld.wolfram.com/ProjectionMatrix.html makes a distinction between an orthogonal projection and a projection, but in other sources, it seems like when "projection" is used, they mean orthogonal projection.

Projection just means that ##A^2=A##. If you have an inner product space then people may just say projection when they mean orthogonal projection. But they shouldn't.
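To make the distinction concrete, here is a small sketch of an idempotent matrix that is not symmetric, and hence a projection but not an orthogonal one (the specific matrix is my own example, not from the thread):

```python
import numpy as np

# Projection onto the x-axis along the direction (1, 1):
# idempotent, but not symmetric, i.e. an *oblique* projection.
A = np.array([[1.0, -1.0],
              [0.0,  0.0]])

assert np.allclose(A @ A, A)   # idempotent: it is a projection
print(np.allclose(A, A.T))     # False: not symmetric

# The residual x - Ax is not orthogonal to the image (the x-axis),
# so this projection is not an orthogonal projection.
x = np.array([1.0, 2.0])
print((x - A @ x) @ (A @ x))   # nonzero: -2.0
```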
 
  • #23
A symmetric idempotent matrix has eigenvalues that are either 0 or 1 (a property of idempotent matrices), and its eigenvectors are mutually orthogonal to one another (a property of symmetric matrices). The span of the eigenvectors corresponding to ##\lambda = 1## is ##Image(A)##.

Now let ##V## be this subspace of ##R^n## and ##\vec{x}## be a vector in ##R^n##. Then ##\vec{x}## can be written as a linear combination of the basis formed by the eigenvectors of ##A##, each with ##\lambda = 1## or ##0##:
##\vec{x}=c_1 v_1 + c_2 v_2 + ... + c_n v_n##, where the ##v_i##, ##i=1,2,...,n##, are the eigenvectors of ##A##
and the ##c_i## are scalars.

If we apply ##A## to ##\vec{x}##, we obtain
##A\vec{x}=A(c_1 v_1 + c_2 v_2 + ... + c_n v_n)=c_1 (Av_1) + c_2 (Av_2) + ... + c_n (Av_n)##.
For the ##v_i##'s corresponding to ##\lambda = 0## we see that ##Av_i=0\cdot v_i=\vec{0}##. Thus
##A\vec{x}## leaves only the parallel component of ##\vec{x}##, and so ##T(\vec{x})=A\vec{x}##, where ##A## is a symmetric idempotent matrix, is an orthogonal projection operator.
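An end-to-end numerical check of this summary (a minimal sketch; the construction of ##A## is my own arbitrary example):

```python
import numpy as np

# Any symmetric idempotent A; here, projection onto a plane in R^3.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([1.0, 1.0, 0.0]) @ Q.T

x = rng.standard_normal(3)
p = A @ x                             # the candidate "parallel component"

assert np.allclose(A @ p, p)          # p lies in the image of A
assert np.allclose(A @ (x - p), 0.0)  # the residual lies in the kernel,
assert np.isclose((x - p) @ p, 0.0)   # ...and is orthogonal to p
# So T(x) = Ax is indeed the orthogonal projection of x onto Image(A).
```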
 
  • #24
pyroknife said:
A symmetric idempotent matrix has eigenvalues that are either 0 or 1 (a property of idempotent matrices), and its eigenvectors are mutually orthogonal to one another (a property of symmetric matrices). The span of the eigenvectors corresponding to ##\lambda = 1## is ##Image(A)##.

Now let ##V## be this subspace of ##R^n## and ##\vec{x}## be a vector in ##R^n##. Then ##\vec{x}## can be written as a linear combination of the basis formed by the eigenvectors of ##A##, each with ##\lambda = 1## or ##0##:
##\vec{x}=c_1 v_1 + c_2 v_2 + ... + c_n v_n##, where the ##v_i##, ##i=1,2,...,n##, are the eigenvectors of ##A##
and the ##c_i## are scalars.

If we apply ##A## to ##\vec{x}##, we obtain
##A\vec{x}=A(c_1 v_1 + c_2 v_2 + ... + c_n v_n)=c_1 (Av_1) + c_2 (Av_2) + ... + c_n (Av_n)##.
For the ##v_i##'s corresponding to ##\lambda = 0## we see that ##Av_i=0\cdot v_i=\vec{0}##. Thus
##A\vec{x}## leaves only the parallel component of ##\vec{x}##, and so ##T(\vec{x})=A\vec{x}##, where ##A## is a symmetric idempotent matrix, is an orthogonal projection operator.

Good enough for me.
 

1. What is a symmetric matrix?

A symmetric matrix is a square matrix that is equal to its own transpose: the entry in row ##i##, column ##j## equals the entry in row ##j##, column ##i##. In other words, the matrix is symmetric about its main diagonal.

2. What is an idempotent matrix?

An idempotent matrix is a square matrix that, when multiplied by itself, results in the same matrix: ##A^2=A##. In other words, the matrix remains unchanged after being multiplied by itself.

3. What is a projection matrix?

A projection matrix is a square matrix that maps every vector onto a fixed subspace and leaves vectors already in that subspace unchanged. Applying it twice therefore gives the same result as applying it once.

4. How is a projection matrix related to symmetric and idempotent matrices?

An orthogonal projection matrix is both symmetric and idempotent: it is equal to its own transpose, and multiplying it by itself returns the same matrix. Conversely, as shown in this thread, every symmetric idempotent matrix is the orthogonal projection onto its image.

5. What are the applications of symmetric and idempotent matrices in science?

There are many applications of symmetric and idempotent matrices in science, including in statistics, physics, and computer science. These matrices are used in data analysis, linear transformations, and image processing, among others.
