# Homework Help: Symmetric and idempotent matrix = Projection matrix

1. Apr 12, 2015

### pyroknife

1. The problem statement, all variables and given/known data
Consider a symmetric n x n matrix $A$ with $A^2=A$. Is the linear transformation $T(\vec{x})=A\vec{x}$ necessarily the orthogonal projection onto a subspace of $R^n$?

2. Relevant equations
Symmetric matrix means $A=A^T$

An orthogonal projection matrix is given by
$P=A(A^TA)^{-1}A^T$ (1)

3. The attempt at a solution

We are given that $A$ is symmetric and idempotent. My procedure is to see if A satisfies equation (1).

Plugging $A=A^2$ into (1), we get
$A^2((A^2)^TA^2)^{-1}(A^2)^T$.

Last edited by a moderator: Apr 13, 2015
2. Apr 12, 2015

### pyroknife

I feel like I accomplished nothing with my solution procedure.

3. Apr 13, 2015

### pyroknife

Just attempted it again:

P is an orthogonal projection matrix IFF it is symmetric and idempotent.
Let $A$ be the orthogonal projection matrix. Then $A$ can be written as
$A=B(B^TB)^{-1}B^T$
for a matrix $B$ whose column vectors form a basis for the column space of $A$.

$A^T=(B(B^TB)^{-1}B^T)^T=B((B^TB)^{-1})^TB^T=B((B^TB)^T)^{-1}B^T=B(B^TB)^{-1}B^T=A$
$A^2=AA=(B(B^TB)^{-1}B^T)(B(B^TB)^{-1}B^T)=B(B^TB)^{-1}B^T=A$
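This algebra can be spot-checked numerically. The sketch below (not part of the thread's argument; it uses NumPy and an arbitrary full-column-rank matrix $B$) confirms that $B(B^TB)^{-1}B^T$ is both symmetric and idempotent:

```python
import numpy as np

# Hypothetical example: B is any matrix with linearly independent columns.
rng = np.random.default_rng(42)
B = rng.standard_normal((5, 2))

# A = B (B^T B)^{-1} B^T, the orthogonal projector onto col(B).
A = B @ np.linalg.inv(B.T @ B) @ B.T

assert np.allclose(A, A.T)    # symmetric: A^T = A
assert np.allclose(A @ A, A)  # idempotent: A^2 = A
```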

4. Apr 13, 2015

### Dick

If $A$ is a symmetric matrix, what do you know about its eigenvectors? What does $A^2=A$ tell you about eigenvalues?

5. Apr 13, 2015

### pyroknife

A symmetric $n \times n$ matrix has $n$ eigenvectors that are mutually orthogonal to one another. The only possible eigenvalues of an idempotent matrix are 0 and 1.

I am not really understanding how to make the connection between eigentheory and orthogonal projections. Does the fact that the eigenvectors are mutually orthogonal indicate that this linear transformation is an orthogonal projection? If so, where do the eigenvalues come into play?
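Both facts can be seen numerically. This sketch (an illustration, not from the thread; it builds a symmetric idempotent matrix as a projector onto a 2-dimensional subspace of $R^5$) checks that every eigenvalue is 0 or 1 and that the eigenvectors can be chosen mutually orthogonal:

```python
import numpy as np

# Build a symmetric idempotent matrix as a projector onto col(B).
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 2))
A = B @ np.linalg.inv(B.T @ B) @ B.T

# eigh is the right routine for symmetric matrices; it returns an
# orthonormal set of eigenvectors.
eigvals, eigvecs = np.linalg.eigh(A)

# Every eigenvalue satisfies lambda(lambda - 1) = 0, i.e. is 0 or 1 ...
assert np.allclose(eigvals * (eigvals - 1), 0)
# ... and the eigenvectors are mutually orthogonal (here orthonormal).
assert np.allclose(eigvecs.T @ eigvecs, np.eye(5))
```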

6. Apr 13, 2015

### Dick

So if you use the eigenvectors as a basis what does the matrix of A look like? What's the definition of orthogonal projection you are using?

7. Apr 13, 2015

### pyroknife

Would the eigenvectors be used as basis for the image of A? If so, then the columns of A are just the eigenvectors.

I am using this definition for orthogonal projection:
Let $V$ be a subspace of $R^n$ and $\vec{x}$ be a vector in $R^n$.
$\vec{x}$ can be decomposed into its perpendicular component (perpendicular to $V$) and its parallel component (parallel to $V$). The orthogonal projection of $\vec{x}$ onto $V$ is $T(\vec{x})=A\vec{x}=$ the parallel component of $\vec{x}$.

8. Apr 13, 2015

### Dick

I would think it should be getting kind of obvious by now. Yes, the span of some of the eigenvectors is going to be the image of A. Which ones? You have eigenvectors with eigenvalue zero and eigenvectors with eigenvalue one.

9. Apr 13, 2015

### pyroknife

If there are $n$ eigenvectors and A is an n x n matrix, doesn't that mean all of the eigenvectors make up the image of A?

10. Apr 13, 2015

### Dick

No! If $v$ is an eigenvector with eigenvalue zero, then $Av=0$. Can $v$ be in the image of $A$ if you have a basis of eigenvectors?

11. Apr 13, 2015

### pyroknife

ahh I see.
The eigenvector corresponding to eigenvalue zero can't be in the image of A.

Thus the span of the eigenvectors corresponding to an eigenvalue 1 will give the image of A.
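This conclusion can be verified numerically. The sketch below (an assumed setup, not from the thread) checks that $A$ fixes the eigenvalue-1 eigenvectors, kills the eigenvalue-0 eigenvectors, and that the dimension of the image equals the number of eigenvalue-1 eigenvectors:

```python
import numpy as np

# A symmetric idempotent matrix: the projector onto a 3-dim subspace of R^6.
rng = np.random.default_rng(1)
B = rng.standard_normal((6, 3))
A = B @ np.linalg.inv(B.T @ B) @ B.T

eigvals, eigvecs = np.linalg.eigh(A)
ones = eigvecs[:, np.isclose(eigvals, 1)]   # eigenvalue-1 eigenvectors
zeros = eigvecs[:, np.isclose(eigvals, 0)]  # eigenvalue-0 eigenvectors

# A fixes the eigenvalue-1 vectors and annihilates the eigenvalue-0
# vectors, so im(A) = span of the eigenvalue-1 eigenvectors.
assert np.allclose(A @ ones, ones)
assert np.allclose(A @ zeros, 0)
assert np.linalg.matrix_rank(A) == ones.shape[1]
```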

12. Apr 13, 2015

### Dick

Ok, so finish it. Any vector $w$ can be written as $w=a_1 v_1+a_2 v_2+...+a_n v_n$ where the $v_i$ are the basis eigenvectors and all have eigenvalues zero or one. If you apply $A$ to that, which part is the parallel part and which part is the perpendicular part? This really shouldn't be all that challenging knowing what you know.

13. Apr 13, 2015

### pyroknife

So any vector $w$ in $R^n$ can be written as $w=a_1 v_1+a_2 v_2+...+a_n v_n$
$A w=a_1 (A v_1)+a_2 (A v_2)+...+a_n (A v_n)$

The parallel parts are the parts such that $A a_i v_i\neq 0$
and the perpendicular parts are $A a_i v_i=0$.
Correct?

14. Apr 13, 2015

### Dick

If you had left the $a_i$'s out I'd be tempted to agree. Look, isn't it pretty clear that the $v$'s with eigenvalue one span the parallel part and the $v$'s with eigenvalue zero span the perpendicular part?

15. Apr 13, 2015

### pyroknife

Yes because the $v$'s with eigenvalue one dotted with each column of $A$ would be nonzero.
And the $v$'s with eigenvalue zero dotted with each column of $A$ would be zero, so they span the perpendicular part.

16. Apr 13, 2015

### Dick

Ok, so can you figure out how this fits in with your definition of orthogonal projection?

17. Apr 13, 2015

### pyroknife

I think so.

So according to the definition in post #7, $A\vec{x}$ should yield the parallel component of $\vec{x}$ for some $\vec{x}$ in $R^n$.
We have previously shown that the image of A is spanned by the eigenvectors corresponding to an eigenvalue of 1.
Thus $A\vec{x}$ will yield only the parallel part.
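The claim that $A\vec{x}$ is exactly the parallel component matches the definition in post #7, which can be illustrated numerically (an assumed example, not from the thread): $A\vec{x}$ lies in $im(A)$, and the residual $\vec{x}-A\vec{x}$ is perpendicular to it.

```python
import numpy as np

# A symmetric idempotent matrix, built as a projector onto col(B).
rng = np.random.default_rng(2)
B = rng.standard_normal((5, 2))
A = B @ np.linalg.inv(B.T @ B) @ B.T
x = rng.standard_normal(5)

parallel = A @ x       # candidate parallel component of x
perp = x - parallel    # candidate perpendicular component of x

# parallel lies in im(A) (A fixes it), and perp is orthogonal to im(A).
assert np.allclose(A @ parallel, parallel)
assert np.allclose(B.T @ perp, 0)
assert np.isclose(parallel @ perp, 0)
```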

18. Apr 13, 2015

### Dick

You are doing a really good job of convincing me that, in spite of making some good statements and saying that you understand it, you really don't. What's wrong with the statement "We have previously shown that the image of A is spanned by the eigenvectors corresponding to an eigenvalue of zero"? Are you writing from a phone while doing something else?

19. Apr 13, 2015

### pyroknife

Yes, I realized I made a mistake and edited it right as I posted this. Eigenvalue of one*

20. Apr 13, 2015

### Dick

Alright. Sure, I do that too. Now, to reassure me, can you write a summary of why an idempotent symmetric matrix is a projection operator?

21. Apr 13, 2015

### pyroknife

Yes, before I do that, I just want to clarify something. In linear algebra, when we say projection, do we typically refer to orthogonal projection?
This website http://mathworld.wolfram.com/ProjectionMatrix.html makes a distinction between an orthogonal projection and a projection, but in other sources, it seems like when "projection" is used, they mean orthogonal projection.

22. Apr 13, 2015

### Dick

Projection just means that $A^2=A$. If you have an inner product space then people may just say projection when they mean orthogonal projection. But they shouldn't.
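The distinction can be made concrete with a small example (not from the thread): a matrix that is idempotent but not symmetric is a projection, yet not an orthogonal one, because it projects along a direction that is not perpendicular to its image.

```python
import numpy as np

# An oblique projection: idempotent but not symmetric.
A = np.array([[1.0, 1.0],
              [0.0, 0.0]])

assert np.allclose(A @ A, A)     # idempotent: a projection
assert not np.allclose(A, A.T)   # not symmetric: not orthogonal

# It projects onto span{(1,0)}, but along the direction (1,-1):
x = np.array([0.0, 1.0])
assert np.allclose(A @ x, [1.0, 0.0])
residual = x - A @ x             # (-1, 1), not orthogonal to the image
assert not np.isclose(residual @ np.array([1.0, 0.0]), 0)
```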

23. Apr 13, 2015

### pyroknife

A symmetric idempotent matrix has eigenvalues that are either 0 or 1 (properties of an idempotent matrix) and their corresponding eigenvectors are mutually orthogonal to one another (properties of symmetric matrix). The span of the eigenvectors corresponding to $\lambda = 1$ yields $Image(A)$.

Now let $V$ be a subspace of $R^n$ and $\vec{x}$ be a vector in $R^n$. $\vec{x}$ can be written as a linear combination of the basis formed by the eigenvectors of $A$, each with corresponding $\lambda = 1$ or $0$:
$\vec{x}=c_1 v_1 + c_2 v_2 + \dots + c_m v_m$ where the $v_i$, $i=1,2,\dots,m$, are the eigenvectors of $A$ corresponding to eigenvalue 1 or 0, and the $c_i$ are scalars.

If we apply $A$ to $\vec{x}$, we obtain:
$A\vec{x}=A(c_1 v_1 + c_2 v_2 + \dots + c_m v_m)=c_1 (Av_1) + c_2 (Av_2) + \dots + c_m (Av_m)$.
For the $v_i$'s corresponding to $\lambda = 0$ we see that $Av_i=0 \cdot v_i=0$, while for those corresponding to $\lambda = 1$ we have $Av_i=v_i$. Thus
$A\vec{x}$ leaves only the parallel component of $\vec{x}$, and so $T(\vec{x})=A\vec{x}$, where $A$ is a symmetric idempotent matrix, is an orthogonal projection operator.
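The whole summary can be run end to end (an illustrative sketch, not from the thread): build $A$ directly from an orthonormal eigenbasis with eigenvalues 0 and 1, and confirm it is symmetric, idempotent, and equal to the orthogonal projector $B(B^TB)^{-1}B^T$ onto the span of the eigenvalue-1 eigenvectors.

```python
import numpy as np

# Choose an orthonormal basis of R^4 (via QR), assign eigenvalues
# 1, 1, 0, 0, and build A = Q diag(lam) Q^T.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
lam = np.array([1.0, 1.0, 0.0, 0.0])
A = Q @ np.diag(lam) @ Q.T

# B holds the eigenvalue-1 eigenvectors; P is the projector onto span(B).
B = Q[:, :2]
P = B @ np.linalg.inv(B.T @ B) @ B.T

assert np.allclose(A, A.T)    # symmetric
assert np.allclose(A @ A, A)  # idempotent
assert np.allclose(A, P)      # equals the orthogonal projector onto im(A)
```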

24. Apr 13, 2015

### Dick

Good enough for me.