# Trying to find the matrix of the projection w.r.t. the Spectral Theorem

1. Nov 30, 2012

### trap101

Verify the spectral theorem for the symmetric matrix by finding an orthonormal basis of the appropriate vector space, the change of basis matrix to this basis, and the spectral decomposition.

Well I've found everything else. We started with the matrix A.

$$A = \begin{bmatrix} 2 & 3 \\ 3 & 2 \end{bmatrix}$$

Change of basis matrix obtained w.r.t. the eigenvalues -1 and 5:

$$\begin{bmatrix} -\sqrt{2}/2 & \sqrt{2}/2 \\ \sqrt{2}/2 & \sqrt{2}/2 \end{bmatrix}$$

The basis from the eigenvalues was:

eigenvalue (-1): (-√2/2, √2/2); eigenvalue (5): (√2/2, √2/2)

Now I know to obtain the projection you usually use:

$\sum_i \langle x, w_i \rangle w_i$, where x is the vector you're looking to project and the $w_i$ are the basis vectors.

I've been at it for an hour and I can't figure out the matrix of the projection in order to write out the decomposition of matrix A. How do I get those coefficients? Please, before I jump off a cliff.

Thanks
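(Editorial aside, not part of the original thread: the setup in this post can be checked numerically. The sketch below, assuming numpy is available, recovers the eigenvalues -1 and 5 and an orthonormal eigenbasis for A.)

```python
import numpy as np

# The symmetric matrix from the problem statement.
A = np.array([[2.0, 3.0],
              [3.0, 2.0]])

# For symmetric matrices, numpy.linalg.eigh returns eigenvalues in
# ascending order and orthonormal eigenvectors as the columns of `vecs`.
vals, vecs = np.linalg.eigh(A)
print(vals)   # eigenvalues: -1 and 5
print(vecs)   # columns: the orthonormal eigenvectors (signs may differ)
```

The columns agree with the basis quoted above up to an overall sign, which is the usual ambiguity in choosing unit eigenvectors.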

2. Nov 30, 2012

### HallsofIvy

Staff Emeritus
The matrix projecting R^n onto an m-dimensional subspace has "1" as an eigenvalue of multiplicity m and "0" as an eigenvalue of multiplicity n-m. Further, any vector in that subspace is an eigenvector corresponding to eigenvalue 1, and any vector orthogonal to it is an eigenvector corresponding to eigenvalue 0.

So, to find a matrix, find an orthonormal basis for the m-dimensional invariant subspace, then use "Gram-Schmidt" to extend it to an orthonormal basis for all of R^n. In terms of that basis, the matrix, A, is diagonal with m "1"s on the diagonal and the rest "0". To get the matrix in terms of the "standard basis", construct the matrix U having those orthonormal vectors you found as columns. Then $UAU^{-1}$ will be the matrix giving the same linear transformation but in terms of the standard basis.
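(Editorial aside, not part of the original thread: here is a quick numpy sketch of the $UDU^{-1}$ recipe just described, applied to the one-dimensional subspace spanned by the eigenvector (-√2/2, √2/2) from this problem. The second column of U completes the orthonormal basis; in general Gram-Schmidt would produce it.)

```python
import numpy as np

s = np.sqrt(2) / 2
# Columns: an orthonormal basis of R^2.  The first column spans the
# target subspace; the second completes the basis (here it happens to
# be the other eigenvector of A).
U = np.array([[-s, s],
              [ s, s]])

# In this basis the projection is diagonal: 1 on the subspace, 0 off it.
D = np.diag([1.0, 0.0])

# Change back to the standard basis.  (U is orthogonal, so U^{-1} = U^T,
# but inv() follows the formula in the post literally.)
P = U @ D @ np.linalg.inv(U)
print(P)   # P == [[0.5, -0.5], [-0.5, 0.5]]
```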

3. Nov 30, 2012

### Dick

Just work it out. If the projection operator is (w.x)w, with column vectors w=(w1,w2)^T and x=(x1,x2)^T, then (w.x)w = ((x1w1+x2w2)w1, (x1w1+x2w2)w2)^T. Looks to me like the matrix of the projection is [[w1w1, w1w2], [w1w2, w2w2]]. If you act with that on (x1,x2)^T you get what you want, yes? In more abstract notation it's just the product ww^T: ww^T x = w(w^T x) = w(w.x), right?

Last edited: Nov 30, 2012
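(Editorial aside, not part of the original thread: the outer-product identity ww^T x = (w.x)w in the post above can be checked directly. A minimal numpy sketch, using the eigenvector for -1 and an arbitrary test vector x:)

```python
import numpy as np

s = np.sqrt(2) / 2
w = np.array([[-s], [s]])    # column vector: the eigenvector for -1

# The matrix of the projection onto span{w} is the outer product w w^T.
P = w @ w.T                  # equals [[w1*w1, w1*w2], [w1*w2, w2*w2]]

# Acting on any x gives (w.x) w, as claimed:
x = np.array([[1.0], [2.0]])
lhs = P @ x                  # matrix acting on x
rhs = float(w.T @ x) * w     # (w.x) times w
print(lhs, rhs)              # the same vector
```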
4. Nov 30, 2012

### trap101

Yes, that is the matrix of the projection according to the solution: [[w1w1, w1w2], [w1w2, w2w2]], but I still can't get it. What are you using as your x? The only vectors I have are the two eigenvectors. Do I put those into the inner product? Because when I do, I don't get that solution.

5. Nov 30, 2012

### Dick

I'm not sure I see the problem. You have the orthonormal eigenvectors. Compute the projection matrix for each one. Multiply them by the eigenvalues and sum them. Hence verify the spectral theorem. Seems straightforward. What are the two projection matrices? Well, let's start with the projection matrix for the eigenvector (-√2/2, √2/2). What's that? It's not hard; you have a formula for it.

Last edited: Nov 30, 2012
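(Editorial aside, not part of the original thread: the full verification described in the post above, written out as a numpy sketch. Each projection is an outer product of a unit eigenvector with itself; weighting by the eigenvalues and summing should recover A.)

```python
import numpy as np

s = np.sqrt(2) / 2
w1 = np.array([-s, s])   # unit eigenvector for eigenvalue -1
w2 = np.array([ s, s])   # unit eigenvector for eigenvalue  5

# Projection matrices onto the two eigenspaces.
P1 = np.outer(w1, w1)    # [[0.5, -0.5], [-0.5, 0.5]]
P2 = np.outer(w2, w2)    # [[0.5,  0.5], [ 0.5, 0.5]]

# Spectral theorem: A = (-1)*P1 + 5*P2.
A_rebuilt = -1.0 * P1 + 5.0 * P2
print(A_rebuilt)         # equals [[2, 3], [3, 2]], the original A
```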
6. Nov 30, 2012

### trap101

OK, so I figured out the matrix, but what do the standard basis vectors e1 and e2 have to do with anything that I did? I thought the basis I was using was the eigenbasis, so why did the "standard basis" come into this and allow for my matrix of the projection?

7. Nov 30, 2012

### Dick

Your original matrix is in the standard basis and your eigenvectors are in the standard basis. Just stick with that. You can change the basis if you want, it will still work, but you don't have to. The spectral theorem is true in any basis. So just give the two projection matrices in the standard basis. Please?

Last edited: Nov 30, 2012
8. Mar 21, 2013

### wangchong

This is also called the Spectral Decomposition: you write the matrix A as $$\lambda_1 P_1 + \lambda_2 P_2$$ where $P_1$ and $P_2$ are two projection matrices (onto the eigenspaces), and they commute.

Last edited: Mar 21, 2013
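(Editorial aside, not part of the original thread: the structural properties mentioned in the last post can be checked too. For this A the two eigenspace projections are orthogonal to each other, so they commute trivially, and together they resolve the identity. A numpy sketch:)

```python
import numpy as np

s = np.sqrt(2) / 2
P1 = np.outer([-s, s], [-s, s])   # projection onto the (-1)-eigenspace
P2 = np.outer([ s, s], [ s, s])   # projection onto the 5-eigenspace

print(P1 @ P2)    # zero matrix: orthogonal projections, so P1 P2 = P2 P1
print(P1 + P2)    # identity matrix: the eigenspaces span all of R^2
```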