# Theorem about symmetric matrices?

1. Jan 15, 2005

### PBRMEASAP

If you have a symmetric, nonsingular matrix $$A$$, is it always possible to find a matrix $$B$$ such that

$$B^T A B = 1$$,

where $$1$$ is the identity?

2. Jan 15, 2005

### Hurkyl

Staff Emeritus
I think there's a theorem that says symmetric matrices are diagonalizable. That should help.

3. Jan 15, 2005

### PBRMEASAP

Okay, I think I see what you mean.

Since $$A$$ can be written as $$P^T D P$$, where $$P$$ is orthogonal and $$D$$ is diagonal, my problem reduces to finding a matrix $$C = PB$$ such that $$C^T D C = 1$$, which should not be a problem. Do I have the right idea?
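As a quick numerical sanity check of this reduction (a sketch, assuming $$A$$ is positive definite so that $$D$$ has positive diagonal entries; the matrix values below are hypothetical): with $$A = P^T D P$$, taking $$C = D^{-1/2}$$, i.e. $$B = P^T D^{-1/2}$$, gives $$B^T A B = 1$$.

```python
import numpy as np

# A hypothetical positive-definite symmetric matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# eigh factors A = V @ diag(w) @ V.T with V orthogonal
# (so V.T plays the role of P in the thread's notation).
w, V = np.linalg.eigh(A)

# B = V @ D^{-1/2} should satisfy B.T @ A @ B = I when all w > 0.
B = V @ np.diag(1.0 / np.sqrt(w))

print(np.allclose(B.T @ A @ B, np.eye(2)))  # True
```

If $$A$$ had negative eigenvalues, the same construction with $$|D|^{-1/2}$$ would instead produce a diagonal matrix of $$\pm 1$$ entries, which is the best one can do over the reals.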

thanks,
PBR

4. Jan 15, 2005

### Hurkyl

Staff Emeritus
Sounds good.

5. Jan 15, 2005

### PBRMEASAP

Thanks for your help

6. Jan 16, 2005

### mathwonk

extract from my book on linear algebra posted here:

The “spectral theorem” (symmetric matrices are diagonalizable)
Theorem: If A is symmetric, then Rn has a basis of eigenvectors for A.
proof: The real valued function f(x) = Ax.x has a maximum on the unit sphere in Rn, at which point the gradient vector of f is zero on the tangent space to the sphere, i.e. is perpendicular to the tangent space at that point. But the tangent space at x is the subspace of vectors perpendicular to x, and the gradient of f at x is the vector 2Ax. Hence Ax is also perpendicular to the tangent space at x, i.e. either Ax is parallel to x or Ax = 0, i.e. x is an eigenvector for A. That gives one eigenvector for A.

Now restrict A to the tangent space (through the origin) to the sphere at x. I.e. let v be a tangent vector, so that v.x = 0. Then Av.x = v.Ax = v.cx for some c, so this is also zero, and hence A preserves this tangent space. Now A still has the property Av.w = v.Aw on this subspace, so the restriction of A has an eigenvector. Since we are in finite dimensions, by repeating at most n times, A has a basis of eigenvectors. (Note that although the subspace has no natural representation as Rn-1, the argument above for producing an eigenvector was coordinate-free, and depended only on the property that Av.w = v.Aw, which is still true on the subspace.) QED.

Corollary (of proof): There is actually a basis of mutually perpendicular eigenvectors for a symmetric n by n matrix.

since a matrix P whose columns are orthonormal vectors has inverse equal to its own transpose, this also diagonalizes the quadratic form, i.e. gives a diagonal matrix under the operation M goes to (P^T)MP.
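A minimal numerical illustration of the corollary (using numpy's `eigh`, which returns orthonormal eigenvectors for a symmetric matrix; the example matrix is arbitrary):

```python
import numpy as np

# Any symmetric matrix (hypothetical example values).
M = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

w, P = np.linalg.eigh(M)  # columns of P are orthonormal eigenvectors

# P is orthogonal: its inverse equals its transpose ...
print(np.allclose(P.T @ P, np.eye(3)))       # True
# ... and P^T M P is diagonal, with the eigenvalues on the diagonal.
print(np.allclose(P.T @ M @ P, np.diag(w)))  # True
```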

7. Jan 17, 2005

### PBRMEASAP

That is a very slick proof, Mathwonk. If I recall, it is similar to the one given in Apostol's Calculus. Who's the author of your book, by the way?

I guess I had forgotten that zero is a legitimate eigenvalue for a matrix. So am I correct in understanding that the spectral theorem applies even to singular symmetric matrices? Even so, I believe the requirement of being non-singular was necessary to make the final jump in answering my original question, since I don't think a (not necessarily orthogonal) transformation $$B$$ with $$B^T D B = 1$$ can be found for a diagonal matrix $$D$$ with some zero diagonal elements.
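That last point can be checked numerically: rank can never increase under the congruence $$D \mapsto B^T D B$$, so a singular $$D$$ can never be carried to the identity. A small sketch (the matrices are hypothetical):

```python
import numpy as np

# A singular diagonal matrix: one zero diagonal entry, rank 2.
D = np.diag([2.0, 5.0, 0.0])

rng = np.random.default_rng(0)
# For every B, rank(B^T D B) <= rank(D) = 2 < 3,
# so B^T D B can never equal the 3x3 identity.
for _ in range(5):
    B = rng.standard_normal((3, 3))
    assert np.linalg.matrix_rank(B.T @ D @ B) <= np.linalg.matrix_rank(D)
```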

Thanks for posting that proof!

8. Feb 12, 2005

### mathwonk

roy smith is the author of the book. it is only 15 pages long and includes proofs of existence of rational canonical form, jordan normal form, and the spectral theorem. it can be downloaded from his webpage at the math dept of the university of georgia. he probably learned that proof from some standard source, as it is well known. it occurs for example in lang's analysis I book.

http://www.math.uga.edu/~roy/

Last edited: Feb 12, 2005