Linear Algebra: orthogonal matrix

  • #1
slamminsammya

Homework Statement


Suppose that A is a real n by n matrix which is orthogonal, symmetric, and positive definite. Prove that A is the identity matrix.

Homework Equations


Orthogonality means [itex]A^t=A^{-1}[/itex], symmetry means [itex]A^t=A[/itex], and positive definiteness means [itex]x^tAx>0[/itex] whenever x is a nonzero vector.
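Not part of the proof, but the three defining properties can be sanity-checked numerically. A minimal NumPy sketch (the helper names are my own, not from the thread):

```python
import numpy as np

def is_orthogonal(A, tol=1e-10):
    # A^t = A^{-1} is equivalent to A^t A = I
    return np.allclose(A.T @ A, np.eye(A.shape[0]), atol=tol)

def is_symmetric(A, tol=1e-10):
    # A^t = A
    return np.allclose(A.T, A, atol=tol)

def is_positive_definite(A, tol=1e-10):
    # for a symmetric A: positive definite iff all (real) eigenvalues are > 0
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

I = np.eye(3)
print(is_orthogonal(I), is_symmetric(I), is_positive_definite(I))  # True True True
```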

The Attempt at a Solution


Messing around with inner products, trying to show that the matrix [itex]A-I[/itex] is the zero matrix. Help is appreciated.
 
  • #2
Welcome to PF, slamminsammya! :smile:

Are you aware of the spectral theorem for symmetric matrices?

What can you say about the eigenvalues of the matrix A?
 
  • #3
Let's prove this in steps. Can you first show that 1 is the only eigenvalue of A?

Edit: ILS was first :cry:
 
  • #4
Yes, I have been able to show that 1 is the only eigenvalue, but I just can't seem to get from there to showing that A therefore fixes each vector. The proof that 1 is the only eigenvalue goes like this:

Suppose [itex]Ax=cx[/itex] for a scalar c. Since A is symmetric and orthogonal, [itex]A^2 = A^tA = I[/itex], so [itex]A(Ax)=c^2x=x[/itex], and the only possible eigenvalues are 1 or -1. But -1 cannot be an eigenvalue, since if it were then [itex]\langle x, Ax\rangle = -|x|^2 < 0[/itex] for the corresponding eigenvector x, contradicting positive definiteness.
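A numerical illustration of why positive definiteness is needed (a sketch with NumPy, not from the thread): a Householder reflection is symmetric and orthogonal but has eigenvalue -1, and indeed fails the positivity test.

```python
import numpy as np

# Householder reflection H = I - 2 v v^t / (v^t v):
# symmetric and orthogonal, but it sends v to -v, so it has eigenvalue -1.
v = np.array([1.0, 2.0, 2.0])
H = np.eye(3) - 2.0 * np.outer(v, v) / (v @ v)

print(np.allclose(H, H.T))              # symmetric: True
print(np.allclose(H @ H.T, np.eye(3)))  # orthogonal: True
print(v @ H @ v)                        # <v, Hv> = -|v|^2 < 0, so not positive definite
```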

From here, Wikipedia tells me that, by the spectral theorem for symmetric matrices, A must be similar to the identity matrix (since the eigenvalue 1 has multiplicity n). Does this mean that A must be the identity?
 
  • #5
Yes it does... Thanks
 
  • #6
Out of curiosity, it seems like there should be a way of proving this without invoking the spectral theorem for symmetric matrices. Does anyone know a proof?
 
  • #7
I solved this using the Cayley-Hamilton theorem; every square matrix A satisfies its characteristic polynomial.

You said [itex]A^t = A[/itex], because it's symmetric,
and you've said [itex]A^t = A^{-1}[/itex] for orthogonality.

So combining these results you obtain [itex]A = A^{-1}[/itex],
and therefore [itex]A^2 = I[/itex], where I is the identity matrix.

Your minimal polynomial is therefore [itex]m(x) = x^2 - 1[/itex].
The roots of this are your eigenvalues, which are 1 and -1, but we reject -1 as it contradicts positive definiteness. So our eigenvalue is 1, which according to our minimal polynomial has multiplicity 1.
So this means the Jordan normal form of A is the identity matrix, as the Jordan form is n 1's along the diagonal.

However, this only shows that A is similar to I; if we are working in the same basis, then surely that means A = I...
 
  • #8
Maybe_Memorie said:
I solved this using the Cayley-Hamilton theorem; every square matrix A satisfies it's characteristic polynomial.

Nice! :smile:

For the record, at this moment I would not know how to do it without an advanced theorem.


slamminsammya said:
From here, Wikipedia tells me that, by the spectral theorem for symmetric matrices, A must be similar to the identity matrix (since the eigenvalue 1 has multiplicity n). Does this mean that A must be the identity?

Yes it does... Thanks

Maybe_Memorie said:
However this only shows that A is similar to I, however, if we are working with the same basis then it surely means A = I...


It's still not trivial that similarity to the identity matrix implies identity...
 
  • #9
I like Serena said:
Nice! :smile:

And to think last week I didn't understand it, now it's my first approach to most of these types of questions. :wink:

I like Serena said:
It's still not trivial that similarity to the identity matrix implies identity...

Right so we need to use the fact that it's similar to the identity to show that it's equal.

This is all I can come up with;

A is similar to I, so [itex]A = P^{-1}IP[/itex] where P is the transition matrix.
So then, [itex]A^2 = P^{-1}I^2P = P^{-1}IP = A[/itex].

But [itex]I = A^2 = P^{-1}IP = A[/itex].

Thus, A = I. Seems correct anyway. :smile:
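The identity [itex]P^{-1}IP = I[/itex] can also be checked numerically for an arbitrary invertible P. A minimal NumPy sketch (not from the thread):

```python
import numpy as np

# If A = P^{-1} I P for some invertible P, then A = P^{-1} P = I directly.
rng = np.random.default_rng(0)
P = rng.standard_normal((4, 4))        # a random matrix is invertible almost surely
A = np.linalg.inv(P) @ np.eye(4) @ P   # "A is similar to I"
print(np.allclose(A, np.eye(4)))       # True
```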
 
  • #10
Yep! Looks correct! :smile:

Note:
A is similar to I, so [itex]A = P^{-1}IP[/itex] where P is the transition matrix.
So [itex]A = P^{-1}IP = P^{-1}P = I[/itex].
 
  • #11
Maybe_Memorie said:
I solved this using the Cayley-Hamilton theorem; every square matrix A satisfies its characteristic polynomial.

You said [itex]A^t = A[/itex], because it's symmetric,
and you've said [itex]A^t = A^{-1}[/itex] for orthogonality.

So combining these results you obtain [itex]A = A^{-1}[/itex],
and therefore [itex]A^2 = I[/itex], where I is the identity matrix.

Your minimal polynomial is therefore [itex]m(x) = x^2 - 1[/itex].
I completely agree with your method here, but I don't think that x^2-1 is the characteristic polynomial of A. The characteristic polynomial is the determinant of A-cI, and for any n by n matrix the characteristic polynomial must therefore be of degree n. Forgive me if I am misunderstanding, but I don't quite see how Cayley Hamilton applies in the way you were treating it.

Maybe this is exactly what you were saying, but wouldn't the conclusion be that the characteristic polynomial must be (x-1)^n, and so the only way for A to satisfy this polynomial is if A=I?
 
  • #12
slamminsammya said:
I completely agree with your method here, but I don't think that x^2-1 is the characteristic polynomial of A. The characteristic polynomial is the determinant of A-cI, and for any n by n matrix the characteristic polynomial must therefore be of degree n. Forgive me if I am misunderstanding, but I don't quite see how Cayley Hamilton applies here.

She said that [itex]x^2-1[/itex] is the minimal polynomial (or correctly: that the minimal polynomial divides [itex]x^2-1[/itex]). This is correct.
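The distinction can be illustrated numerically. A NumPy sketch (using `np.poly`, which returns the characteristic-polynomial coefficients of a square matrix; not from the thread): for the 3x3 identity, the characteristic polynomial is [itex](x-1)^3[/itex] of degree n = 3, while the minimal polynomial is just x - 1.

```python
import numpy as np

# np.poly(A) returns the coefficients of A's characteristic polynomial,
# highest degree first.
coeffs = np.poly(np.eye(3))
print(coeffs)  # [ 1. -3.  3. -1.], i.e. x^3 - 3x^2 + 3x - 1 = (x - 1)^3

# The minimal polynomial of I is x - 1: substituting A = I gives the zero matrix.
print(np.allclose(np.eye(3) - np.eye(3), 0))  # True
```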
 

1. What is an orthogonal matrix?

An orthogonal matrix is a square matrix whose rows and columns are orthonormal: they are mutually perpendicular and each has magnitude 1. Such a matrix is sometimes also called an orthonormal matrix.

2. How is an orthogonal matrix different from a regular matrix?

An orthogonal matrix has the special property of being orthogonal, meaning its columns and rows are perpendicular to each other. This is not a property of regular matrices. Additionally, the inverse of an orthogonal matrix is equal to its transpose, making it easier to compute.

3. What is the purpose of using orthogonal matrices in linear algebra?

Orthogonal matrices have many useful properties, such as preserving vector lengths and angles, which make them essential in many areas of mathematics and science, including linear algebra. They are also used in many algorithms and computations, as they make calculations easier and more efficient.
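The length-preserving property mentioned above is easy to see numerically. A minimal NumPy sketch (the rotation matrix is my own example):

```python
import numpy as np

# Orthogonal matrices preserve lengths: |Qx| = |x| for every x.
theta = 1.2
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a rotation, hence orthogonal
x = np.array([3.0, 4.0])
print(np.linalg.norm(x), np.linalg.norm(Q @ x))  # both approximately 5.0
```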

4. How can I tell if a matrix is orthogonal?

To determine if a matrix is orthogonal, you can multiply the matrix by its transpose and see if the result is an identity matrix. If the result is an identity matrix, then the original matrix is orthogonal. You can also check if the matrix's columns and rows are perpendicular unit vectors.
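The multiply-by-the-transpose test described above can be sketched in a few lines of NumPy (the function name is my own):

```python
import numpy as np

def check_orthogonal(Q, tol=1e-10):
    """True if Q is square and Q^t Q equals the identity within tolerance."""
    n, m = Q.shape
    return n == m and np.allclose(Q.T @ Q, np.eye(n), atol=tol)

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(check_orthogonal(R))      # True: rows and columns are orthonormal
print(check_orthogonal(2 * R))  # False: columns are not unit length
```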

5. What are some real-world applications of orthogonal matrices?

Orthogonal matrices have many practical applications, including image and signal processing, data compression, and computer graphics. They are also used in solving systems of linear equations and in least squares regression, which are essential in many scientific fields such as physics, engineering, and economics.
