Show that for a symmetric or normal matrix, det(A) equals the product of its eigenvalues

In summary, the conversation discusses ways to show that for a symmetric or normal matrix A, the determinant equals the product of the eigenvalues without using Jordan blocks. One proposed method uses the characteristic polynomial, via the Cayley-Hamilton discussion, and holds for any square matrix. Another uses the spectral theorem: every symmetric or normal matrix can be diagonalized with its eigenvalues on the diagonal, and since the determinant is invariant under similarity, det(A) equals the product of the eigenvalues.
  • #1
MatthewD
Is there any way to show that for a symmetric or normal matrix A, det(A) = [tex]\prod \lambda_i[/tex] without using Jordan blocks? I want to show this result using maybe unitary equivalence and other similar matrices... any ideas? It's obviously easy with the JCF...
 
  • #2


I don't see what role the JCF plays here... it simply follows from the well-known Cayley-Hamilton Theorem (every square matrix satisfies its own characteristic equation): the constant term of the characteristic polynomial is [itex](-1)^n \det(A)[/itex] and also [itex](-1)^n \prod \lambda_i[/itex], so the result holds for any square matrix.
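The characteristic-polynomial route above can be checked numerically. This is a minimal sketch (the 2x2 test matrix is chosen for illustration, not from the thread): the constant term of det(λI − A) equals [itex](-1)^n[/itex] times both det(A) and the product of the eigenvalues.

```python
import numpy as np

# Sketch of the characteristic-polynomial argument: det(lambda*I - A)
# has constant term (-1)^n * det(A), which also equals
# (-1)^n * (product of eigenvalues), so det(A) = prod(lambda_i).
A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric; eigenvalues 1 and 3
n = A.shape[0]
coeffs = np.poly(A)                 # coefficients of det(lambda*I - A)
prod_eigs = (-1) ** n * coeffs[-1]  # recover the product of eigenvalues
det_A = np.linalg.det(A)
```

For this matrix both quantities come out to 3, as expected from eigenvalues 1 and 3.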
 
  • #3


Do I have to use Cayley-Hamilton? Could I use the fact that A is orthogonally equivalent to a diagonal matrix by definition of symmetric, so for some orthogonal matrix Q and diagonal matrix D:
A = Q^TDQ
then det(A) = det(Q^TDQ) = det(Q^T)det(D)det(Q) = det(D).
D is diagonal => det(D) = product of diagonal entries... but how would I show these are the eigenvalues?
If they're the eigenvalues, then I have my result, since similar matrices have the same eigenvalues...
 
  • #5


Every symmetric or normal matrix A can be diagonalized; that is, there exists an invertible matrix P such that [itex]PAP^{-1}= D[/itex], where D is a diagonal matrix having the eigenvalues of A on its diagonal.

Now [itex]det(PAP^{-1})=[/itex][itex] det(P)det(A)det(P^{-1})= det(A)= det(D)[/itex], and that last is, of course, the product of the eigenvalues.
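The diagonalization argument above can be verified numerically. A sketch under one assumption: the symmetric test matrix is built as B + B^T from a random B (an illustrative construction, not part of the post), then diagonalized with an orthogonal Q.

```python
import numpy as np

# Numerical sketch of the argument: diagonalize a symmetric A with an
# orthogonal Q, then compare det(A) with the product of the eigenvalues.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                       # symmetric by construction (illustrative)

eigvals, Q = np.linalg.eigh(A)    # A = Q diag(eigvals) Q^T, Q orthogonal
det_A = np.linalg.det(A)
prod_eigs = np.prod(eigvals)
```

Since det is invariant under the similarity A = Q diag(λ) Q^T, the two values agree to floating-point precision.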
 

1. What is a symmetric matrix?

A symmetric matrix is a square matrix in which the elements above and below the main diagonal are reflections of each other. In other words, for a matrix A, if A[i,j] = A[j,i] for all i and j, then A is a symmetric matrix.

2. What is a normal matrix?

A normal matrix is a square matrix that commutes with its conjugate transpose. In other words, for a matrix A, if AA* = A*A, where A* is the conjugate transpose of A, then A is a normal matrix.

3. How do you show that a matrix is symmetric?

To show that a matrix is symmetric, you need to verify that A[i,j] = A[j,i] for all i and j. This can be done either by checking each element or, equivalently, by checking that the matrix equals its own transpose, A = A^T.

4. How do you show that a matrix is normal?

To show that a matrix is normal, you need to verify that AA* = A*A. This can be done by computing the product of A with its conjugate transpose A*, computing the product in the opposite order, and checking that the two results are equal.
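A short sketch of that comparison, with a floating-point tolerance; `is_normal` is a hypothetical helper name, and the rotation/shear examples are illustrative choices.

```python
import numpy as np

# A matrix is normal when it commutes with its conjugate transpose:
# A A* == A* A (compared with a tolerance for floating point).
def is_normal(A, tol=1e-12):
    A = np.asarray(A, dtype=complex)
    return np.allclose(A @ A.conj().T, A.conj().T @ A, atol=tol)

R = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation: normal, not symmetric
T = np.array([[1.0, 1.0], [0.0, 1.0]])   # shear: not normal
```

The rotation matrix shows that normality is strictly weaker than symmetry: R is orthogonal (hence normal) yet R ≠ R^T.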

5. What is the significance of a symmetric or normal matrix in linear algebra?

Symmetric and normal matrices have important properties that make them useful in many applications of linear algebra. For example, real symmetric matrices have real eigenvalues and can be orthogonally diagonalized, while normal matrices can be diagonalized by unitary matrices. These properties make them valuable in solving systems of linear equations, analyzing transformations, and other operations.
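The two properties mentioned above, real eigenvalues and an orthogonal diagonalization, can be illustrated with NumPy's symmetric eigensolver (the 2x2 test matrix is an illustrative choice):

```python
import numpy as np

# np.linalg.eigh returns real eigenvalues (in ascending order) and an
# orthogonal eigenvector matrix for a real symmetric input.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(A)   # eigenvalues of A are 1 and 3
```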
