Spectral theorem for Hermitian matrices -- special cases

Thread starter: nomadreid (Gold Member)
Tags: Diagonalization
TL;DR
I am not sure whether, for a finite-dimensional Hermitian matrix M, a spectral decomposition ##M = PDP^{-1}## includes that (a) P can be required to be unitary, and (b) D can be required to be a classic diagonal matrix, i.e. with every entry above and below the main diagonal equal to zero.
I have a proof in front of me showing that for a normal matrix M a spectral decomposition exists,
$$M = PDP^{-1},$$
where P is an invertible matrix and D is a matrix that can be written as a sum, over the dimension of the matrix, of the eigenvalues times the outer products of the corresponding vectors ##v_i## of an orthonormal basis, i.e.,
$$D = \sum_i \lambda_i \, |v_i\rangle\langle v_i|.$$

What the theorem does not say is whether, for Hermitian matrices M (which are normal), one can require P to be unitary. I seem to recall reading that one can, but I cannot find where I read it, and I do not know how to prove it myself.
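For what it's worth, here is a quick numerical sanity check (an illustration, not a proof) using NumPy's `eigh`, which is specialized for Hermitian matrices: the eigenvector columns it returns are orthonormal, so they form a unitary P with ##M = PDP^\dagger## and ##P^{-1} = P^\dagger##. The random 4x4 matrix below is my own illustrative choice.

```python
import numpy as np

# Build a random Hermitian matrix M (illustrative choice, not from the thread).
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
M = (A + A.conj().T) / 2           # Hermitian by construction

# eigh is specialized for Hermitian matrices: the eigenvalues come out real
# and the eigenvector columns are orthonormal, so P is unitary.
eigvals, P = np.linalg.eigh(M)
D = np.diag(eigvals)               # a classic diagonal matrix

print(np.allclose(P.conj().T @ P, np.eye(4)))   # True: P is unitary
print(np.allclose(M, P @ D @ P.conj().T))       # True: M = P D P^dagger
```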

Also, the classic "diagonal matrix" is one with all off-diagonal entries zero, but in this theorem "diagonal" just means having the form above (the summation), which need not have all off-diagonal entries zero. So I do not know whether the theorem would still be valid if one required D to be a classic diagonal matrix, as I do not know how to get from the outer-product sum above to a classic diagonal matrix.
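If a concrete picture helps, here is a small sketch (my own 2D example, assuming the ##v_i## are orthonormal): the outer-product sum generally has nonzero off-diagonal entries in the standard basis, but conjugating by the matrix P whose columns are the ##v_i## turns it into the classic diagonal matrix ##\mathrm{diag}(\lambda_1, \lambda_2)##.

```python
import numpy as np

# An orthonormal basis v1, v2 rotated away from the standard basis
# (illustrative choice), plus two eigenvalues.
theta = np.pi / 6
v1 = np.array([np.cos(theta), np.sin(theta)])
v2 = np.array([-np.sin(theta), np.cos(theta)])
lam = np.array([2.0, -1.0])

# The outer-product sum  sum_i lambda_i |v_i><v_i|  in the standard basis:
S = lam[0] * np.outer(v1, v1) + lam[1] * np.outer(v2, v2)
print(S)                 # nonzero off-diagonal entries

# Change of basis: the columns of P are the v_i; since they are orthonormal,
# P is orthogonal and P^T S P is the classic diagonal matrix diag(2, -1).
P = np.column_stack([v1, v2])
print(P.T @ S @ P)       # approximately diag(2, -1)
```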

[On less solid ground: my intuition (always a bad guide) suggests that one could rotate the orthonormal basis to the standard basis (1,0,0,...), (0,1,0,...), (0,0,1,...), ..., but that seems too reliant on a spatial intuition of perpendicularity. Nagging at me is also the idea that there are Hermitian operators that cannot agree on a common eigenbasis, so it seems unlikely that they could all be reduced to the same basis, even allowing conjugation ##M \mapsto PMP^{-1}##. My intuition on this point is fishing around without any rigor, so anything shooting down this lead balloon is also welcome.]
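On that last worry, a standard example (my addition, not from the proof I have) is the pair of Pauli matrices ##\sigma_x## and ##\sigma_z##: each is Hermitian and each can be unitarily diagonalized on its own, but they do not commute, so no single basis diagonalizes both. The theorem only promises a separate P for each matrix, not one basis for all of them.

```python
import numpy as np

# Two Hermitian matrices that do not commute.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
print(np.allclose(sigma_x @ sigma_z, sigma_z @ sigma_x))     # False

# Each one is unitarily diagonalizable on its own...
wx, Px = np.linalg.eigh(sigma_x)
wz, Pz = np.linalg.eigh(sigma_z)
print(np.allclose(sigma_x, Px @ np.diag(wx) @ Px.conj().T))  # True
print(np.allclose(sigma_z, Pz @ np.diag(wz) @ Pz.conj().T))  # True

# ...but the diagonalizing unitaries are genuinely different bases:
# no single P works for both matrices.
print(np.allclose(np.abs(Px), np.abs(Pz)))                   # False
```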

Thanks for any help.
 
Excellent. Thank you, PeroK.
 