
Linear Algebra: Question about Inverse of Diagonal Matrices

  • Thread starter Master1022
  • #1

Homework Statement


Not for homework, but just for understanding.

So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.

Applying that knowledge to a diagonal matrix (of eigenvalues): its column vectors are all mutually orthogonal, and thus you would assume that its inverse is its transpose...

However, that is wrong: the inverse is actually just the diagonal matrix with each non-zero entry replaced by its reciprocal. I understand the proof, but fail to see why the above logic wouldn't apply (beyond the fact that the transpose doesn't multiply with the original to give the identity matrix).

Can anyone help me see where I have gone wrong in trying to combine the two theorems?
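The puzzle can be seen numerically. A minimal pure-Python sketch, using a made-up 2×2 diagonal matrix (the values are arbitrary):

```python
# A diagonal matrix has mutually orthogonal columns, yet its transpose
# is not its inverse: D is symmetric, so D^T = D, and D * D^T squares
# the diagonal entries instead of producing the identity.

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [list(row) for row in zip(*A)]

D = [[2.0, 0.0],
     [0.0, 4.0]]

print(matmul(D, transpose(D)))  # [[4.0, 0.0], [0.0, 16.0]] -- not the identity

# The actual inverse reciprocates the non-zero entries:
D_inv = [[0.5, 0.0],
         [0.0, 0.25]]
print(matmul(D, D_inv))         # [[1.0, 0.0], [0.0, 1.0]]
```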
 

Answers and Replies

  • #2
DrClaude
Mentor
Simply because the columns of a matrix are orthogonal doesn't mean that the matrix is orthogonal. The columns also have to be unit vectors.
 
  • #3
Master1022
Simply because the columns of a matrix are orthogonal doesn't mean that the matrix is orthogonal. The columns also have to be unit vectors.
Thank you for your response. So what are the other conditions for orthogonality? Even if you normalised the column vectors, the transpose still wouldn't equal the matrix of reciprocals.

Never mind, I realised that if you normalise the vectors, the matrix becomes the identity matrix.
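In code, that realisation looks like this (a pure-Python sketch; the diagonal matrix is made up):

```python
# Normalising each column of a positive diagonal matrix divides the
# column by its single non-zero entry, which yields the identity.
D = [[2.0, 0.0],
     [0.0, 4.0]]

# Euclidean norm of each column.
norms = [(D[0][j] ** 2 + D[1][j] ** 2) ** 0.5 for j in range(2)]
D_normalised = [[D[i][j] / norms[j] for j in range(2)] for i in range(2)]
print(D_normalised)  # [[1.0, 0.0], [0.0, 1.0]]
```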
 
Last edited:
  • #4
StoneTemplePython
Science Advisor
Gold Member
2019 Award

Homework Statement


Not for homework, but just for understanding.

So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.

Applying that knowledge to a diagonal matrix (of eigenvalues): its column vectors are all mutually orthogonal [should say orthonormal], and thus you would assume that its inverse is its transpose...

However, that is wrong: the inverse is actually just the diagonal matrix with each non-zero entry replaced by its reciprocal. I understand the proof, but fail to see why the above logic wouldn't apply (beyond the fact that the transpose doesn't multiply with the original to give the identity matrix).

Can anyone help me see where I have gone wrong in trying to combine the two theorems?
From what I can tell, the issue is that you haven't carefully written out the math of what you think you're saying -- you're just writing sentences, and in the process confusing the underlying matrix/operator with a similarity transform.

##A = UDU^T##
##A^{-1} = \big(UDU^T\big)^{-1} = \big(U^T\big)^{-1}D^{-1}U^{-1} = UD^{-1} U^T##
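The identity above can be checked numerically. In this sketch ##U## is a plane rotation (so genuinely orthogonal) and ##D## is an invented diagonal matrix; the check is that ##A## times ##UD^{-1}U^T## gives the identity:

```python
import math

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [list(row) for row in zip(*A)]

t = 0.7  # arbitrary rotation angle; U is orthogonal with orthonormal columns
U = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]
D = [[2.0, 0.0], [0.0, 5.0]]
D_inv = [[0.5, 0.0], [0.0, 0.2]]

A = matmul(matmul(U, D), transpose(U))          # A = U D U^T
A_inv = matmul(matmul(U, D_inv), transpose(U))  # claimed inverse U D^{-1} U^T

# A times its claimed inverse is the identity, up to floating-point error.
print([[round(x, 12) for x in row] for row in matmul(A, A_inv)])
```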
 
  • #5
Master1022
From what I can tell, the issue is that you haven't carefully written out the math of what you think you're saying -- you're just writing sentences, and in the process confusing the underlying matrix/operator with a similarity transform.

##A = UDU^T##
##A^{-1} = \big(UDU^T\big)^{-1} = \big(U^T\big)^{-1}D^{-1}U^{-1} = UD^{-1} U^T##
Thanks for the response. I understand the maths that you have written, but perhaps the area I have gone wrong is this orthogonal / orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?
 
  • #6
PeroK
Science Advisor
Homework Helper
Insights Author
Gold Member
Thanks for the response. I understand the maths that you have written, but perhaps the area I have gone wrong is this orthogonal / orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?
I thought you had proved this for yourself.
 
  • #7
StoneTemplePython
Science Advisor
Gold Member
2019 Award
Thanks for the response. I understand the maths that you have written, but perhaps the area I have gone wrong is this orthogonal / orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?
so, quoting this:

So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.
you have it that ##M## (##U## in my write-up) is orthogonal. Now suppose only that it is square and its column vectors are mutually orthogonal, not necessarily of unit length.

Then ##U^T U = D## where ##D## is some diagonal matrix... but based on your quote, you want ##D = I##, so what does that tell you about ##\big \Vert \mathbf u_k \big \Vert_2^2##, the entry in the ##k##th diagonal spot of ##D##?
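A concrete instance of this, with made-up vectors: take ##\mathbf u_1 = (1, 2)## and ##\mathbf u_2 = (-4, 2)##, which are orthogonal but not unit length. A pure-Python check:

```python
# Columns u1 = (1, 2) and u2 = (-4, 2) are orthogonal (dot product 0)
# but not unit vectors.
U = [[1.0, -4.0],
     [2.0,  2.0]]

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [list(row) for row in zip(*A)]

# U^T U is diagonal, but its entries are ||u_1||^2 = 5 and ||u_2||^2 = 20,
# not 1 -- so U^T is not the inverse of U.
print(matmul(transpose(U), U))  # [[5.0, 0.0], [0.0, 20.0]]
```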
 
