# Linear Algebra: Question about Inverse of Diagonal Matrices

Master1022

## Homework Statement

Not for homework, but just for understanding.

So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.

Using that knowledge for a diagonal matrix (of eigenvalues): its column vectors are all mutually orthogonal, and thus you would assume that its inverse is its transpose...

However, that is wrong: the inverse is actually just the diagonal matrix with its non-zero entries reciprocated. I understand the proof, but fail to see why the above logic wouldn't apply (beyond the fact that the product wouldn't be the identity matrix).

Can anyone help me see where I have gone wrong in trying to combine the two theorems?

Mentor
Simply because the columns of a matrix are orthogonal doesn't mean that the matrix is orthogonal. The columns also have to be unit vectors.
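To make that concrete, here is a minimal NumPy sketch (the diagonal entries are assumed example values, not from the thread): the columns of diag(2, 3) are orthogonal to each other, yet ##D^TD \neq I##, so the transpose cannot be the inverse.

```python
import numpy as np

# The columns of this diagonal matrix are mutually orthogonal,
# but they are not unit vectors, so the matrix is not orthogonal.
D = np.diag([2.0, 3.0])

print(D.T @ D)           # [[4, 0], [0, 9]] -- not the identity
print(np.linalg.inv(D))  # [[0.5, 0], [0, 0.333...]] -- reciprocals, not the transpose
```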

Master1022
> Simply because the columns of a matrix are orthogonal doesn't mean that the matrix is orthogonal. The columns also have to be unit vectors.
Thank you for your response. So what are the other conditions for orthogonality? Even if you normalised the column vectors, the transpose still wouldn't equal the matrix of reciprocals.

Never mind, I realised that if you normalise the columns, the diagonal matrix becomes the identity matrix.
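A quick NumPy check (same assumed example values as above) confirms this; note that a negative diagonal entry would normalise to ##-1## rather than ##+1##.

```python
import numpy as np

D = np.diag([2.0, 3.0])

# Divide each column by its length; for a diagonal matrix with
# positive entries this collapses D to the identity.
# (A negative diagonal entry would normalise to -1 instead.)
Q = D / np.linalg.norm(D, axis=0)
print(Q)  # [[1, 0], [0, 1]]
```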

Gold Member

> ## Homework Statement
>
> Not for homework, but just for understanding.
>
> So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.
>
> Using that knowledge for a diagonal matrix (of eigenvalues): its column vectors are all mutually orthogonal [should say orthonormal], and thus you would assume that its inverse is its transpose...
>
> However, that is wrong: the inverse is actually just the diagonal matrix with its non-zero entries reciprocated. I understand the proof, but fail to see why the above logic wouldn't apply (beyond the fact that the product wouldn't be the identity matrix).
>
> Can anyone help me see where I have gone wrong in trying to combine the two theorems?

From what I can tell, the issue is that you haven't carefully written out the math of what you think you're saying -- you're just writing sentences, and in the process confusing the underlying matrix/operator with a similarity transform.

##A = UDU^T##
##A^{-1} = \big(UDU^T\big)^{-1} = \big(U^T\big)^{-1}D^{-1}U^{-1} = UD^{-1} U^T##
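To see those two roles numerically, here is a small NumPy sketch (the symmetric matrix is an assumed example): ##U## is the orthogonal factor, while ##D## is merely diagonal, and it is only ##D## whose entries get reciprocated.

```python
import numpy as np

# An assumed symmetric matrix, so np.linalg.eigh returns an
# orthogonal U (orthonormal eigenvector columns) with A = U D U^T.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, U = np.linalg.eigh(A)

D_inv = np.diag(1.0 / w)       # reciprocate the eigenvalues
A_inv = U @ D_inv @ U.T        # inverse via the similarity transform

print(np.allclose(U.T @ U, np.eye(2)))       # True: U is orthogonal
print(np.allclose(A_inv, np.linalg.inv(A)))  # True: the identity above holds
```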

Master1022
> From what I can tell, the issue is that you haven't carefully written out the math of what you think you're saying -- you're just writing sentences, and in the process confusing the underlying matrix/operator with a similarity transform.
>
> ##A = UDU^T##
> ##A^{-1} = \big(UDU^T\big)^{-1} = \big(U^T\big)^{-1}D^{-1}U^{-1} = UD^{-1} U^T##
Thanks for the response. I understand the maths you have written, but perhaps where I have gone wrong is the orthogonal/orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?

Homework Helper
Gold Member
2021 Award
> Thanks for the response. I understand the maths you have written, but perhaps where I have gone wrong is the orthogonal/orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?

I thought you had proved this for yourself.
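To round out the point: an orthogonal matrix does need orthonormal columns, since the definition ##Q^TQ = I## forces each column to have unit length. A minimal NumPy sketch (the rotation angle is an assumed example) illustrates the definition:

```python
import numpy as np

theta = 0.7  # assumed example angle
# A rotation matrix: its columns are orthogonal AND unit length,
# which is exactly what "orthogonal matrix" requires.
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))     # True: Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))  # True: the transpose is the inverse
```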