Linear Algebra: Question about Inverse of Diagonal Matrices

In summary, we discussed how, if a matrix is orthogonal, its transpose is its inverse. This does not apply to a diagonal matrix of eigenvalues, however: its inverse is not its transpose but the diagonal matrix with its non-zero entries reciprocated. The reason is that, for a matrix to be orthogonal, its columns must be not only mutually orthogonal but also unit vectors. We also cleared up the confusion between orthogonal and orthonormal columns: despite the name, an orthogonal matrix must have orthonormal (normalised) column vectors, not merely orthogonal ones.
  • #1
Master1022

Homework Statement


Not for homework, but just for understanding.

So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.

Using that knowledge for a diagonalised matrix (the diagonal matrix of eigenvalues), its column vectors are all mutually orthogonal, and thus you would assume that its inverse is its transpose...

However, that is wrong: the inverse is actually just the diagonalised matrix with its non-zero entries reciprocated. I understand the proof, but fail to see why the above logic wouldn't apply (beyond the fact that the product wouldn't be the identity matrix).

Can anyone help me see where I have gone wrong in trying to combine the two theorems?
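To make the puzzle concrete, here is a minimal numerical sketch of the two facts being combined (using numpy; the matrix values are just an illustration):

```python
import numpy as np

# A diagonal matrix of (hypothetical) eigenvalues.
D = np.diag([2.0, 3.0])

# A diagonal matrix is symmetric, so its transpose is itself...
print(np.array_equal(D.T, D))                               # True

# ...but its inverse is the diagonal of reciprocals, not the transpose.
print(np.allclose(np.linalg.inv(D), np.diag([0.5, 1/3])))   # True
print(np.allclose(np.linalg.inv(D), D.T))                   # False
```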
 
  • #2
Simply because the columns of a matrix are orthogonal doesn't mean that the matrix is orthogonal. The columns also have to be unit vectors.
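A quick numerical illustration of that point (a sketch with made-up numbers): the columns of ##M## below are mutually orthogonal but not unit vectors, so ##M^T M## is diagonal rather than the identity and ##M^T \ne M^{-1}##.

```python
import numpy as np

# Columns (1, 1) and (2, -2) are orthogonal, with lengths sqrt(2) and 2*sqrt(2).
M = np.array([[1.0,  2.0],
              [1.0, -2.0]])

# M^T M is diagonal (the squared column norms), not the identity,
# so M is not an orthogonal matrix and M^T is not its inverse.
print(M.T @ M)                          # [[2. 0.], [0. 8.]]
print(np.allclose(M.T @ M, np.eye(2)))  # False
```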
 
  • #3
DrClaude said:
Simply because the columns of a matrix are orthogonal doesn't mean that the matrix is orthogonal. The columns also have to be unit vectors.
Thank you for your response. So what are the other conditions for orthogonality? Even if you normalised the column vectors, the transpose still wouldn't equal the matrix of reciprocals.

Never mind, I realized that if you normalise the columns, the matrix becomes the identity matrix.
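(A minimal check of that realization, assuming positive diagonal entries; a negative entry would normalise to ##-1## rather than ##1##.)

```python
import numpy as np

D = np.diag([2.0, 3.0, 0.5])

# Divide each column by its Euclidean norm; for a positive diagonal matrix,
# the k-th column norm is just the k-th diagonal entry.
D_normalised = D / np.linalg.norm(D, axis=0)

print(np.allclose(D_normalised, np.eye(3)))  # True
```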
 
  • #4
Master1022 said:

Homework Statement


Not for homework, but just for understanding.

So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.

Using that knowledge for a diagonalised matrix (the diagonal matrix of eigenvalues), its column vectors are all mutually orthogonal [should say orthonormal], and thus you would assume that its inverse is its transpose...

However, that is wrong: the inverse is actually just the diagonalised matrix with its non-zero entries reciprocated. I understand the proof, but fail to see why the above logic wouldn't apply (beyond the fact that the product wouldn't be the identity matrix).

Can anyone help me see where I have gone wrong in trying to combine the two theorems?

From what I can tell, the issue is that you haven't carefully written out the math of what you think you're saying -- you're just writing sentences, and in the process confusing the underlying matrix/operator with a similarity transform.

##A = UDU^T##
##A^{-1} = \big(UDU^T\big)^{-1} = \big(U^T\big)^{-1}D^{-1}U^{-1} = UD^{-1} U^T##
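A numerical check of this identity (a sketch using an arbitrary symmetric matrix, so that np.linalg.eigh returns a ##U## with orthonormal columns):

```python
import numpy as np

# An arbitrary symmetric matrix, so that A = U D U^T with U orthogonal.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

eigvals, U = np.linalg.eigh(A)            # A == U @ np.diag(eigvals) @ U.T
A_inv = U @ np.diag(1.0 / eigvals) @ U.T  # invert by reciprocating eigenvalues

print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```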
 
  • #5
StoneTemplePython said:
From what I can tell, the issue is that you haven't carefully written out the math of what you think you're saying -- you're just writing sentences, and in the process confusing the underlying matrix/operator with a similarity transform.

##A = UDU^T##
##A^{-1} = \big(UDU^T\big)^{-1} = \big(U^T\big)^{-1}D^{-1}U^{-1} = UD^{-1} U^T##
Thanks for the response. I understand the maths that you have written, but perhaps where I have gone wrong is this orthogonal/orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?
 
  • #6
Master1022 said:
Thanks for the response. I understand the maths that you have written, but perhaps where I have gone wrong is this orthogonal/orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?

I thought you had proved this for yourself.
 
  • #7
Master1022 said:
Thanks for the response. I understand the maths that you have written, but perhaps where I have gone wrong is this orthogonal/orthonormal confusion. Does an orthogonal matrix need to have normalised column vectors?

so, quoting this:

Master1022 said:
So we know that if a matrix (M) is orthogonal, then its transpose is its inverse.

you have ##M##, or in my writeup ##U##, which you claim is orthogonal. Suppose it is square and its column vectors are mutually orthogonal (but not necessarily unit length).

Then ##U^T U = D## where ##D## is some diagonal matrix... but based on your quote, you want ##D = I##, so what does that tell you about ##\big\Vert \mathbf u_k \big\Vert_2^2##, the entry in the ##k##th diagonal spot of ##D##?
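(A sketch of that computation, reusing the made-up matrix from post #2: the diagonal of ##U^T U## holds the squared column norms, which equal 1 only after normalising.)

```python
import numpy as np

# Orthogonal (but not unit-length) columns, as in the earlier example.
U = np.array([[1.0,  2.0],
              [1.0, -2.0]])

print(U.T @ U)  # diag(2, 8): the squared column norms ||u_k||^2

# Normalising each column forces every squared norm to 1, so U^T U = I.
U_hat = U / np.linalg.norm(U, axis=0)
print(np.allclose(U_hat.T @ U_hat, np.eye(2)))  # True
```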
 

FAQ: Linear Algebra: Question about Inverse of Diagonal Matrices

1. What is a diagonal matrix?

A diagonal matrix is a special type of square matrix where all the non-diagonal elements are equal to zero. The diagonal elements can be any real numbers or complex numbers.

2. How do you find the inverse of a diagonal matrix?

To find the inverse of a diagonal matrix, you simply need to take the reciprocal of each diagonal element and place them in a diagonal matrix of the same size.
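For instance (a minimal numpy sketch with made-up entries):

```python
import numpy as np

D = np.diag([2.0, -5.0, 0.5])

# Reciprocate each diagonal entry and rebuild a diagonal matrix of the same size.
D_inv = np.diag(1.0 / np.diag(D))

print(np.allclose(D @ D_inv, np.eye(3)))  # True
```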

3. What is the significance of the inverse of a diagonal matrix?

The inverse of a diagonal matrix is important because it allows us to solve systems of linear equations involving diagonal matrices. It also has applications in fields such as engineering, physics, and economics.

4. Can a diagonal matrix have more than one inverse?

No, a diagonal matrix can only have one inverse. This is because the inverse of a matrix is unique if it exists.

5. How do you determine if a diagonal matrix is invertible?

A diagonal matrix is invertible if and only if none of its diagonal elements are equal to zero. If any of the diagonal elements are equal to zero, then the matrix is not invertible.
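In code, the check is a one-liner (`diagonal_is_invertible` is a hypothetical helper name):

```python
import numpy as np

def diagonal_is_invertible(D: np.ndarray) -> bool:
    """A diagonal matrix is invertible iff every diagonal entry is nonzero."""
    return bool(np.all(np.diag(D) != 0))

print(diagonal_is_invertible(np.diag([1.0, 2.0, 3.0])))  # True
print(diagonal_is_invertible(np.diag([1.0, 0.0, 3.0])))  # False
```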
