How can I prove the invertibility of a diagonalizable matrix?

randommacuser
In summary, to show that an n×n matrix A which is invertible (equivalently, has rank n) has columns spanning Rn, one can row-reduce the matrix and check that the rank equals n. For an invertible matrix A, the determinant must be nonzero. This does not extend to every diagonalizable matrix, however: a diagonal matrix can have determinant zero if any of its eigenvalues are zero, so diagonalizability does not imply invertibility. The link between the two is that a diagonalizable matrix is invertible if and only if all of its eigenvalues are nonzero, and this can be generalized further.
randommacuser
How do I prove that if an n×n matrix A is diagonalizable (is invertible, has rank n, etc.), its columns span Rn?

To prove that the columns span Rn, you just need to find a basis of the column space: row-reduce the matrix and check whether the rank = n, i.e. there have to be n linearly independent columns, which is very easy to see once the matrix is reduced. The basis then consists of the corresponding columns of the original matrix, not the columns of the reduced form.
Also, for an invertible matrix A, det(A) != 0 (not equal to zero).
Don't know how much that helps.
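The rank criterion above is easy to check numerically. A minimal sketch using NumPy (the 3×3 matrix is a made-up example):

```python
import numpy as np

# Hypothetical 3x3 example matrix.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

n = A.shape[0]
rank = np.linalg.matrix_rank(A)

# rank == n  <=>  the n columns are linearly independent
#            <=>  they form a basis of R^n, i.e. they span R^n.
print(rank == n)
```

If `rank < n`, the columns are dependent and do not span Rn.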

If A is a matrix with rank n, its column vectors are independent and form a basis, so every vector can be written as a lin. comb. of them. In other words, these vectors span the entire space.

Equivalently, recall that Ax (x some vector in Rn) is a linear combination of the column vectors of A, so Ax=b has a solution if and only if b is in the span of the column vectors of A.
If A is invertible, the equation $Ax=b$ has a solution for every vector b (namely $x=A^{-1}b$) so the column vectors span Rn.
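The same argument can be checked numerically: for an invertible A, solving Ax = b succeeds for any b, so b lies in the span of the columns. A minimal sketch (the matrix and right-hand side are made up):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # invertible: det(A) = 1
b = np.array([3.0, 2.0])

x = np.linalg.solve(A, b)    # the solution x = A^{-1} b

# A @ x reproduces b, i.e. b is a linear combination of A's columns.
print(np.allclose(A @ x, b))
```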

A matrix doesn't have to be invertible to be diagonalizable...
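This point has a one-line example: diag(1, 0) is already diagonal (hence trivially diagonalizable) but singular. A minimal check:

```python
import numpy as np

# Already diagonal, hence trivially diagonalizable (take Q = I),
# yet singular: one eigenvalue is 0.
D = np.array([[1.0, 0.0],
              [0.0, 0.0]])

print(np.linalg.det(D))            # determinant 0 -> not invertible
print(np.linalg.matrix_rank(D))    # rank 1, less than n = 2
```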

Hurkyl said:
A matrix doesn't have to be invertible to be diagonalizable...
Ohh, yeah
Thanks for pointing that out! At first I was confused by your remark. I had to prove it to myself.
But here's the thing that now bothers me:
given:
A is a matrix, Q the matrix of its eigenvectors, Q' the inverse of Q, and D the diagonal matrix with the eigenvalues of A on its diagonal.
So A is diagonalizable iff A = QDQ'.

Then let's say I take determinant of both sides like so:
det(A) = det(QDQ')
Then det(A) = det(D), since det(Q)det(Q') = det(QQ') = 1.
But if, as you pointed out, A is not necessarily invertible, then det(A) can be 0, while det(D) cannot be 0!
What am I missing here? They are not supposed to contradict each other.

Why can't the determinant of a Diagonal matrix be zero?
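Indeed, nothing stops a diagonal matrix from having determinant zero, and the factorization then forces det(A) = det(D). A small numerical sketch (the choices of Q and D are made-up examples):

```python
import numpy as np

D = np.diag([3.0, 0.0])                 # diagonal, det = 3 * 0 = 0
Q = np.array([[1.0, 1.0],
              [0.0, 1.0]])              # any invertible Q works
A = Q @ D @ np.linalg.inv(Q)            # diagonalizable by construction

print(np.linalg.det(D))                 # 0
print(np.isclose(np.linalg.det(A), np.linalg.det(D)))  # True
```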

matt grime said:
Why can't the determinant of a Diagonal matrix be zero?
this
although I see your point, because an eigenvalue can be 0 while an eigenvector cannot. I am still confused, though, because it is defined exactly the same way in my textbook as well, i.e. Q has to be invertible.

Is the 0 matrix diagonalizable? Invertible? Once you've answered these, can you guess at the link (and even better, you can try to prove your conjecture!) between diagonalizability and invertibility?
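A quick check of the two hint questions: the zero matrix is already diagonal, so it is trivially diagonalizable, yet it is certainly not invertible.

```python
import numpy as np

Z = np.zeros((2, 2))     # the 2x2 zero matrix

# Diagonalizable: Z is already diagonal (Z = I @ Z @ I).
# Invertible: no -- its determinant is 0 and its rank is 0.
print(np.linalg.det(Z))          # 0
print(np.linalg.matrix_rank(Z))  # 0
```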

EvLer said:
this
although I see your point, because an eigenvalue can be 0 while an eigenvector cannot. I am still confused, though, because it is defined exactly the same way in my textbook as well, i.e. Q has to be invertible.

So what, why is this confusing, what's that link got to do with what I wrote? At no point does it state either A or D are invertible, indeed A is invertible if and only if D is.

matt grime said:
So what, why is this confusing, what's that link got to do with what I wrote? At no point does it state either A or D are invertible, indeed A is invertible if and only if D is.
Was working on conjecture as Data suggested. Came to same conclusion (yooo-hooo, it worked!)
Thanks for confirming it

P.S. Oh, answering your question: I think I read it wrong, I thought you said that Q does not have to be invertible. Anyway, sorry for the confusion and thanks much again.

Or, equivalently, you can say that a diagonalizable matrix is invertible iff all its eigenvalues are nonzero. This can, in fact, be generalized further.
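This criterion is easy to test numerically. The helper below is an illustrative sketch (the function name and the floating-point tolerance are my own choices, not from the thread):

```python
import numpy as np

def invertible_via_eigenvalues(A, tol=1e-12):
    """A diagonalizable matrix is invertible iff no eigenvalue is (numerically) zero."""
    return bool(np.all(np.abs(np.linalg.eigvals(A)) > tol))

print(invertible_via_eigenvalues(np.diag([2.0, 5.0])))   # True
print(invertible_via_eigenvalues(np.diag([2.0, 0.0])))   # False
```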


1. What is an invertible matrix?

An invertible matrix is a square matrix that has a unique inverse: a matrix that, when multiplied with the original, gives the identity matrix. The inverse of A is denoted $A^{-1}$.

2. How do I prove that a matrix is invertible?

To prove that a matrix is invertible, you can use various methods such as the determinant method, the row reduction method, or the adjugate method. These methods involve manipulating the matrix to show that it has an inverse.
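As an illustration of the determinant method, here is a minimal sketch (the 2×2 matrix is a made-up example):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# Determinant method: A is invertible iff det(A) != 0.
d = np.linalg.det(A)           # 4*6 - 7*2 = 10
print(abs(d) > 1e-12)          # True -> an inverse exists

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # multiplies to the identity
```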

3. Can every square matrix be inverted?

No, not every square matrix can be inverted. A square matrix can only be inverted if its determinant is non-zero. If the determinant is zero, the matrix is said to be singular and does not have an inverse.

4. Why is it important to have invertible matrices?

Invertible matrices are important in various mathematical applications, such as solving systems of equations, finding the inverse of a linear transformation, and diagonalizing matrices. They also have practical applications in fields such as engineering, computer science, and economics.

5. Is the inverse of an invertible matrix unique?

Yes, the inverse of an invertible matrix is unique. This means that there is only one possible inverse matrix for a given invertible matrix. It is also important to note that the inverse of a matrix is not the same as its transpose.
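Both claims are easy to verify on a small example (the matrix is made up): computing the inverse two different ways gives the same result, while the inverse generally differs from the transpose.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

A_inv = np.linalg.inv(A)

# Any matrix B with A @ B = I must equal A_inv (uniqueness):
B = np.linalg.solve(A, np.eye(2))   # another way to compute the inverse
print(np.allclose(A_inv, B))        # True

# The inverse is generally NOT the transpose:
print(np.allclose(A_inv, A.T))      # False here
```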
