Eigenvalue Theorem: Proof of det(A - λI_n) = 0

  • Context: Graduate
  • Thread starter: jeff1evesque
  • Tags: Eigenvalue Theorem
SUMMARY

The Eigenvalue Theorem states that a scalar λ is an eigenvalue of a matrix A if and only if det(A - λI_n) = 0. The proof relies on the relationship between the invertibility of the matrix A - λI_n and its determinant. Specifically, a matrix is non-invertible (singular) if its determinant is zero, which occurs when its columns are linearly dependent, leading to a loss of dimensionality in the transformation represented by the matrix. The determinant can also be interpreted as the product of the eigenvalues of the matrix, reinforcing that a singular matrix must have at least one eigenvalue equal to zero.

PREREQUISITES
  • Understanding of linear algebra concepts such as matrices and determinants.
  • Familiarity with eigenvalues and eigenvectors.
  • Knowledge of linear transformations and their properties.
  • Basic understanding of geometric interpretations of linear transformations.
NEXT STEPS
  • Study the properties of determinants in detail, focusing on their geometric interpretations.
  • Learn about the relationship between eigenvalues and matrix invertibility.
  • Explore the implications of the Rank-Nullity Theorem in linear algebra.
  • Investigate various proofs of the Eigenvalue Theorem for deeper understanding.
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, as well as educators looking to explain the concepts of eigenvalues and determinants effectively.

jeff1evesque
Theorem: Let A be in M_{n×n}(F). Then a scalar λ is an eigenvalue of A if and only if det(A - λI_n) = 0.

Proof: A scalar λ is an eigenvalue of A if and only if there exists a nonzero vector v in F^n such that Av = λv, that is, (A - λI_n)(v) = 0. By Theorem 2.5, this is true if and only if A - λI_n is not invertible (since it is not 1-1 or onto). However, this result is equivalent to the statement that det(A - λI_n) = 0.

Question: Can someone please explain how something not being invertible implies that its determinant is equal to 0? In particular: "By Theorem 2.5, this is true if and only if A - λI_n is not invertible (since it is not 1-1 or onto). However, this result is equivalent to the statement that det(A - λI_n) = 0."

Theorem 2.5: Let T: V → W be a linear transformation, where V and W are vector spaces of equal finite dimension. Then T is 1-1 ⇔ T is onto ⇔ rank(T) = dim(V).
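
For concreteness, here is a quick numerical check of the theorem statement (a sketch using NumPy; the 2x2 matrix below is just an arbitrary example, not from my textbook):

Code:
import numpy as np

# An arbitrary symmetric 2x2 example; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For each eigenvalue lam, det(A - lam*I) should come out as 0
# (up to floating-point rounding).
for lam in np.linalg.eigvals(A):
    print(lam, np.linalg.det(A - lam * np.eye(2)))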

Thanks a lot, JL
 
The key question you're asking doesn't really have anything to do with eigenvalues.

A square matrix is invertible if and only if its determinant is nonzero.

Or, equivalently, a square matrix is singular (i.e., non-invertible) if and only if its determinant is zero.

Why does a singular matrix have zero determinant?

The easiest answer is a geometric argument.

If we start with a unit cube, i.e., the set of all n x 1 vectors whose elements are all in the range [0,1], then the matrix maps that cube to a parallelepiped. The volume of that parallelepiped is precisely the absolute value of the determinant of the matrix.

So what happens if the matrix is singular? That means that its columns are not linearly independent, which means that its rank is less than n. The rank is the same as the dimension of the column space, which is the same as the dimension of the parallelepiped that is the image of the unit cube.

Thus for a singular matrix, the parallelepiped is of smaller dimension than the unit cube, meaning one or more of its dimensions got "flattened." Thus its (n-dimensional) volume is zero, and thus the determinant of the matrix is zero.
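
Here is a tiny numerical illustration of that flattening (a NumPy sketch; the rank-1 matrix is just an example I made up):

Code:
import numpy as np

# The second column is twice the first, so the matrix has rank 1
# and maps the unit square onto a line segment.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.matrix_rank(A))  # 1: the image is one-dimensional
print(abs(np.linalg.det(A)))     # ~0: the flattened image has zero area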

That geometric picture is not a proof, but I think it's the clearest way to explain what is going on.

To PROVE that a singular matrix has determinant zero, you need to use the properties of the determinant function, specifically these two:

(1) If you multiply a row or column of A by a constant c, then the determinant of the resulting matrix is c*det(A).

(2) If you add a multiple of one row to another row (or a multiple of a column to another column), then the determinant is unchanged.

You can use the fact that a singular matrix does not have linearly independent columns (or rows). Thus one of them can be written as a linear combination of the others. You can then do some manipulations of the type in property (2) which leave the determinant unchanged, such that one of the rows or columns is all zeros. Then by property (1) (with c = 0), the determinant must be 0.
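
To make that argument concrete, here is a small numerical sketch (NumPy; the singular 3x3 matrix is an illustrative choice, with the third column equal to the sum of the first two):

Code:
import numpy as np

# Columns are linearly dependent: col3 = col1 + col2, so A is singular.
A = np.array([[1.0, 2.0,  3.0],
              [4.0, 5.0,  9.0],
              [7.0, 8.0, 15.0]])

# Property (2): adding a multiple of one column to another
# leaves the determinant unchanged.
B = A.copy()
B[:, 2] -= B[:, 0] + B[:, 1]   # the third column is now all zeros

print(np.linalg.det(A))  # ~0, up to floating-point rounding
print(np.linalg.det(B))  # the same value; B visibly has a zero column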
 
A little simpler, I think: det(AB) = det(A)det(B). If A is invertible, then det(A)det(A^{-1}) = det(I) = 1. Neither det(A) nor det(A^{-1}) can be 0, since then that product would be 0, not 1. Conversely, if the determinant is 0, A cannot have an inverse since, whatever A^{-1} might be, we would have det(A)det(A^{-1}) = 0·det(A^{-1}) = 0, not 1.
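
As a quick numerical sanity check (a NumPy sketch; the invertible 2x2 matrix is just an example):

Code:
import numpy as np

# Any invertible matrix works here; det(A) = 2*1 - 1*1 = 1.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

# Since det(AB) = det(A)det(B), we get det(A) * det(A^{-1}) = det(I) = 1.
print(np.linalg.det(A) * np.linalg.det(A_inv))  # ~1.0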
 
HallsofIvy said:
A little simpler, I think: det(AB) = det(A)det(B). [...]

Yes, that's much simpler!

There's another proof that is a bit too circular to be used in this case, but what the heck, this isn't the homework forum and I find this one easy to visualize (once you get a feel for what eigenvalues are).

The proof uses the fact that the determinant of a matrix is the product of its eigenvalues.

A matrix A is singular if and only if Ax = 0 = 0x for some nonzero x, which means precisely that 0 is an eigenvalue of A (and x is a corresponding eigenvector).

Thus a matrix is singular if and only if 0 is one of its eigenvalues, which is true if and only if the PRODUCT of its eigenvalues (which is the determinant) is zero.
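
A quick numerical illustration of that (NumPy again; the singular matrix is a made-up example with dependent columns, the same kind used above):

Code:
import numpy as np

# Singular by construction: the third column is the sum of the first two.
A = np.array([[1.0, 2.0,  3.0],
              [4.0, 5.0,  9.0],
              [7.0, 8.0, 15.0]])

eigvals = np.linalg.eigvals(A)
print(eigvals)            # one eigenvalue is (numerically) zero
print(np.prod(eigvals))   # the product of the eigenvalues...
print(np.linalg.det(A))   # ...matches the determinant, both ~0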
 
jbunniii said:
The proof uses the fact that the determinant of a matrix is the product of its eigenvalues. [...]

Looking at the above briefly, I notice that HallsofIvy has a very short, clear proof. For now I will go with that, but I like the fact that we can also visualize this geometrically.

Thanks again,


JL
 
I believe the easiest way is to say: if A is a singular (non-invertible) matrix, its determinant must be 0. Since the determinant of A is the product of all the eigenvalues of A, A can only be singular if that product is 0, and the only way a product is 0 is if one of the factors is 0, that is, if one of the eigenvalues is 0.
 
