Eigenvalue theorem

  1. May 21, 2009 #1
    Theorem: Let A be in [tex]M_{n \times n}(F)[/tex]. Then a scalar [tex]\lambda[/tex] is an eigenvalue of A if and only if [tex]\det(A - \lambda I_n) = 0[/tex].

    Proof: A scalar lambda is an eigenvalue of A if and only if there exists a nonzero vector v in F^n such that Av = lambda*v, that is, (A - \lambda I_n)(v) = 0. By theorem 2.5, this is true if and only if A - \lambda I_n is not invertible (since it's not 1-1 or onto). However, this result is equivalent to the statement that det(A - \lambda I_n) = 0.
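    As a quick sanity check (just my own rough NumPy sketch with an arbitrary example matrix, not part of the book's argument), det(A - lambda*I_n) does come out numerically zero exactly at the eigenvalues:

[code]
import numpy as np

# Arbitrary 2 x 2 example matrix (eigenvalues 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = A.shape[0]

# det(A - lambda*I) is (numerically) zero at each eigenvalue...
for lam in np.linalg.eigvals(A):
    print(lam, np.linalg.det(A - lam * np.eye(n)))

# ...and nonzero at a scalar that is not an eigenvalue, e.g. 2:
print(2.0, np.linalg.det(A - 2.0 * np.eye(n)))
[/code]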

    Question: Can someone please explain to me how something not being invertible implies that the determinant of such a "thing" is equal to 0?? In particular "By theorem 2.5, this is true if and only if A - \lambda I_n is not invertible (since it's not 1-1 or onto). However, this result is equivalent to the statement that det(A - \lambda I_n) = 0."

    Theorem 2.5: Let T: V --> W be a linear transformation, where V and W are vector spaces of equal finite dimension. Then T is 1-1 <==> T is onto <==> rank(T) = dim(V).

    Thanks a lot,


    JL
     
    Last edited: May 21, 2009
  3. May 21, 2009 #2

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    The key question you're asking doesn't really have anything to do with eigenvalues.

    A square matrix is invertible if and only if its determinant is nonzero.

    Or, equivalently, a square matrix is singular (i.e., non-invertible) if and only if its determinant is zero.

    Why does a singular matrix have zero determinant?

    The easiest answer is a geometric argument.

    If we start with a unit cube, i.e., the set of all n x 1 vectors whose elements are all in the range [0,1], then the matrix maps that cube to a parallelepiped. The volume of that parallelepiped is precisely the absolute value of the determinant of the matrix.

    So what happens if the matrix is singular? That means that its columns are not linearly independent, which means that its rank is less than n. The rank is the same as the dimension of the column space, which is the same as the dimension of the parallelepiped that is the image of the unit cube.

    Thus for a singular matrix, the parallelepiped is of smaller dimension than the unit cube, meaning one or more of its dimensions got "flattened." Its (n-dimensional) volume is therefore zero, so the determinant of the matrix is zero.

    That's not a proof, but I think it's the clearest way to explain what is going on.
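    To make the picture concrete, here is a rough NumPy sketch (arbitrary example matrices of my own): an invertible matrix scales the volume of the unit cube by |det|, while a singular matrix flattens the image down to zero volume.

[code]
import numpy as np

# Invertible example: |det| is the volume of the image of the unit cube.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])
print(np.linalg.det(A))           # 6.0: the unit cube maps to a 2 x 3 x 1 box

# Singular example: third column = first column + second column,
# so the image of the cube is squashed into a plane.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
print(np.linalg.matrix_rank(B))   # 2: the "parallelepiped" is flat
print(np.linalg.det(B))           # 0.0 (up to rounding): zero volume
[/code]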

    To PROVE that a singular matrix has determinant zero, you need to use the properties of the determinant function, specifically these two:

    (1) If you multiply a row or column of A by a constant c, then the determinant of the resulting matrix is c*det(A).

    (2) If you add a multiple of one row to another row (or a multiple of a column to another column), then the determinant is unchanged.

    You can use the fact that a singular matrix does not have linearly independent columns (or rows). Thus one of them can be written as a linear combination of the others. You can then do some manipulations of the type in property (2) which leave the determinant unchanged, such that one of the rows or columns is all zeros. Then by property (1) (with c = 0), the determinant must be 0.
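    Here is roughly what that manipulation looks like numerically (a sketch of my own in NumPy, using an arbitrary singular matrix whose third column is a combination of the first two):

[code]
import numpy as np

# Singular example: third column is 2*(first column) + (second column).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 3.0]])

# Property (2): subtracting that combination from the third column
# leaves the determinant unchanged...
B = A.copy()
B[:, 2] -= 2.0 * A[:, 0] + A[:, 1]
print(np.linalg.det(A), np.linalg.det(B))   # equal (both ~0 up to rounding)

# ...and the third column of B is now all zeros, so by property (1)
# with c = 0 the determinant must be exactly 0.
print(B[:, 2])                              # [0. 0. 0.]
[/code]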
     
  4. May 21, 2009 #3

    HallsofIvy

    Staff Emeritus
    Science Advisor

    A little simpler, I think: det(AB) = det(A)det(B). If A is invertible then det(A)det(A^(-1)) = det(I) = 1. Neither det(A) nor det(A^(-1)) can be 0, since then that product would be 0, not 1. Conversely, if the determinant is 0, A cannot have an inverse since, whatever A^(-1) might be, we would have det(A)det(A^(-1)) = 0*det(A^(-1)) = 0, not 1.
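    A quick numerical check of that identity (just a rough NumPy sketch with an arbitrary invertible example matrix):

[code]
import numpy as np

# Arbitrary invertible example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
A_inv = np.linalg.inv(A)

# det(AB) = det(A)det(B), so det(A)*det(A^(-1)) = det(I) = 1.
print(np.linalg.det(A) * np.linalg.det(A_inv))   # 1.0 (up to rounding)
print(np.linalg.det(A @ A_inv))                  # det(I) = 1.0
[/code]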
     
  5. May 21, 2009 #4

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    Yes, that's much simpler!

    There's another proof that is a bit too circular to be used in this case, but what the heck, this isn't the homework forum and I find this one easy to visualize (once you get a feel for what eigenvalues are).

    The proof uses the fact that the determinant of a matrix is the product of its eigenvalues (counted with multiplicity).

    A matrix A is singular if and only if Ax = 0 = 0x for some nonzero x, which means precisely that 0 is an eigenvalue of A (and x is a corresponding eigenvector).

    Thus a matrix is singular if and only if 0 is one of its eigenvalues, which is true if and only if the PRODUCT of eigenvalues (which is the determinant) is zero.
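    For concreteness, a rough NumPy sketch (with an arbitrary singular example matrix of my own):

[code]
import numpy as np

# Singular example: second row is twice the first, so 0 is an eigenvalue.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)  # approximately [0, 5]
print(eigenvalues)
print(np.prod(eigenvalues))         # product of the eigenvalues ...
print(np.linalg.det(A))             # ... matches the determinant: ~0
[/code]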
     
  6. May 21, 2009 #5
    Looking at the above briefly, I notice that HallsofIvy has a very short, clear proof. For now I will take this, but I also like the fact that we can visualize this geometrically.

    Thanks again,


    JL
     
  7. Dec 20, 2009 #6
    I believe the easiest way to say it is: if A is a singular (non-invertible) matrix, its determinant must be 0. The determinant of A is the product of all the eigenvalues of A, and a product is 0 only when one of its factors is 0, that is, when one of the eigenvalues is 0; and 0 is an eigenvalue of A exactly when A is not invertible.
     