
Eigenvalue decomposition

  1. Dec 7, 2009 #1
If an n x n matrix A has n different eigenvalues, it has an eigenvalue decomposition. Is it correct that an n x n matrix that doesn't have n different eigenvalues can't be decomposed? Are there more situations in which it can't be decomposed? Why can't I just put the same eigenvalue more than once in the diagonal matrix D? Is it perhaps that P is then no longer invertible?


    Ax = PDP^-1 * x
    The vector P^-1 * x holds the coordinates of x with respect to the basis formed by the columns of P, while x itself holds its coordinates with respect to the standard unit vectors.
    D scales every component of this transformed vector, and finally P expresses the coordinates of the scaled vector in the standard basis again.
    If a matrix A doesn't have an eigenvalue decomposition, does this imply that the vector x isn't just scaled, but is scaled/rotated/translated?
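The suspicion that P becomes singular can be checked numerically. A minimal sketch using numpy (the matrix below is a hypothetical example chosen for illustration, not from the post):

```python
import numpy as np

# A defective matrix: the eigenvalue 2 is repeated, but there is only one
# independent eigenvector, so the eigenvector matrix P cannot be inverted.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigvals, P = np.linalg.eig(A)
print(eigvals)               # the repeated eigenvalue 2 appears twice
print(np.linalg.det(P))      # numerically zero: P is singular, so P^-1 does not exist
```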
  3. Dec 15, 2009 #2


    Science Advisor
    Gold Member

    Since no-one has replied in over a week I'll take a swing at your questions.

    First, your post has so many questions that I will not attempt to answer them all.

    By "eigenvalue decomposition" I'm assuming you mean that the n x n matrix is similar to a diagonal matrix, so that [tex]A = P D P^{-1}[/tex], where [tex]D[/tex] is diagonal. In order to do this, you form [tex]P[/tex] out of the eigenvectors of [tex]A[/tex]. Hence, you need n linearly independent eigenvectors in order for [tex]P^{-1}[/tex] to exist. If [tex]A[/tex] has n distinct eigenvalues, then you are guaranteed to have n linearly independent eigenvectors, so this case is easy. Another easy case is when [tex]A[/tex] is Hermitian (equal to its conjugate transpose), in which case you are also guaranteed to be able to diagonalize, and furthermore [tex]P[/tex] is unitary (I am assuming you are in a complex vector space, so eigenvalues are allowed to be complex, etc.). Otherwise, in general you cannot expect a matrix to be diagonalizable.
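Both guaranteed cases are easy to verify numerically. A minimal numpy sketch (the matrices are made-up examples: one with distinct eigenvalues, one Hermitian):

```python
import numpy as np

# Case 1: distinct eigenvalues (here 5 and 2), so diagonalization is guaranteed.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
assert np.allclose(A, P @ D @ np.linalg.inv(P))   # A = P D P^-1

# Case 2: Hermitian matrix; eigh returns real eigenvalues and a unitary P.
H = np.array([[2.0, 1.0j],
              [-1.0j, 3.0]])
w, U = np.linalg.eigh(H)
assert np.allclose(U.conj().T @ U, np.eye(2))         # U is unitary
assert np.allclose(H, U @ np.diag(w) @ U.conj().T)    # H = U D U^†
```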

    If it isn't, then you have a couple of options. First, you can find a [tex]P[/tex] that makes [tex]A[/tex] as close to diagonal as possible - this is called the Jordan form, and can be useful at times. Another option is to abandon similarity transformations altogether and use the singular value decomposition, which is another matrix decomposition that can be useful in practice.
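Unlike diagonalization, the singular value decomposition exists for every matrix, even a defective one. A short numpy sketch (the non-diagonalizable matrix is a made-up example):

```python
import numpy as np

# This matrix is not diagonalizable (repeated eigenvalue 2, one eigenvector),
# but it still has an SVD: A = U S V^T with orthogonal U and V.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A)
assert np.allclose(A, U @ np.diag(s) @ Vt)   # reconstruction from the SVD
assert np.allclose(U.T @ U, np.eye(2))       # U is orthogonal even though A is defective
```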

    I hope this helps.

    Last edited: Dec 15, 2009