
Eigenvalues

  1. Jul 23, 2007 #1
    I would like to get the eigenvalues of a sparse matrix of the form

    [ a, b, c, d, e;
    1, 0, 0, 0, 0;
    0, 1, 0, 0, 0;
    0, 0, 1, 0, 0;
    0, 0, 0, 1, 0]

    In MATLAB/Octave I use the following to generate the matrix:

    % filename nfactor.m written by Jeff Sadowski
    % nfactor finds the factors for an nth degree polynomial
    % if you have an equation a(x^n) + b(x^(n-1)) + c(x^(n-2)) + ...
    % use nfactor like so nfactor([a,b,c,...])
    % example: equation=x^2 + 2*x + 1
    % octave:1> nfactor([1,2,1])
    % ans =
    % -1
    % -1
    function r = nfactor(VECTOR)
      vsize = size(VECTOR);
      n = vsize(2) - 1;                      % degree of the polynomial
      M = zeros(n);
      % first row from the coefficients, sign-flipped and scaled by the
      % leading coefficient so that the eigenvalues of M are the roots
      M(1, :) = -VECTOR(2:end) / VECTOR(1);
      M(2:n, 1:n-1) = eye(n - 1);            % ones on the subdiagonal
      r = eig(M);
    endfunction

    However, I need higher precision than this; more precisely, I need a resulting formula for each eigenvalue. I was hoping to use something like the Householder method to transform this matrix into an upper Hessenberg matrix, so that I would have the eigenvalue formulas down the diagonal. Can this be done?

    I found this site,
    but when I plug my matrices into the example, the transformation doesn't work.
    So I'm guessing this method will not work for my matrices. Are there other methods (non-numerical) that can be used to transform the matrix into an upper triangular one without affecting the eigenvalues?
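    For reference, the roots that nfactor returns can be cross-checked without any eigenvalue machinery at all. Below is a minimal sketch of Durand-Kerner iteration, a numerical method that finds all roots, real and complex, simultaneously. It is written in Python rather than Octave purely for illustration; the function name and the complex seed 0.4 + 0.9j are my own choices, not anything from this thread.

```python
def durand_kerner(coeffs, iters=200):
    """All complex roots of a polynomial; coeffs are highest-degree
    first, the same convention as nfactor([a, b, c, ...])."""
    n = len(coeffs) - 1
    monic = [c / coeffs[0] for c in coeffs]  # normalize leading coefficient

    def p(x):
        acc = 0j
        for c in monic:
            acc = acc * x + c  # Horner evaluation
        return acc

    # standard starting guesses: powers of a non-real complex seed
    roots = [(0.4 + 0.9j) ** k for k in range(n)]
    for _ in range(iters):
        new_roots = []
        for i, r in enumerate(roots):
            denom = 1 + 0j
            for j, s in enumerate(roots):
                if j != i:
                    denom *= r - s
            new_roots.append(r - p(r) / denom)
        roots = new_roots
    return roots

# roots of (x-1)(x-2)(x-3)(x-4)(x-5) = x^5 - 15x^4 + 85x^3 - 225x^2 + 274x - 120
rs = durand_kerner([1, -15, 85, -225, 274, -120])
```

    Note that this is still a numerical method; like eig, it does not give closed-form answers.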
  3. Jul 24, 2007 #2


    Science Advisor
    Homework Helper

    You can write down a polynomial equation for the eigenvalues of your matrix as a simple function of the matrix coefficients - and you did that already, of course.

    I don't think that trying to use a different method to derive an equation for the eigenvalues is going to change the numerical behaviour. Unless there are some relations between your coefficients a, b, c, d, e, you are not going to get a closed-form solution for the roots of a polynomial of degree 5, however you try to do it.

    I suspect your problem is that the roots of a polynomial can be very sensitive to small perturbations in the coefficients. There's a classic example by Wilkinson (c. 1960). He showed that for the polynomial with roots 1, 2, ..., 20, a perturbation of about 2^-23 (roughly 1e-7) in a single coefficient changed many of the roots by large amounts, and several roots changed from real to complex.

    You could refine the approximate values you already have by something like Newton-Raphson iteration using extended precision.
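    One way to carry out that refinement with no round-off in the iteration itself is to do the Newton steps in exact rational arithmetic. A minimal sketch in Python (for illustration only; newton_refine is a made-up name) using the standard-library fractions module:

```python
from fractions import Fraction

def newton_refine(coeffs, x0, iters=8):
    """Refine a polynomial root by Newton-Raphson in exact rational
    arithmetic; coeffs are highest-degree first."""
    cs = [Fraction(c) for c in coeffs]
    n = len(cs) - 1
    # coefficients of the derivative, also highest-degree first
    dcs = [c * (n - i) for i, c in enumerate(cs[:-1])]

    def horner(poly, x):
        acc = Fraction(0)
        for c in poly:
            acc = acc * x + c
        return acc

    x = Fraction(x0)
    for _ in range(iters):
        d = horner(dcs, x)
        if d == 0:
            break  # stationary point; a different starting guess is needed
        x -= horner(cs, x) / d
    return x

# refine a rough approximation of sqrt(2), a root of x^2 - 2
r = newton_refine([1, 0, -2], Fraction(3, 2))
```

    Each step roughly doubles the number of correct digits, but it also roughly doubles the size of the fraction, so keep the iteration count small (or round back to a fixed precision between steps).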

    Or you could take a step back and consider what you are going to use the eigenvalues for, and if you really need ultra-high precision eigenvalues. For most physics or engineering applications, you don't need them. Paradoxically, you can easily calculate the eigenvectors to much higher precision than the eigenvalues, using simple methods like inverse iteration.
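    To illustrate the inverse-iteration remark on a small case, here is a sketch in Python; solve and inverse_iteration are my own helper names, and the 2x2 companion matrix of x^2 - 3x + 2 stands in for the 5x5 one.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def inverse_iteration(A, mu, iters=25):
    """Eigenvector for the eigenvalue of A closest to the shift mu."""
    n = len(A)
    S = [[A[i][j] - (mu if i == j else 0.0) for j in range(n)] for i in range(n)]
    v = [1.0] + [0.0] * (n - 1)
    for _ in range(iters):
        w = solve(S, v)           # one solve per iteration
        norm = max(abs(c) for c in w)
        v = [c / norm for c in w]
    # the Rayleigh quotient recovers a refined eigenvalue from the eigenvector
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] * Av[i] for i in range(n)) / sum(c * c for c in v)
    return lam, v

A = [[3.0, -2.0], [1.0, 0.0]]       # companion matrix of x^2 - 3x + 2
lam, v = inverse_iteration(A, 1.9)  # shift near the eigenvalue 2
```

    The shift 1.9 is deliberately close to the eigenvalue 2, so the iteration converges to that eigenvector, and the Rayleigh quotient then recovers the eigenvalue to near machine precision.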
    Last edited: Jul 24, 2007
  4. Jul 24, 2007 #3
    The idea is to solve for a polynomial using some matrix manipulations, not the other way round. There have to be other methods to solve for eigenvalues. I have not learned the Householder equations, but I know it is just a series of matrix manipulations that transforms a matrix into an upper Hessenberg matrix without affecting the eigenvalues, and thus the eigenvalues are on the diagonal. So if I can use the Householder equations, I have formulas for the roots of an nth-degree polynomial.

    "You could refine the approximate values you already have by something like Newton-Raphson iteration using extended precision."
    This just seems to get them further and further off. There seems to be some unstable point, which is what I'm trying to find. I'd like to find it, though. That is why I want a "non-numerical" method.

    The step back is that I want a systematic approach to finding the "exact" (non-numerical) roots of an nth-degree polynomial. I want the imaginary roots as well.

    There have to be some ways to manipulate a matrix without affecting the eigenvalues, to get it into a form where the Householder method can be used.
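    For what it's worth, the property being relied on here, that a similarity transform P A P^-1 leaves the eigenvalues unchanged, is easy to check on a small case. A sketch in Python (2x2 for brevity; the matrix P is an arbitrary invertible choice of mine):

```python
def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(P):
    """Invert a 2x2 matrix."""
    d = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    return [[P[1][1] / d, -P[0][1] / d], [-P[1][0] / d, P[0][0] / d]]

A = [[3.0, -2.0], [1.0, 0.0]]      # companion matrix of x^2 - 3x + 2
P = [[2.0, 1.0], [1.0, 1.0]]       # any invertible matrix
B = matmul(matmul(P, A), inv2(P))  # similar matrix P A P^-1

# trace and determinant (and hence the characteristic polynomial and the
# eigenvalues) are unchanged by the similarity transform
tr = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]
```

    The same invariance is exactly what the Householder reduction exploits: it is a chain of similarity transforms.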
  5. Jul 24, 2007 #4


    Science Advisor
    Homework Helper

    It's been a while since I looked at the theory, but my memory (based on a lot of work with the EISPACK library in the past) says you finish up finding the eigenvalues of a tridiagonal matrix, numerically.

    But doesn't Galois Theory say that's impossible in the general case, except when n < 5?
  6. Jul 24, 2007 #5


    Staff Emeritus
    Science Advisor
    Gold Member

    In what field do you want the roots? (i.e. rationals, Q(i), some number field, the complexes, some finite field, ...)

    Do you have any constraints on how the root is represented? (Aside from requiring an exact representation)
  7. Jul 25, 2007 #6

    matt grime

    Science Advisor
    Homework Helper

    No. Galois theory states that the generic quintic (or higher-degree) polynomial cannot have its roots expressed as 'nice' functions of its coefficients. This says nothing about

    (i) any specific case: I know what all the roots of x^n-1 are for any n, algebraically

    (ii) the general case using methods other than extracting roots, etc.: elliptic functions, for example, give the roots of degree-5 polynomials, I believe.
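    Point (i) can be made concrete: the roots of x^n - 1 are exp(2*pi*i*k/n) for k = 0, ..., n-1. A quick check in Python (roots_of_unity is just an illustrative helper name):

```python
import cmath

def roots_of_unity(n):
    """The n-th roots of unity: exp(2*pi*i*k/n) for k = 0 .. n-1."""
    return [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]

rs = roots_of_unity(5)  # all five roots of x^5 - 1, in closed form
```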
  8. Jul 25, 2007 #7


    Science Advisor
    Homework Helper

    OK, I was cutting some corners in my logic there.

    But the matrix transformations (e.g. Householder, Givens, etc.) used in the standard eigensolution methods (e.g. the EISPACK library) only involve "nice" functions like sqrt, sin, and cos, so they won't give a complete solution to the OP's question.
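    To illustrate what those "nice" functions look like in practice: a Givens rotation zeroes one entry of a vector using nothing beyond a square root. A sketch in Python (givens is my own helper name):

```python
import math

def givens(a, b):
    """Return (c, s) so that the rotation [[c, s], [-s, c]] maps
    the vector (a, b) to (r, 0) with r = sqrt(a^2 + b^2)."""
    r = math.hypot(a, b)
    return (1.0, 0.0) if r == 0 else (a / r, b / r)

c, s = givens(3.0, 4.0)
rotated = (c * 3.0 + s * 4.0,   # becomes r = 5
           -s * 3.0 + c * 4.0)  # driven to zero
```

    Chains of such rotations (or Householder reflections, which need only a square root as well) perform the Hessenberg reduction, but the eigenvalues themselves still have to come out of an iterative final stage.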
