# Eigenvalues

## Main Question or Discussion Point

I would like to get the eigenvalues of a sparse matrix of the form

```
[ a, b, c, d, e;
  1, 0, 0, 0, 0;
  0, 1, 0, 0, 0;
  0, 0, 1, 0, 0;
  0, 0, 0, 1, 0 ]
```

In MATLAB/Octave I use the following to generate the matrix and find its eigenvalues:

```matlab
% filename nfactor.m   written by Jeff Sadowski
% nfactor finds the roots of an nth-degree polynomial
% if you have an equation a*x^n + b*x^(n-1) + c*x^(n-2) + ...
% use nfactor like so: nfactor([a, b, c, ...])
% example: equation = x^2 + 2*x + 1
% octave:1> nfactor([1, 2, 1])
% ans =
%
%   -1
%   -1
%
function ans = nfactor(VECTOR)
  vsize = size(VECTOR);
  ans = eig([-1/VECTOR(1)*VECTOR(2:vsize(2)); eye(vsize(2)-2, vsize(2)-1)]);
end
```
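For comparison, the same companion-matrix trick can be sketched in Python with NumPy (my illustration, not part of the original post; `nfactor_py` is just a hypothetical name mirroring the Octave function above):

```python
import numpy as np

def nfactor_py(coeffs):
    """Roots of a*x^n + b*x^(n-1) + ... via the eigenvalues of its
    companion matrix, same convention as the Octave nfactor above."""
    c = np.asarray(coeffs, dtype=float)
    n = len(c) - 1                        # degree of the polynomial
    top = -c[1:] / c[0]                   # first row: -[b, c, ...]/a
    companion = np.vstack([top, np.eye(n - 1, n)])
    return np.linalg.eigvals(companion)

# x^2 + 3x + 2 = (x + 1)(x + 2) has roots -1 and -2
print(sorted(nfactor_py([1, 3, 2]).real))
```

NumPy's own `np.roots` does essentially the same thing internally, so the two should agree to rounding error.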

However, I need higher precision than this; more precisely, I need a closed-form formula for each eigenvalue. I was hoping to use something like the Householder method to transform this matrix into an upper Hessenberg matrix so that I would have the eigenvalue formulas down the diagonal. Can this be done?

I found this site:
http://www.matf.bg.ac.yu/r3nm/NumericalMethods/EigenValues/Householder.html [Broken]
but when I plug my matrices into the example, the transformation doesn't work, so I'm guessing this method will not work for my matrices. Are there other (non-numerical) methods that can be used to transform the matrix to an upper-triangular form without affecting the eigenvalues?


AlephZero
Homework Helper
You can write down a polynomial equation for the eigenvalues of your matrix as a simple function of the matrix coefficients - and you did that already, of course.

I don't think that trying to use a different method to derive an equation for the eigenvalues is going to change the numerical behaviour. Unless there are some relations between your coefficients a, b, c, d, e, you are not going to get a closed-form solution for the roots of a polynomial of degree 5, however you try to do it.

I suspect your problem is that the roots of a polynomial can be very sensitive to small perturbations in the coefficients. There's a classic example by Wilkinson (c. 1960). He showed that for the polynomial with roots 1, 2, ..., 20, a perturbation of about 2^-23 (roughly 1e-7) in a single coefficient changed many of the roots by large amounts, and several roots changed from real to complex.
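Wilkinson's example is easy to reproduce; this NumPy sketch (my illustration, not code from the thread) perturbs one coefficient of the degree-20 polynomial with roots 1..20 and watches real roots turn complex:

```python
import numpy as np

# Coefficients of w(x) = (x-1)(x-2)...(x-20), highest power first.
coeffs = np.poly(np.arange(1, 21))

# Wilkinson's perturbation: change the x^19 coefficient (-210) by 2^-23.
perturbed = coeffs.copy()
perturbed[1] -= 2.0 ** -23

roots = np.roots(perturbed)
# Despite the tiny relative perturbation, several roots acquire
# imaginary parts of order 1.
print(np.max(np.abs(roots.imag)))
```

(The coefficients themselves, up to 20! ≈ 2.4e18, already exceed exact double-precision range, which only amplifies the point about sensitivity.)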

You could refine the approximate values you already have by something like Newton-Raphson iteration using extended precision.

Or you could take a step back and consider what you are going to use the eigenvalues for, and if you really need ultra-high precision eigenvalues. For most physics or engineering applications, you don't need them. Paradoxically, you can easily calculate the eigenvectors to much higher precision than the eigenvalues, using simple methods like inverse iteration.
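The inverse-iteration point can be sketched like this (again my NumPy illustration, not code from the thread): even a rough eigenvalue estimate, used as a shift, makes repeated linear solves converge quickly to the corresponding eigenvector.

```python
import numpy as np

def inverse_iteration(A, mu, steps=15):
    """Approximate the eigenvector of A whose eigenvalue is closest to mu."""
    n = A.shape[0]
    v = np.random.default_rng(0).standard_normal(n)  # generic start vector
    shifted = A - mu * np.eye(n)
    for _ in range(steps):
        v = np.linalg.solve(shifted, v)   # one step: v <- (A - mu*I)^-1 v
        v = v / np.linalg.norm(v)         # renormalize to avoid overflow
    return v

# Companion matrix of x^2 - 3x + 2, with eigenvalues 1 and 2.
A = np.array([[3.0, -2.0],
              [1.0,  0.0]])
v = inverse_iteration(A, 1.9)             # 1.9 is a rough guess near 2
rayleigh = v @ A @ v / (v @ v)            # Rayleigh quotient ~ eigenvalue
print(rayleigh)
```

Each iteration shrinks the unwanted eigenvector component by the ratio of shifted eigenvalue magnitudes (here a factor of about 9), which is why the method is cheap and accurate once a rough eigenvalue is known.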

The idea is to solve for a polynomial using matrix manipulations, not the other way round. There have to be other methods to solve for eigenvalues. I have not learned the Householder equations, but I know it is just a series of matrix manipulations that transform a matrix into an upper Hessenberg matrix without affecting the eigenvalues, and thus the eigenvalues are on the diagonal. So if I can use the Householder equations, I have formulas for the roots of an nth-degree polynomial.

> You could refine the approximate values you already have by something like Newton-Raphson iteration using extended precision.

This just seems to get them further and further off. There seems to be some unstable point, which is exactly what I'm trying to find. That is why I want a "non-numerical" method.

The step back is that I want a systematic approach to finding the "exact" (non-numerical) roots of an nth-degree polynomial. I want the imaginary roots as well.

There have to be some ways to manipulate a matrix without affecting the eigenvalues, to get it into a useful form for the Householder method.

AlephZero
Homework Helper
> I have not learned the Householder equations, but I know it is just a series of matrix manipulations that transform a matrix into an upper Hessenberg matrix without affecting the eigenvalues, and thus the eigenvalues are on the diagonal. So if I can use the Householder equations, I have formulas for the roots of an nth-degree polynomial.
It's a while since I looked at the theory, but my memory (based on a lot of work with the EISPACK library in the past) says you finish up finding the eigenvalues of a tridiagonal matrix, numerically.

> The step back is that I want a systematic approach to finding the "exact" (non-numerical) roots of an nth-degree polynomial. I want the imaginary roots as well.
But doesn't Galois Theory say that's impossible in the general case, except when n < 5?

Hurkyl
Staff Emeritus
Gold Member
> The step back is that I want a systematic approach to finding the "exact" (non-numerical) roots of an nth-degree polynomial. I want the imaginary roots as well.
In what field do you want the roots? (i.e. rationals, Q(i), some number field, the complexes, some finite field, ...)

Do you have any constraints on how the root is represented? (Aside from requiring an exact representation)

matt grime
Homework Helper
> But doesn't Galois Theory say that's impossible in the general case, except when n < 5?
No. Galois theory states that the generic quintic (or higher-degree) polynomial cannot have its roots expressed in radicals, i.e. as 'nice' functions of its coefficients. This says nothing about

(i) any specific case: I know algebraically what all the roots of x^n - 1 are, for any n;

(ii) the general case using methods other than extracting roots: elliptic functions, for example, give the roots of degree-5 polynomials, I believe.
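Point (i) is easy to check numerically: the roots of x^n - 1 are the n-th roots of unity e^(2*pi*i*k/n). A quick Python sketch (my illustration, not code from the thread):

```python
import cmath

def roots_of_unity(n):
    """The algebraically known roots of x^n - 1: e^(2*pi*i*k/n), k = 0..n-1."""
    return [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]

# Every one of them really does satisfy z^5 - 1 = 0 (up to rounding).
for z in roots_of_unity(5):
    assert abs(z ** 5 - 1) < 1e-9
```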
