Calculating Eigenvalues of a Sparse Matrix

  • Context: Graduate
  • Thread starter: jeff.sadowski
  • Tags: eigenvalues, matrix
Discussion Overview

The discussion revolves around calculating the eigenvalues of a specific sparse matrix using various methods, including numerical and non-numerical approaches. Participants explore the limitations of existing techniques and the desire for a systematic method to derive exact roots of an nth degree polynomial, particularly in the context of eigenvalue problems.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks a method to derive eigenvalues of a sparse matrix with higher precision, suggesting the use of the Householder method to transform the matrix into an upper Hessenberg form.
  • Another participant argues that deriving a closed-form solution for the roots of a fifth-degree polynomial is unlikely without specific relations among coefficients.
  • Concerns are raised about the sensitivity of polynomial roots to perturbations in coefficients, referencing a classic example by Wilkinson.
  • Some participants propose that while eigenvalues may be difficult to compute with high precision, eigenvectors can often be determined with greater accuracy using methods like inverse iteration.
  • There is a discussion about the systematic approach to finding exact roots of nth degree polynomials, including the consideration of imaginary roots.
  • Galois Theory is mentioned, with participants debating its implications for expressing roots of polynomials of degree five or higher.
  • One participant questions the feasibility of obtaining exact roots through matrix transformations, noting that standard eigensolution methods involve functions that may not yield complete solutions.

Areas of Agreement / Disagreement

Participants express differing views on the feasibility of obtaining exact roots for higher-degree polynomials and the effectiveness of various methods. There is no consensus on a definitive approach to the problem, and multiple competing perspectives remain throughout the discussion.

Contextual Notes

Participants highlight the limitations of numerical methods and the challenges posed by the general case of polynomial roots, particularly in relation to Galois Theory. The discussion also reflects on the need for precise definitions and constraints when seeking exact representations of roots.

jeff.sadowski
I would like to get the eigenvalues of a sparse matrix of the form

[ a, b, c, d, e;
1, 0, 0, 0, 0;
0, 1, 0, 0, 0;
0, 0, 1, 0, 0;
0, 0, 0, 1, 0]

In MATLAB/Octave I use the following to generate the matrix:

% filename nfactor.m written by Jeff Sadowski
% nfactor finds the roots of an nth degree polynomial
% if you have an equation a*x^n + b*x^(n-1) + c*x^(n-2) + ...
% use nfactor like so: nfactor([a, b, c, ...])
% example: equation = x^2 + 2*x + 1
% octave:1> nfactor([1, 2, 1])
% ans =
%
%   -1
%   -1
%
function r = nfactor(VECTOR)
  n = numel(VECTOR);  % number of coefficients, i.e. degree + 1
  % Companion matrix: the first row is -[b, c, ...]/a, with an
  % identity block below it; its eigenvalues are the roots.
  r = eig([-VECTOR(2:n)/VECTOR(1); eye(n-2, n-1)]);
endfunction

However, I need higher precision than this; more precisely, I need a formula for each eigenvalue. I was hoping to use something like the Householder method to transform this matrix into an upper Hessenberg matrix so that I have the eigenvalue formulas down the diagonal. Can this be done?

I found this site
http://www.matf.bg.ac.yu/r3nm/NumericalMethods/EigenValues/Householder.html
but when I plug my matrices into the example, the transformation doesn't work, so I'm guessing this method will not work for my matrices. Are there other (non-numerical) methods that can be used to transform the matrix to upper triangular form without affecting the eigenvalues?
 
You can write down a polynomial equation for the eigenvalues of your matrix as a simple function of the matrix coefficients - and you did that already, of course.

I don't think that trying to use a different method to derive an equation for the eigenvalues is going to change the numerical behaviour. Unless there are some relations between your coefficients a, b, c, d, e, you are not going to get a closed-form solution for the roots of a polynomial of degree 5, however you try to do it.

I suspect your problem is that the roots of a polynomial can be very sensitive to small perturbations in the coefficients. There's a classic example by Wilkinson (c. 1960). He showed that for the polynomial with roots 1, 2, ..., 20, a perturbation of about 2^-23 in the coefficient of x^19 changed many of the roots by large amounts, and several roots changed from real to complex.
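A short NumPy sketch (mine, not from the thread) reproduces the flavour of Wilkinson's example; the use of `np.poly`/`np.roots` and the exact perturbation placement are assumptions for illustration:

```python
import numpy as np

# Wilkinson's polynomial: the monic polynomial with roots 1, 2, ..., 20.
coeffs = np.poly(np.arange(1, 21))

# Perturb the x^19 coefficient by 2^-23, as in Wilkinson's example.
perturbed = coeffs.copy()
perturbed[1] -= 2.0 ** -23

roots = np.roots(perturbed)
# Several roots acquire large imaginary parts despite the tiny change.
print(np.max(np.abs(roots.imag)))
```

The perturbation is about one part in 10^10 of that coefficient, yet it pushes several roots well off the real axis.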

You could refine the approximate values you already have by something like Newton-Raphson iteration using extended precision.
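As a sketch of that suggestion, here is Newton-Raphson polishing in extended precision using Python's standard `decimal` module (the example polynomial and helper names are illustrative, not from the thread):

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # work with 50 significant digits

def horner(coeffs, x):
    """Evaluate a polynomial (highest power first) at x via Horner's rule."""
    acc = Decimal(0)
    for c in coeffs:
        acc = acc * x + Decimal(c)
    return acc

def newton_refine(coeffs, x0, steps=30):
    """Polish an approximate real root with Newton-Raphson iterations
    carried out in extended precision."""
    n = len(coeffs) - 1
    dcoeffs = [c * (n - i) for i, c in enumerate(coeffs[:-1])]  # derivative
    x = Decimal(x0)
    for _ in range(steps):
        x = x - horner(coeffs, x) / horner(dcoeffs, x)
    return x

# Refine the root near 1 of x^3 - 2x^2 - x + 2 = (x-1)(x+1)(x-2).
root = newton_refine([1, -2, -1, 2], "0.9")
print(root)
```

This converges quadratically, so a double-precision estimate reaches the full 50 digits in a handful of steps, but only for a simple root of a well-conditioned polynomial; it does not escape the sensitivity problem above.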

Or you could take a step back and consider what you are going to use the eigenvalues for, and if you really need ultra-high precision eigenvalues. For most physics or engineering applications, you don't need them. Paradoxically, you can easily calculate the eigenvectors to much higher precision than the eigenvalues, using simple methods like inverse iteration.
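A minimal inverse-iteration sketch in NumPy (my illustration; the 2x2 matrix and the shift are made up for the example):

```python
import numpy as np

def inverse_iteration(A, mu, steps=20):
    """Inverse iteration: given an approximate eigenvalue mu,
    repeatedly solving (A - mu*I) x_new = x and normalising
    converges to the eigenvector whose eigenvalue is closest to mu."""
    n = A.shape[0]
    x = np.array([1.0] + [0.0] * (n - 1))  # arbitrary starting vector
    M = A - mu * np.eye(n)
    for _ in range(steps):
        x = np.linalg.solve(M, x)
        x /= np.linalg.norm(x)
    return x

# A has eigenvalues 1 and 3; the shift mu = 2.9 targets the
# eigenvector [1, 1]/sqrt(2) belonging to eigenvalue 3.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
v = inverse_iteration(A, 2.9)
print(v)
```

The closer mu is to the true eigenvalue, the faster the convergence, which is why a crude eigenvalue estimate is enough to get an accurate eigenvector.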
 
The idea is to solve for a polynomial using some matrix manipulations, not the other way round. There have to be some other methods to solve for eigenvalues. I have not learned the Householder equations, but I know it is just a series of matrix manipulations to transform a matrix into an upper Hessenberg matrix without affecting the eigenvalues, and thus the eigenvalues are the diagonal. So if I can use the Householder equations, I have formulas for the roots of an nth degree polynomial.

"You could refine the approximate values you already have by something like Newton-Raphson iteration using extended precision."
This just seems to get them further and further off. There seems to be some unstable point, which I'd like to find. That is why I want a "non-numerical" method.

The step back is: I want a systematic approach to finding the "exact" (non-numerical) roots of an nth degree polynomial. I want the imaginary roots as well.

There have to be some ways to deal with a matrix without affecting the eigenvalues, getting it into a useful form for the Householder method.
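For reference, here is a NumPy sketch of the Householder reduction being discussed (my illustration, not code from the thread). It reduces a matrix to upper Hessenberg form by similarity transforms, which preserve the eigenvalues; note, though, that the result is Hessenberg rather than triangular, so the eigenvalues are not simply its diagonal entries:

```python
import numpy as np

def to_hessenberg(A):
    """Reduce A to upper Hessenberg form with Householder reflections.
    Each step applies P = I - 2*v*v^T from both sides (a similarity
    transform), so the eigenvalues are unchanged."""
    H = A.astype(float).copy()
    n = H.shape[0]
    for k in range(n - 2):
        x = H[k + 1:, k].copy()
        alpha = -np.copysign(np.linalg.norm(x), x[0])
        v = x.copy()
        v[0] -= alpha
        norm_v = np.linalg.norm(v)
        if norm_v < 1e-15:
            continue  # column already in Hessenberg form
        v /= norm_v
        # Apply the reflection from the left, then from the right.
        H[k + 1:, k:] -= 2.0 * np.outer(v, v @ H[k + 1:, k:])
        H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v)
    return H

A = np.array([[4.0, 1.0, 2.0, 0.0],
              [1.0, 3.0, 0.0, 1.0],
              [2.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 1.0]])
H = to_hessenberg(A)
print(H)  # entries below the first subdiagonal are (numerically) zero
```

Each reflection involves a square root of the column norm, which is the kind of "nice" function mentioned later in the thread; getting from Hessenberg form to the eigenvalues themselves still needs an iterative step.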
 
jeff.sadowski said:
I have not learned the Householder equations, but I know it is just a series of matrix manipulations to transform a matrix into an upper Hessenberg matrix without affecting the eigenvalues, and thus the eigenvalues are the diagonal. So if I can use the Householder equations, I have formulas for the roots of an nth degree polynomial.

It's a while since I looked at the theory, but my memory (based on a lot of work with the EISPACK library in the past) says you finish up finding the eigenvalues of a tridiagonal matrix, numerically.

The step back is: I want a systematic approach to finding the "exact" (non-numerical) roots of an nth degree polynomial. I want the imaginary roots as well.

But doesn't Galois Theory say that's impossible in the general case, except when n < 5?
 
jeff.sadowski said:
The step back is: I want a systematic approach to finding the "exact" (non-numerical) roots of an nth degree polynomial. I want the imaginary roots as well.
In what field do you want the roots? (i.e. rationals, Q(i), some number field, the complexes, some finite field, ...)

Do you have any constraints on how the root is represented? (Aside from requiring an exact representation)
 
AlephZero said:
But doesn't Galois Theory say that's impossible in the general case, except when n < 5?

No. Galois theory states that the generic quintic (or higher degree) polynomial cannot have its roots expressed as 'nice' functions of its coefficients. This says nothing about

(i) any specific case: I know what all the roots of x^n - 1 are for any n, algebraically;

(ii) the general case using methods other than extracting roots: elliptic functions, for example, give the roots of degree-5 polynomials, I believe.
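Point (i) can be checked numerically; this small Python sketch (mine, not from the thread) builds the n-th roots of unity from their closed form exp(2*pi*i*k/n):

```python
import cmath

# The n-th roots of unity are known in closed form for every n:
# exp(2*pi*i*k/n) for k = 0, ..., n-1.
n = 5
roots = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]
for z in roots:
    print(z, abs(z ** n - 1))  # each residual is tiny
```

So Galois theory rules out a radical formula for the *generic* quintic, not exact expressions for particular families like x^n - 1.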
 
matt grime said:
No. Galois theory states that the generic quintic (or higher degree) polynomial cannot have its roots expressed as 'nice' functions of its coefficients.

OK, I was cutting some corners in my logic there.

But the matrix transformations (e.g. Householder, Givens, etc.) used in the standard eigensolution methods (e.g. the EISPACK library) only involve "nice" functions like sqrt, sin, and cos, so they won't give a complete solution to the OP's question:

I want a systematic approach to finding the "exact" (non-numerical) roots of an nth degree polynomial.
 
