# Linearly independent eigenvectors

1. Jul 31, 2012

### Jonnyb302

Hello everyone, this $n \times n$ matrix arises in my numerical scheme for solving a diffusion PDE.
$M = \left(\begin{array}{ccccccc}1-\frac{Dk}{Vh} & \frac{Dk}{Vh} & 0 & 0 & \cdots & & 0 \\[6pt] \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} & 0 & & & \\[6pt] 0 & \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} & & & \\[6pt] \vdots & & \ddots & \ddots & \ddots & & \vdots \\[6pt] & & & \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} & 0 \\[6pt] & & & & \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} \\[6pt] 0 & \cdots & & & & 2\frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} \end{array}\right)$

I can easily use Gershgorin disks, together with the freedom to impose a constraint between $h$ and $k$, to guarantee that all eigenvalues lie between $-1$ and $1$, or even between $0$ and $1$ if that is more advantageous.
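For concreteness, here is a small numerical sketch of that Gershgorin argument. The parameter values for $D$, $V$, $h$, $n$ are made up for illustration; $k$ is chosen so that $Dk/h^2 = 1/4$, which puts every row's disk inside $[0, 1]$:

```python
import numpy as np

def build_M(n, D, k, h, V):
    """Assemble the n x n update matrix from the post."""
    r = D * k / h**2          # interior-row coefficient
    s = D * k / (V * h)       # first-row (reservoir) coefficient
    M = np.zeros((n, n))
    M[0, 0], M[0, 1] = 1 - s, s
    for i in range(1, n - 1):
        M[i, i - 1], M[i, i], M[i, i + 1] = r, 1 - 2 * r, r
    M[-1, -2], M[-1, -1] = 2 * r, 1 - 2 * r
    return M

def gershgorin_disks(M):
    """Return (center, radius) of each row's Gershgorin disk."""
    centers = np.diag(M)
    radii = np.sum(np.abs(M), axis=1) - np.abs(centers)
    return centers, radii

# Illustrative values; k chosen so that D*k/h**2 == 1/4.
n, D, V, h = 8, 1.0, 2.0, 0.1
k = 0.25 * h**2 / D
M = build_M(n, D, k, h, V)
c, r = gershgorin_disks(M)
eigs = np.linalg.eigvals(M)
```

Every disk then sits inside $[0,1]$, and Gershgorin's theorem guarantees each eigenvalue lies in one of the disks.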

I need to prove that I can diagonalize this matrix, so I am attempting to show that there are $n$ linearly independent eigenvectors.

I considered trying to show that the eigenvalues are distinct (which would imply linearly independent eigenvectors), but I am really not sure where to start.

I seem to remember one method being something like guessing $\lambda = \cos(2\pi/n)$, but I don't see how to go from there.
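For what it's worth, that guess is close to the known closed form for a *symmetric tridiagonal Toeplitz* matrix, which matches the interior rows of $M$ though not its first and last rows: with diagonal $a$ and off-diagonals $b$, the eigenvalues are $a + 2b\cos\!\big(\tfrac{j\pi}{n+1}\big)$ for $j = 1,\dots,n$. A quick numerical check of that formula (values of $a$, $b$, $n$ are arbitrary):

```python
import numpy as np

# Symmetric tridiagonal Toeplitz matrix: diagonal a, off-diagonals b.
n, a, b = 10, 0.5, 0.25
T = a * np.eye(n) + b * (np.eye(n, k=1) + np.eye(n, k=-1))

# Closed form: lambda_j = a + 2b*cos(j*pi/(n+1)), j = 1..n.
j = np.arange(1, n + 1)
closed_form = a + 2 * b * np.cos(j * np.pi / (n + 1))

numeric = np.sort(np.linalg.eigvalsh(T))
```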

Any suggestions? I have lots of experience with PDEs and ODEs but no formal linear algebra training, just what I have taught myself.

2. Aug 1, 2012

### chiro

Hey Jonnyb302 and welcome to the forums.

Since you are going to look for a symbolic proof, you will probably need a symbolic representation of the eigenvalues. You have a lot of zeros, which works in your favor since the determinant simplifies.

I don't know whether this matrix has a particular form, but if you know of one that fits it, you might want to see if people have come up with a reduced form for the determinant (with the eigenvalue lambda included).

Then the next step will be a symbolic evaluation of the eigenvectors, after which you can apply the definition of linear independence: the vectors $x_1, \ldots, x_n$ are linearly independent if the only solution of $\sum_i a_i x_i = 0$ is $a_1 = \cdots = a_n = 0$ (where the $x_i$ are the eigenvectors and the $a_i$ are real coefficients).
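As a numerical sanity check (not a proof), one can stack the computed eigenvectors as columns and test whether that matrix has full rank. A sketch with made-up parameter values, writing $r = Dk/h^2$ and $s = Dk/(Vh)$:

```python
import numpy as np

# Illustrative parameters: r = D*k/h^2, s = D*k/(V*h).
n, r, s = 6, 0.25, 0.0125
M = np.zeros((n, n))
M[0, 0], M[0, 1] = 1 - s, s
for i in range(1, n - 1):
    M[i, i - 1:i + 2] = r, 1 - 2 * r, r
M[-1, -2], M[-1, -1] = 2 * r, 1 - 2 * r

# Columns of P are the numerically computed eigenvectors.
eigvals, P = np.linalg.eig(M)
rank = np.linalg.matrix_rank(P)  # full rank <=> n independent eigenvectors
```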

Just out of curiosity, what field is this from? I'm going to speculate that if this is a common problem, then someone has probably taken a look at it, and even if they haven't answered it fully, partial results may exist (like the value of the determinant, say, and its calculation in reduced symbolic form).

3. Aug 1, 2012

### Jonnyb302

I did take a look and try to find a classification of this matrix, but I did not see anything.

If the first row were not present, I could easily do it, since the determinant of the $n \times n$ matrix can be expressed through a recursion involving the determinants of the $(n-1)\times(n-1)$ and $(n-2)\times(n-2)$ submatrices.
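That recursion is the standard three-term "continuant" recurrence for tridiagonal determinants, $f_i = a_i f_{i-1} - b_{i-1} c_{i-1} f_{i-2}$. A sketch comparing it against a direct determinant, with random entries purely for illustration:

```python
import numpy as np

def tridiag_det(a, b, c):
    """Determinant of a tridiagonal matrix via the continuant recurrence.
    a: diagonal (length n), b: super-diagonal, c: sub-diagonal (length n-1).
    """
    f_prev2, f_prev1 = 1.0, a[0]
    for i in range(1, len(a)):
        f_prev2, f_prev1 = f_prev1, a[i] * f_prev1 - b[i - 1] * c[i - 1] * f_prev2
    return f_prev1

rng = np.random.default_rng(0)
n = 7
a = rng.standard_normal(n)
b = rng.standard_normal(n - 1)
c = rng.standard_normal(n - 1)
T = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)
```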

Unfortunately the first row is causing almost all my troubles.

This is close to a common problem. A typical easy problem is the flow of heat from an infinite heat reservoir into a rod in a vacuum. This problem represents the flow of heat from a finite reservoir, and as you can probably guess, the first row which breaks the symmetry represents that finiteness.

In truth, what I need to prove is that the spectral norm of this matrix is less than 1; however, with the method I am using so far, it all reduces to showing there is a basis of linearly independent eigenvectors.

Although, I think I am starting to see a way to do this. I am going to pursue it, and if I figure it out I will post my result.

4. Aug 1, 2012

### Jonnyb302

Ok, I think I got it.

http://en.wikipedia.org/wiki/Determinant#Block_matrices

In the block matrices section: I can treat my matrix as the simple case where the block C is an $(n-1)\times 1$ matrix of all zeros. So I can get one eigenvalue from the top left corner element of the matrix itself, and I already know how to find the eigenvalues of the matrix obtained by chopping off the top row and left column via:
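For reference, the identity from that Wikipedia section, $\det\begin{pmatrix}A & B\\ 0 & D\end{pmatrix} = \det(A)\det(D)$, can be checked numerically on a small random example (the block sizes and entries here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
nA, nD = 1, 4  # A is the 1x1 top-left entry, D is the rest
A = rng.standard_normal((nA, nA))
B = rng.standard_normal((nA, nD))
D = rng.standard_normal((nD, nD))
C = np.zeros((nD, nA))  # zero lower-left block

M = np.block([[A, B], [C, D]])
lhs = np.linalg.det(M)
rhs = np.linalg.det(A) * np.linalg.det(D)
```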