Linearly independent eigenvectors

In summary: Jonnyb302 is solving a diffusion problem numerically and is looking for a symbolic proof that his iteration matrix has n linearly independent eigenvectors. He is having trouble with the first row of the matrix and is asking for help.
  • #1
Jonnyb302
Hello everyone, this nxn matrix arises in my numerical scheme for solving a diffusion PDE.
[itex]
M =
\left(\begin{array}{cccccccccc}
1-\frac{Dk}{Vh} & \frac{Dk}{Vh} & 0 & 0 & & & \ldots & & & 0 \\[6pt]
\frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} & 0 & & & & & & \\[6pt]
0 & \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} & & & & & & \\[6pt]
\vdots & & & & \ddots & & & & & \vdots \\[6pt]
 & & & & & & \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} & 0 \\[6pt]
 & & & & & & & \frac{Dk}{h^2} & 1-2\frac{Dk}{h^2} & \frac{Dk}{h^2} \\[6pt]
0 & \ldots & & & & & & & 2\frac{Dk}{h^2} & 1-2\frac{Dk}{h^2}
\end{array}\right)
[/itex]

I can easily use Gershgorin disks, together with the freedom to impose a constraint between h and k, to guarantee that all eigenvalues lie between -1 and 1, or even between 0 and 1 if that is more advantageous.
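For concreteness, here is a minimal NumPy sketch (not part of the original scheme; the values of D, V, h, k and n below are assumptions chosen so that [itex]Dk/h^2 \le 1/2[/itex]) that builds M and prints the Gershgorin intervals:

[code]
import numpy as np

# Illustrative (assumed) parameters, chosen so that D*k/h**2 <= 1/2.
D, V, h, k, n = 1.0, 1.0, 0.1, 0.004, 8
a = D * k / h**2        # interior coupling Dk/h^2
b = D * k / (V * h)     # first-row coupling Dk/(Vh)

M = np.zeros((n, n))
M[0, 0], M[0, 1] = 1 - b, b
for i in range(1, n - 1):
    M[i, i - 1], M[i, i], M[i, i + 1] = a, 1 - 2 * a, a
M[n - 1, n - 2], M[n - 1, n - 1] = 2 * a, 1 - 2 * a

# Gershgorin: every eigenvalue lies in some interval
# [M[i,i] - r_i, M[i,i] + r_i], where r_i is the off-diagonal row sum.
for i in range(n):
    r = np.sum(np.abs(M[i])) - np.abs(M[i, i])
    print(f"row {i}: [{M[i, i] - r:.3f}, {M[i, i] + r:.3f}]")
[/code]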

I need to prove that I can diagonalize this matrix, so I am attempting to show that there are n linearly independent eigenvectors.

I considered trying to show that the eigenvalues are distinct, but I am really not sure where to start.

I seem to remember one method being something like guessing [itex]\lambda = \cos(2\pi/n)[/itex], but I don't see how to go from there.

Any suggestions? I have lots of experience with PDEs and ODEs but no formal linear algebra background, just what I have taught myself.
 
  • #2
Hey Jonnyb302 and welcome to the forums.

Since you are looking for a symbolic proof, it is probably going to be necessary to get a symbolic representation of the eigenvalues. You have a lot of zeros, which works in your favor, since they simplify the determinant.

I don't know whether this matrix has a recognized special form, but if you can identify one that fits, you might want to check whether people have already worked out a reduced form for the determinant (with the eigenvalue [itex]\lambda[/itex] included).

Then the next step will be a symbolic evaluation of the eigenvectors, followed by the standard criterion for linear independence: vectors [itex]x_i[/itex] are linearly independent if and only if the only solution of [itex]\sum_i a_i x_i = 0[/itex] is [itex]a_i = 0[/itex] for all [itex]i[/itex].
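As a toy illustration of that criterion (my own example, not specific to your matrix): stack the candidate eigenvectors as columns; independence is then equivalent to the matrix having full column rank.

[code]
import numpy as np

# Columns are candidate eigenvectors x_i. They are independent iff the
# only solution of sum_i a_i * x_i = 0 is all a_i = 0, i.e. full column rank.
X = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
print(np.linalg.matrix_rank(X) == X.shape[1])  # True -> independent
[/code]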

Just out of curiosity, what field is this from? If it is a common problem, someone has probably taken a look at it, and even if they haven't answered it fully, partial results may exist (say, the value of the determinant and its calculation in reduced symbolic form).
 
  • #3
I did take a look and try to find a classification of this matrix, but I did not see anything.

If the first row were not present I could do it easily, since the determinant of the nxn matrix satisfies a recursive relation in terms of the (n-1)x(n-1) and (n-2)x(n-2) determinants.
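That recursion is easy to state in code; here is a small sketch of the standard tridiagonal-determinant recurrence (the test entries are assumed, and the special first row is deliberately left out, since that is exactly the part that breaks it):

[code]
import numpy as np

def tridiag_det(diag, sub, sup):
    """Determinant via the recursion f_i = a_i*f_{i-1} - b_{i-1}*c_{i-1}*f_{i-2}."""
    f_prev2, f_prev = 1.0, diag[0]
    for i in range(1, len(diag)):
        f_prev2, f_prev = f_prev, diag[i] * f_prev - sub[i - 1] * sup[i - 1] * f_prev2
    return f_prev

# Sanity check against NumPy on assumed entries.
d, s = [2.0] * 4, [1.0] * 3
T = np.diag(d) + np.diag(s, -1) + np.diag(s, 1)
print(tridiag_det(d, s, s), np.linalg.det(T))  # both ~5.0
[/code]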

Unfortunately the first row is causing almost all my troubles.

This is close to a common problem. A typical textbook version is the flow of heat from an infinite reservoir into a rod in a vacuum. My problem models the flow of heat from a finite reservoir, and as you can probably guess, the first row, which breaks the symmetry, encodes that finiteness.

In truth, what I need to prove is that the spectral norm of this matrix is less than 1; but with my method so far, it all reduces to showing that there is a basis of linearly independent eigenvectors.
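As a numerical sanity check on that claim (a sketch with the same assumed parameters as above, not a proof), one can compare the spectral radius with the spectral norm directly:

[code]
import numpy as np

def build_M(D=1.0, V=1.0, h=0.1, k=0.004, n=8):
    """Scheme matrix from post #1; parameter values are assumed."""
    a, b = D * k / h**2, D * k / (V * h)
    M = np.diag([1 - b] + [1 - 2 * a] * (n - 1))
    M[0, 1] = b
    for i in range(1, n - 1):
        M[i, i - 1] = M[i, i + 1] = a
    M[n - 1, n - 2] = 2 * a
    return M

M = build_M()
lam = np.linalg.eigvals(M)
print("spectral radius:", np.abs(lam).max())     # max |eigenvalue|
print("spectral norm  :", np.linalg.norm(M, 2))  # largest singular value
[/code]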

Although, I think I am starting to see a way to do this. I am going to pursue it and if I figure it out I will post my result.
 
  • #4
OK, I think I've got it.

http://en.wikipedia.org/wiki/Determinant#Block_matrices

In the block matrices section: I can treat my matrix as the simple case where the block C is an (n-1)x1 block of all zeros. Then one eigenvalue comes from the top-left corner element of the matrix itself, and I already know how to find the eigenvalues of the matrix with the top row and left column chopped off, via:
http://digilander.libero.it/foxes/matrix/eigen_tridiad_uniform.pdf

So then all I have to do is show that the eigenvalues are all distinct, and I have proven that there are n linearly independent eigenvectors.
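To check that plan numerically, here is a sketch of the closed form from the PDF for a uniform tridiagonal matrix, [itex]\lambda_j = a + 2\sqrt{bc}\,\cos(j\pi/(N+1))[/itex] (the entries below are assumed, and the boundary-modified last row of my actual matrix is ignored here):

[code]
import numpy as np

# Uniform tridiagonal tridiag(b, a, c), N x N; entries are assumed.
N, a, b, c = 6, 0.2, 0.4, 0.4
T = np.diag([a] * N) + np.diag([b] * (N - 1), -1) + np.diag([c] * (N - 1), 1)

j = np.arange(1, N + 1)
closed = a + 2 * np.sqrt(b * c) * np.cos(j * np.pi / (N + 1))
print(np.allclose(np.sort(closed), np.sort(np.linalg.eigvals(T).real)))  # True
[/code]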

yay?
 
  • #5


Hello, and thank you for sharing your matrix and the problem you are facing. Linearly independent eigenvectors are a fundamental concept in linear algebra with applications in many fields, including PDEs. To prove that your matrix can be diagonalized, you need to show that it has n linearly independent eigenvectors, where n is the dimension of the matrix.

One approach is to use the characteristic polynomial of the matrix, defined as det(λI - M), where λ is a scalar and I is the identity matrix. The roots of this polynomial are the eigenvalues of the matrix. If the eigenvalues are all distinct, the corresponding eigenvectors are guaranteed to be linearly independent (the converse does not hold: a matrix with repeated eigenvalues can still have a full set of independent eigenvectors). So you can start by finding the roots of the characteristic polynomial and checking whether they are distinct.
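For small n, this is easy to do symbolically; here is a SymPy sketch on an assumed 3x3 instance of the matrix (a stands for Dk/h², b for Dk/(Vh)):

[code]
import sympy as sp

# Symbolic 3x3 instance of the scheme matrix (size assumed for illustration).
a, b, lam = sp.symbols('a b lam')
M = sp.Matrix([
    [1 - b, b,       0      ],
    [a,     1 - 2*a, a      ],
    [0,     2*a,     1 - 2*a],
])
p = M.charpoly(lam)                   # characteristic polynomial in lam
print(p.as_expr())
print(sp.simplify(p.discriminant()))  # nonzero discriminant <=> distinct roots
[/code]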

Another approach uses the fact that an nxn matrix has at most n linearly independent eigenvectors, so if you can exhibit n of them, you have proven that the matrix can be diagonalized. One way to test a candidate set is the Gram-Schmidt process, which tries to build an orthogonal basis from the vectors: if the process ever produces the zero vector, the set is linearly dependent; otherwise the vectors are linearly independent.
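Here is a minimal Gram-Schmidt sketch along those lines (my own illustration, with an assumed tolerance): a near-zero residual during the process flags a dependent vector.

[code]
import numpy as np

def gram_schmidt(vectors, tol=1e-10):
    """Orthonormalize; a near-zero residual means the set is dependent."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in basis:
            w -= np.dot(q, w) * q        # remove the component along q
        norm = np.linalg.norm(w)
        if norm < tol:                   # v depends on the previous vectors
            return basis, False
        basis.append(w / norm)
    return basis, True

_, independent = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
print(independent)  # True
[/code]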

You mentioned Gershgorin disks, which show that all the eigenvalues of your matrix lie within a certain range. This can help narrow down the search for the eigenvalues and eigenvectors, but it does not by itself guarantee that the eigenvectors are linearly independent.

As for guessing [itex]\lambda = \cos(2\pi/n)[/itex]: that kind of ansatz works for some structured matrices, but not always; it depends on the specific matrix and eigenvalues you are trying to find. It is always worth trying several methods to see which works best in your case.

In summary, to prove that your matrix can be diagonalized, you need to show that it has n linearly independent eigenvectors. You can do this by showing that the eigenvalues are distinct using the characteristic polynomial, or by testing the eigenvectors directly, for example with the Gram-Schmidt process. I hope this helps, and good luck with your research!
 

1. What is the definition of linearly independent eigenvectors?

Linearly independent eigenvectors are a set of eigenvectors in which no vector can be written as a linear combination of the others. In particular, no vector in the set is a scalar multiple of another vector in the set.

2. Why is it important for eigenvectors to be linearly independent?

Linearly independent eigenvectors are important because, when there are enough of them, they form a basis for the vector space. This means that any vector in the space can be represented as a unique linear combination of the eigenvectors. This property is useful in many mathematical and scientific applications.
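For instance (a small assumed example), expanding a vector in a basis of eigenvectors amounts to solving one linear system, and the coefficients are unique:

[code]
import numpy as np

# Columns of V are linearly independent eigenvectors (assumed example);
# any x then has unique coordinates a with V @ a = x.
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])
x = np.array([3.0, 2.0])
a = np.linalg.solve(V, x)
print(a, np.allclose(V @ a, x))  # unique coefficients; reconstruction holds
[/code]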

3. How can we determine if a set of eigenvectors is linearly independent?

To determine whether a set of n eigenvectors in an n-dimensional space is linearly independent, we can use the determinant test: construct the matrix whose columns are the eigenvectors and compute its determinant. If the determinant is nonzero, the eigenvectors are linearly independent.
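In code, the determinant test is one line once the eigenvectors are stacked as columns (example vectors assumed; a tolerance is used since floating-point determinants are rarely exactly zero):

[code]
import numpy as np

# Columns are the eigenvectors to test (assumed example).
E = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
print(abs(np.linalg.det(E)) > 1e-12)  # True -> linearly independent
[/code]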

4. Can a set of linearly dependent vectors also be eigenvectors?

Yes, a set of linearly dependent vectors can consist of eigenvectors. In that case, however, the eigenvectors do not form a basis for the vector space, since some of them can be written as linear combinations of the others.

5. What is the relationship between eigenvectors and eigenvalues?

Eigenvectors and eigenvalues are related as follows: an eigenvalue is the scalar factor by which its eigenvector is stretched or compressed when the matrix multiplies it. Each eigenvector has a corresponding eigenvalue, and together the eigenvectors and eigenvalues of a matrix form its eigendecomposition.
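Numerically, the relationship looks like this (a sketch with an assumed matrix):

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w, V = np.linalg.eig(A)           # eigenvalues w; eigenvectors in columns of V
print(np.allclose(A @ V, V * w))  # A v_i = lam_i v_i, column by column
print(np.allclose(A, V @ np.diag(w) @ np.linalg.inv(V)))  # eigendecomposition
[/code]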
