Diagonalization, which eigenvector is found?

In summary, when diagonalizing an NxN matrix A, you solve the characteristic equation Det(A - mI) = 0, which gives you the N eigenvalues m. Then, to find the eigenvectors v of A, you solve the eigenvalue problem Av = mv for each eigenvalue m; any nonzero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue.
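
A minimal sketch of this two-step recipe, assuming a small 2x2 example and SymPy (neither the matrix nor the library choice comes from the thread):

[code]
# Hypothetical example: carry out the two steps symbolically with SymPy.
import sympy as sp

A = sp.Matrix([[2, 1],
               [1, 2]])
m = sp.symbols('m')

# Step 1: the characteristic equation Det(A - m*I) = 0 gives the eigenvalues.
char_eq = sp.Eq((A - m * sp.eye(2)).det(), 0)
eigenvalues = sp.solve(char_eq, m)          # [1, 3]

# Step 2: for each eigenvalue, solve (A - m*I) v = 0; nullspace() returns
# one basis vector per independent eigenvector.
for val in eigenvalues:
    print(val, (A - val * sp.eye(2)).nullspace())
[/code]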
  • #1
FredMadison
Hi!

This might be a silly question, but I can't seem to figure it out and have not found any remarks on it in the literature.

When diagonalizing an NxN matrix A, we solve the characteristic equation:

Det(A - mI) = 0

which gives us the N eigenvalues m. Then, to find the eigenvectors v of A, we solve the eigenvalue problem

Av = mv

Now, any scalar multiple of an eigenvector of A is itself an eigenvector with the same eigenvalue. So, which eigenvector do we find when solving the eigenvalue problem? It can't be totally random, can it? Is there a way of determining which of the infinitude of eigenvectors (all with the same "direction") the algorithm chooses?
 
  • #2
Usually one of three things is done:

Normalize the vector so that [itex]\|v\| = 1[/itex]
Make the smallest nonzero entry equal to 1
Make the largest entry equal to 1

[tex]
\begin{align*}
(1) \Longrightarrow v_0 &= \frac{1}{\sqrt{v^*v}}v\\
(2) \Longrightarrow v_0 &= \frac{1}{\min_{v_i\neq 0}(v_i)}v\\
(3) \Longrightarrow v_0 &= \frac{1}{\max(v_i)}v
\end{align*}
[/tex]

It is a choice... you can come up with something else yourself anyway.
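
A minimal sketch of these three conventions in NumPy (the vector is made up for illustration, and "smallest"/"largest" entry is interpreted by magnitude here):

[code]
import numpy as np

v = np.array([3.0, 0.0, -6.0])   # any eigenvector returned by a solver

# (1) unit Euclidean norm
v1 = v / np.linalg.norm(v)

# (2) smallest nonzero entry (by magnitude) scaled to 1
nonzero = v[v != 0]
v2 = v / nonzero[np.argmin(np.abs(nonzero))]

# (3) largest entry (by magnitude) scaled to 1
v3 = v / v[np.argmax(np.abs(v))]

print(v1, v2, v3)
[/code]

Each result is just a different scalar multiple of the same vector, which is the point: the representative is a convention.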
 
  • #3
Thanks trambolin for your answer. However, that was not really my question. I'm aware this is what one usually does once the eigenvectors have been found.

My problem is this: When we solve

Av = mv (1)

for the eigenvectors v, we find a set of N eigenvectors. But these are not unique. How does equation (1) "choose" which vector to "return" from the one-dimensional subspace spanned by an eigenvector, since all of them are equally valid?
 
  • #4
Equation (1) doesn't "choose" any specific eigenvector. What typically happens is that you get something like y = 2x, z = 3x; that is, all but one of the components are expressed in terms of the remaining one (in the case that there is only one independent eigenvector corresponding to each eigenvalue), so that the eigenvector has the form <x, 2x, 3x> = x<1, 2, 3>. You then choose a value of x.

As far as the diagonalization is concerned, it doesn't matter which you choose: using any of the eigenvectors corresponding to the eigenvalues as the columns of P will still give you a matrix such that [itex]P^{-1}AP[/itex] is diagonal.
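
A minimal sketch of that last point (the matrix and scale factors here are made up): however you rescale the eigenvector columns of P, [itex]P^{-1}AP[/itex] comes out diagonal.

[code]
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)    # columns of eigvecs are eigenvectors

# Rescale each column by an arbitrary nonzero factor before building P.
P = eigvecs @ np.diag([7.0, -0.3])

D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))                 # diagonal, with the eigenvalues of A on it
[/code]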
 
  • #5
Of course! It WAS a silly question. This is what happens when you stop doing things by hand and rely too much on Mathematica to do the thinking for you.

Thanks guys!
 

FAQ: Diagonalization, which eigenvector is found?

1. What is diagonalization and why is it important?

Diagonalization is the process of writing a square matrix A as A = PDP^{-1}, where D is a diagonal matrix. This is important because diagonal matrices are easier to work with and expose valuable information about the original matrix, such as its eigenvalues and eigenvectors.

2. How is diagonalization related to eigenvectors?

Diagonalization involves finding the eigenvalues and eigenvectors of a matrix. The eigenvectors become the columns of the change-of-basis matrix P, and the eigenvalues appear on the diagonal of D = P^{-1}AP. Therefore, diagonalization is built directly on eigenvectors.

3. Which eigenvector is found during diagonalization?

An eigenvalue determines its eigenvectors only up to nonzero scalar multiples (or, more generally, up to a choice of basis for its eigenspace). Which particular representative is reported depends on the normalization convention of the specific diagonalization method or software being used; any choice gives a valid diagonalization.
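
As an illustration of one such convention (this is a statement about one particular library, not about diagonalization in general): NumPy's eig is documented to return unit-norm eigenvector columns.

[code]
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, vecs = np.linalg.eig(A)
print(np.linalg.norm(vecs, axis=0))   # each column has Euclidean norm 1
[/code]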

4. Can any matrix be diagonalized?

Not all matrices can be diagonalized. A matrix can only be diagonalized if it is square and has a full set of linearly independent eigenvectors. If these conditions are not met, the matrix cannot be diagonalized.
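
A minimal sketch of the failure case (the shear matrix below is a standard textbook example, not taken from this FAQ): it has eigenvalue 1 with multiplicity 2 but only a one-dimensional eigenspace, so no full set of independent eigenvectors exists.

[code]
import sympy as sp

A = sp.Matrix([[1, 1],
               [0, 1]])
print(A.is_diagonalizable())           # False
print((A - sp.eye(2)).nullspace())     # only one independent eigenvector
[/code]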

5. What is the significance of finding the eigenvectors during diagonalization?

Finding the eigenvectors during diagonalization allows us to understand the behavior of a matrix and how it affects other vectors. Eigenvectors also provide insight into the underlying structure of a matrix and can be used in various applications such as data compression and image processing.
