Order of Eigenvectors in Matrix Generation: Does it Make a Difference?

In summary, eigenvectors are useful for diagonalizing matrices, and the order in which they are placed as columns only changes the order of the eigenvalues on the diagonal.
  • #1
squaremeplz

Homework Statement



When generating a matrix from eigenvectors, does it matter in which order
the columns are placed?
 
  • #2
No; as long as each eigenvector's column matches the column of its corresponding eigenvalue, it should all be fine. This is due to how the diagonalization is built: the first matrix from the right transforms the vector into the eigenbasis of the matrix, the second matrix is the original matrix's diagonal form, and the third transforms the result back to the basis you started from.

The reason you use eigenvectors is that it is often much easier to work with a matrix in its diagonal form, and that form also tells you a lot about the matrix's properties.
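To make that concrete, here is a minimal MATLAB sketch (using the 3x3 matrix h that comes up later in this thread; the test vector x is my own choice):

% h = e * r * inv(e): inv(e) maps x into eigenbasis coordinates,
% r scales each coordinate by its eigenvalue, and e maps back.
h = [2 1 0; 1 2 1; 0 0 2];
[e, r] = eig(h);                    % columns of e: eigenvectors; diag(r): eigenvalues
x = [1; 2; 3];                      % an arbitrary test vector
disp(norm(h*x - e*(r*(inv(e)*x)))) % ~0, so the three factors reproduce h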
 
  • #3
If you use the eigenvectors as the columns of a change-of-basis matrix (and assuming that the eigenvectors form a basis for the space), the transformed matrix will be a diagonal matrix with the eigenvalues on the diagonal. Changing the order of the eigenvectors will change the order of those eigenvalues, but you still get a matrix representing the same linear transformation.
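For instance, a quick MATLAB sketch (my own, using the matrix from the next post) shows that permuting the eigenvector columns just permutes the diagonal:

h = [2 1 0; 1 2 1; 0 0 2];
[e, r] = eig(h);
p = e(:, [2 3 1]);        % same eigenvectors, columns permuted
disp(inv(p) * h * p)      % still diagonal; the eigenvalues are permuted the same way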
 
  • #4
Sorry, I am trying to diagonalize the matrix h.

I used MATLAB to check my results:

h =

2 1 0
1 2 1
0 0 2

>> [e,r] = eig(h)

e =

0.7071 -0.7071 -0.7071
0.7071 0.7071 0
0 0 0.7071

r =

3 0 0
0 1 0
0 0 2

The eigenvalues I get are 1, 2, 3, and my eigenvector matrix is a bit different from e when I do it out by hand:

1 -1 -1
1 1 0
0 0 1

but since c[1;1;0], where c is any scalar, is still an eigenvector, I am assuming it's the same thing due to the ratios. However, you are saying that I can use the matrix r instead? I know the arithmetic; this is just a bit confusing. Thanks again.
I think r might be my end result, but the way I am trying to get it is

e^(-1) * h * e = r

is this correct?
 
  • #5
MATLAB normalized the eigenvectors; other than that, your solution is the same.
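A quick way to check this in MATLAB (my own sketch): normalize each hand-computed column and compare with e.

v = [1 -1 -1; 1 1 0; 0 0 1];   % hand-computed eigenvectors as columns
vn = v ./ vecnorm(v);          % divide each column by its length (vecnorm needs R2017b+)
disp(vn)                       % matches MATLAB's e above, up to sign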
 
  • #6
I figured as much... but why did MATLAB put the columns in that order? Does it have to do with the diagonal result?
 
  • #7
squaremeplz said:
Sorry I am trying to diagonalize the matrix h

I used MATLAB to check my results:

h =

2 1 0
1 2 1
0 0 2

>> [e,r] = eig(h)

e =

0.7071 -0.7071 -0.7071
0.7071 0.7071 0
0 0 0.7071
This is, exactly,
[tex]\begin{bmatrix}\frac{\sqrt{2}}{2} & -\frac{\sqrt{2}}{2}& -\frac{\sqrt{2}}{2} \\ \frac{\sqrt{2}}{2} & \frac{\sqrt{2}}{2} & 0 \\ 0 & 0 & \frac{\sqrt{2}}{2}\end{bmatrix}[/tex]

r =

3 0 0
0 1 0
0 0 2


The eigenvalues I get are 1, 2, 3, and my eigenvector matrix is a bit different from e when I do it out by hand:

1 -1 -1
1 1 0
0 0 1
and if you normalize the vectors first, this is
[tex]\begin{bmatrix} \frac{\sqrt{2}}{2} & -\frac{\sqrt{2}}{2} & -\frac{\sqrt{2}}{2} \\ \frac{\sqrt{2}}{2} & \frac{\sqrt{2}}{2} & 0 \\ 0 & 0 & \frac{\sqrt{2}}{2}\end{bmatrix}[/tex]

which is exactly as before.

but since c[1;1;0], where c is any scalar, is still an eigenvector, I am assuming it's the same thing due to the ratios. However, you are saying that I can use the matrix r instead? I know the arithmetic; this is just a bit confusing. Thanks again.
I think r might be my end result, but the way I am trying to get it is

e^(-1) * h * e = r

is this correct?
Yes, did you try it? If e is
[tex]\begin{bmatrix}1 & -1 & -1 \\ 1 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}[/tex]
what is [itex]e^{-1}[/itex]? What is [itex]e^{-1}he[/itex]?
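For reference, here is how that check might look in MATLAB (a sketch of mine, with a numerical inverse in place of the hand computation):

h = [2 1 0; 1 2 1; 0 0 2];
e = [1 -1 -1; 1 1 0; 0 0 1];   % hand-computed, unnormalized eigenvectors
disp(inv(e) * h * e)           % diagonal, with 3, 1, 2 on the diagonal, same as r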
 
  • #8
Yep, I did try it all out and I got the diagonal.

It took some time to get the inverse correct by hand, but looking back at it,
it's pretty straightforward.

Thanks for your help :)
 
  • #9
I think the placement of the eigenvectors is also not crucial, due to the nature of the eigenvectors.

e^(-1) and e still act to give a diagonal matrix, I believe, but I will try that some other time.
 
  • #10
Each eigenvector is associated, of course, with a specific eigenvalue. Changing the order of the eigenvectors as columns will, as I said before, change the order of the eigenvalues on the diagonal. The matrix you give has columns <1, 1, 0>, <-1, 1, 0>, and <-1, 0, 1>, from left to right, which are eigenvectors corresponding to eigenvalues 3, 1, and 2 in that order. Using that gives a diagonal matrix with 3 in the first row, 1 in the second row, and 2 in the third row:
[tex]\begin{bmatrix}3 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2\end{bmatrix}[/tex]
If you swapped those columns around so that <-1, 1, 0> was the first column, <-1, 0, 1> the second column, and <1, 1, 0> the third column, then you would get the diagonal matrix
[tex]\begin{bmatrix}1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3\end{bmatrix}[/tex]
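A MATLAB sketch of that swap (my own, following the column order just described):

h  = [2 1 0; 1 2 1; 0 0 2];
e2 = [-1 -1 1; 1 0 1; 0 1 0];  % columns: <-1,1,0>, <-1,0,1>, <1,1,0>
disp(inv(e2) * h * e2)         % diag(1, 2, 3), matching the reordered eigenvalues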
 
  • #11
That is awesome.

Great explanation. I figure this will be the only useful linear algebra technique so this is very helpful.

Go your brain!
 

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are mathematical concepts used to describe the properties of a linear transformation. An eigenvector is a nonzero vector whose direction is unchanged by the transformation, and the corresponding eigenvalue is the scalar factor by which that eigenvector is scaled when the transformation is applied.
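In symbols, a nonzero vector v is an eigenvector of A with eigenvalue lambda when A*v = lambda*v. A one-line MATLAB check using the matrix from this thread:

h = [2 1 0; 1 2 1; 0 0 2];
v = [1; 1; 0];
disp([h*v, 3*v])   % the two columns agree: h*v = 3v, so v is an eigenvector for 3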

What is the importance of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important because they help us understand how a linear transformation affects a vector. They also have many practical applications in fields such as physics, engineering, and computer science.

How do you find eigenvalues and eigenvectors?

To find eigenvalues and eigenvectors, you need to solve the characteristic equation of a matrix. This involves finding the values of lambda that satisfy the equation det(A-lambda*I) = 0, where A is the matrix and I is the identity matrix. The corresponding eigenvectors can then be found by solving the equation (A-lambda*I)x = 0.
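As a sketch, MATLAB's poly, roots, and null functions can carry out both steps numerically (shown here with the matrix from this thread):

h = [2 1 0; 1 2 1; 0 0 2];
c = poly(h);               % coefficients of the characteristic polynomial det(h - lambda*I)
disp(roots(c))             % the eigenvalues: 3, 2, 1 (possibly listed in another order)
disp(null(h - 3*eye(3)))   % solves (h - 3I)x = 0: a unit eigenvector for lambda = 3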

What is the relationship between eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are closely related. Each eigenvalue has at least one corresponding eigenvector, and the eigenvector gives a direction that the transformation simply scales by the eigenvalue. For a symmetric matrix, eigenvectors corresponding to distinct eigenvalues are orthogonal to each other, but eigenvectors are not orthogonal in general.
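The matrix h in this thread illustrates the caveat: it is not symmetric, and its eigenvectors are not all orthogonal, as this sketch shows.

h = [2 1 0; 1 2 1; 0 0 2];   % not symmetric: h(2,3) = 1 but h(3,2) = 0
[e, ~] = eig(h);
disp(e' * e)                 % nonzero off-diagonal entries: not all orthogonal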

How are eigenvalues and eigenvectors used in data analysis?

Eigenvalues and eigenvectors are used in data analysis to reduce the dimensionality of a dataset. This is done by finding the principal components, which are linear combinations of the original variables that explain the most variance in the data. These principal components are the eigenvectors of the covariance matrix of the data, and their corresponding eigenvalues represent the amount of variance explained by each component.
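A minimal PCA sketch in MATLAB (the data X here is made up purely for illustration):

X = randn(100, 3);                            % 100 observations of 3 variables (synthetic)
C = cov(X);                                   % symmetric covariance matrix
[V, D] = eig(C);                              % columns of V: candidate principal directions
[explained, idx] = sort(diag(D), 'descend');  % variances, largest first
scores = (X - mean(X)) * V(:, idx);           % data projected onto the principal components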
