Finding the eigenvectors of a matrix A

In summary, the spectrum of the given matrix A is {2, -2, 1} and the corresponding eigenvectors are (1, 0, 0), (1, -4, 0), and (1, -1, -3) respectively.
  • #1
TheSodesa

Homework Statement


Find the eigenvalues and eigenvectors of the matrix
[tex]
A = \begin{bmatrix}
2 & 1 & 0\\
0& -2 & 1\\
0 & 0 & 1
\end{bmatrix}
[/tex]

Homework Equations

The Attempt at a Solution



The spectrum of A is [itex] \sigma (A) = \{ \lambda _1, \lambda _2, \lambda _3 \} = \{2, -2, 1 \} [/itex]
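Since [itex]A[/itex] is upper triangular, the eigenvalues can be read straight off the diagonal; equivalently, the characteristic polynomial factors as

[tex]
\det(A - \lambda I_3) = (2 - \lambda)(-2 - \lambda)(1 - \lambda),
[/tex]

whose roots are 2, -2 and 1.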

I was able to calculate vectors [itex]v_1[/itex] and [itex]v_3[/itex] correctly; they match the vectors on the following page:
http://www.wolframalpha.com/input/?i=eigenvectors+of+{{2,1,0},{0,-2,1},{0,0,1}}

However, [itex]v_2[/itex] is giving me a headache. Using [itex] \lambda _1[/itex] to solve [itex](A - \lambda _1 I_3)\vec{v} = \vec{0}[/itex] gives me the matrix

[tex]
\begin{bmatrix}
\stackrel{a}{0} & \stackrel{b}{1} & \stackrel{c}{0}\\
0 & -4 & 1\\
0 & 0 & -1
\end{bmatrix}
\;\xrightarrow{\text{rref}}\;
\begin{bmatrix}
\stackrel{a}{0} & \stackrel{b}{1} & \stackrel{c}{0}\\
0 & 0 & 1\\
0 & 0 & 0
\end{bmatrix}
[/tex]

In my head this would produce a zero eigenvector since

[tex]\begin{cases} b = 0 \\ c = 0 \\ (a = ?) \end{cases}[/tex]

This is of course nonsense. I'm probably interpreting the row-reduced matrix wrong, but what exactly am I not understanding? Does it have something to do with the fact that the coefficient of [itex]a[/itex] is zero in every row?
 
  • #2
To maybe answer my own question (someone correct me if I'm wrong):

If each row in the above matrix represents an equation, I could theoretically set [itex]a = s, s \in \mathbb{R} [/itex]. Since we're solving for the null space and the coefficient of [itex]a[/itex] is zero in every row, [itex]a[/itex] contributes nothing to any of the equations, so any value of [itex]a[/itex] satisfies them. That transforms the system of equations into:

[tex]
\begin{cases}
a = s\\
b=0\\
c=0
\end{cases}
[/tex]

Is this the solution?
 
  • #3
a can then be any value; you could choose 1 to normalize.
 
  • #4
jk22 said:
a can then be any value; you could choose 1 to normalize.

Got it. Thank you for confirming my suspicions.
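For anyone who wants to double-check this numerically, here is a minimal sketch using SymPy (assuming SymPy is available) that computes the null space of [itex]A - 2I_3[/itex] directly; the free component shows up as the basis vector (1, 0, 0):

[code]
# Minimal sketch, assuming SymPy is installed: the null space of A - 2I
# should be spanned by (1, 0, 0), with the first component free.
from sympy import Matrix

A = Matrix([[2, 1, 0],
            [0, -2, 1],
            [0, 0, 1]])

# SymPy returns a basis of the null space as a list of column vectors.
basis = (A - 2 * Matrix.eye(3)).nullspace()
print(basis)  # [Matrix([[1], [0], [0]])]
[/code]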
 
  • #5
I would prefer to use the basic definitions of "eigenvalues" and "eigenvectors". Saying that "2" is an eigenvalue means that there exists a non-zero vector (in fact an entire subspace of them), (x, y, z), such that
[tex]\begin{bmatrix} 2 & 1 & 0 \\ 0 & -2 & 1 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix}x \\y \\ z \end{bmatrix}= \begin{bmatrix}2x+ y \\ -2y+ z \\ z \end{bmatrix}= 2\begin{bmatrix}x \\ y \\ z \end{bmatrix}[/tex]
which gives the three equations 2x+ y= 2x, -2y+ z= 2y, z= 2z. The first equation is the same as y= 0 and the last z= 0. But there is no condition on x. Any vector of the form (x, 0, 0)= x(1, 0, 0) is an eigenvector corresponding to eigenvalue 2.

Similarly, the eigenvectors corresponding to eigenvalue -2 must satisfy 2x+ y= -2x, -2y+ z= -2y, z= -2z. We get z= 0 from the last equation but see that, then, any y satisfies the second and the first says y= -4x. So (x, -4x, 0)= x(1, -4, 0) is an eigenvector corresponding to eigenvalue -2.

Finally, eigenvectors corresponding to eigenvalue 1 must satisfy 2x+ y= x, -2y+ z= y, z= z. The last equation is always true, the second says z= 3y, and the first y= -x so that z= 3y= -3x. An eigenvector corresponding to eigenvalue 1 is (x, -x, -3x)= x(1, -1, -3).
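As a quick sanity check on all three eigenpairs, here is a minimal NumPy sketch (assuming NumPy is available) verifying that [itex]A\vec{v} = \lambda \vec{v}[/itex] holds for each pair:

[code]
# Minimal sketch, assuming NumPy is installed: verify A v = lambda v
# for the three eigenpairs derived above.
import numpy as np

A = np.array([[2, 1, 0],
              [0, -2, 1],
              [0, 0, 1]], dtype=float)

pairs = [(2.0, np.array([1.0, 0.0, 0.0])),
         (-2.0, np.array([1.0, -4.0, 0.0])),
         (1.0, np.array([1.0, -1.0, -3.0]))]

for lam, v in pairs:
    assert np.allclose(A @ v, lam * v)

print("All three eigenpairs satisfy A v = lambda v.")
[/code]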
 

What is an eigenvector?

An eigenvector of a matrix is a non-zero vector that, when multiplied by the matrix, results in a scalar multiple of itself. In other words, multiplying by the matrix only rescales the eigenvector (possibly reversing it, as with the eigenvalue -2 above); it never rotates it off its own line.
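In symbols, a non-zero vector [itex]\vec{v}[/itex] is an eigenvector of a matrix [itex]A[/itex] with eigenvalue [itex]\lambda[/itex] exactly when

[tex]
A\vec{v} = \lambda \vec{v}.
[/tex]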

Why is finding eigenvectors important?

Finding eigenvectors is important because they can provide insight into the behavior and properties of a matrix. They are used in various applications, such as solving differential equations, image processing, and data analysis.

How do you find the eigenvectors of a matrix?

To find the eigenvectors of a matrix, you first need to calculate the eigenvalues of the matrix. Then, for each eigenvalue, you can solve for the corresponding eigenvectors by setting up and solving a system of linear equations.
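As a minimal sketch of that two-step recipe (assuming SymPy is available; the matrix is the one from this thread):

[code]
# Minimal sketch, assuming SymPy is installed: find the eigenvalues as roots
# of the characteristic polynomial, then solve (A - lambda*I)v = 0 for each.
from sympy import Matrix, symbols, solve

A = Matrix([[2, 1, 0],
            [0, -2, 1],
            [0, 0, 1]])
lam = symbols('lambda')

# Step 1: eigenvalues are the roots of det(A - lambda*I) = 0.
eigenvalues = solve((A - lam * Matrix.eye(3)).det(), lam)  # [-2, 1, 2] in some order

# Step 2: for each eigenvalue, the eigenvectors span the null space of A - lambda*I.
for ev in eigenvalues:
    basis = (A - ev * Matrix.eye(3)).nullspace()
    print(ev, basis[0].T)
[/code]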

Can a matrix have more than one eigenvector?

Yes. Each eigenvalue comes with a whole subspace of eigenvectors (any non-zero scalar multiple of an eigenvector is again an eigenvector), a single eigenvalue can have several linearly independent eigenvectors, and most matrices have several eigenvalues, each with its own eigenvectors, as in the 3×3 example above.

What is the significance of the eigenvectors with the largest eigenvalues?

The eigenvectors with the largest eigenvalues are known as the dominant eigenvectors and have special significance in applications such as principal component analysis, where they point along the directions in which the data vary the most.
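As a small illustration of that last point (the data below are made up, and NumPy is assumed to be available), the first principal component of a small data set is the eigenvector of its covariance matrix with the largest eigenvalue:

[code]
# Minimal sketch, assuming NumPy is installed: the dominant eigenvector of a
# covariance matrix is the first principal component of the data.
import numpy as np

# Made-up 2-D data, spread roughly along the line y = x.
data = np.array([[1.0, 1.1],
                 [2.0, 1.9],
                 [3.0, 3.2],
                 [4.0, 3.9]])

cov = np.cov(data, rowvar=False)  # 2x2 covariance of the two columns

# eigh is for symmetric matrices and returns eigenvalues in ascending order,
# so the last column holds the dominant eigenvector.
eigenvalues, eigenvectors = np.linalg.eigh(cov)
principal_component = eigenvectors[:, -1]
print(principal_component)  # roughly +/-(0.71, 0.71), i.e. along y = x
[/code]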
