Eigenvectors for degenerate eigenvalues

In summary: every eigenvector for the repeated eigenvalue satisfies a single equation, x - 3y + z = 0, and the expression ##\begin{bmatrix} x \\ y \\ z \end{bmatrix} = y\begin{bmatrix} 3 \\ 1 \\ 0 \end{bmatrix} + z\begin{bmatrix} -1 \\ 0 \\ 1\end{bmatrix}##, where y and z are arbitrary, is just a way to parameterize all of its solutions.
  • #1
dyn
I am looking at some notes on linear algebra, written mainly for maths students, to improve my quantum mechanics. I came across the following example: $$ \begin{pmatrix} 2 & -3 & 1 \\ 1 & -2 & 1 \\ 1 & -3 & 2 \end{pmatrix} $$
The example gives the eigenvalues as 0 and 1 (doubly degenerate). It then calculates the eigenvectors using Gaussian elimination. This is where my problem arises: coming from a physics background, I tried to find the eigenvectors for the repeated eigenvalue 1 using back substitution, but it doesn't seem to produce a solution that way. Am I doing something wrong, or is it possible for back substitution not to work while Gaussian elimination works?
The answer given for the eigenvector is a linear combination of the two vectors ##(3, 1, 0)^T## and ##(-1, 0, 1)^T##. The quantum mechanics textbook I am using says that for degenerate eigenvalues one should choose two mutually orthogonal vectors. The two vectors I have listed are not orthogonal. Is the orthogonality just a preference for QM and not a requirement?
Thanks
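A minimal NumPy sketch for checking these numbers, assuming the matrix exactly as given above:

```python
import numpy as np

# the 3x3 matrix quoted in the question
A = np.array([[2, -3, 1],
              [1, -2, 1],
              [1, -3, 2]], dtype=float)

# numerical eigenvalues: expect 0 and a doubly repeated 1 (up to ordering/rounding)
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.round(eigenvalues, 6))
```

For the repeated eigenvalue, a numerical routine like this returns two linearly independent eigenvectors, but it makes no promise that they are orthogonal.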
 
  • #2
You can turn them into an orthogonal pair by subtracting from one vector its projection onto the other.

Given two linearly independent vectors ##\vec u,\vec v##, the pair ##\vec u-\frac{\vec u\cdot \vec v}{\vec v\cdot\vec v}\vec v, \vec v## is orthogonal. You can check that by calculating ##(\vec u-\frac{\vec u\cdot \vec v}{\vec v\cdot\vec v}\vec v)\cdot \vec v## and seeing that it is zero.
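A short NumPy sketch of this projection step, taking ##\vec u## and ##\vec v## to be the two eigenvectors quoted in post #1:

```python
import numpy as np

u = np.array([3.0, 1.0, 0.0])
v = np.array([-1.0, 0.0, 1.0])

# subtract from u its projection onto v
u_orth = u - (np.dot(u, v) / np.dot(v, v)) * v

print(u_orth)             # [1.5 1.  1.5]
print(np.dot(u_orth, v))  # 0.0, so (u_orth, v) is an orthogonal pair
```

Because u_orth is a linear combination of the two original eigenvectors, it is still an eigenvector with eigenvalue 1; normalising both vectors then gives the orthonormal pair that the QM convention prefers.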
 
  • #3
So choosing the eigenvectors to be orthogonal is just a matter of preference. Thanks. Any thoughts on why I can't calculate the eigenvectors by back substitution, while it can be done by Gaussian elimination?
 
  • #4
If I apply a general vector ##(a, b, c)^T## to the eigenvalue equation with eigenvalue 1, I end up with three identical equations, ##a - 3b + c = 0##. How do I then proceed to end up with the answer given, which is equivalent to ##(3x - y, x, y)^T##?
 
  • #5
The equation ##a - 3b + c = 0## can be written as ##a = 3b - c##, which just says that for any eigenvector with eigenvalue 1 whose 2nd and 3rd components are b and c, the first component is 3b - c.

Relabel a,b,c as x,y,z and you have the given answer.
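The same elimination can be done symbolically; a small SymPy sketch with the matrix from post #1 reproduces exactly the two basis vectors quoted there:

```python
from sympy import Matrix, eye

A = Matrix([[2, -3, 1],
            [1, -2, 1],
            [1, -3, 2]])

# the eigenspace for the repeated eigenvalue 1 is the null space of A - I
basis = (A - eye(3)).nullspace()
print(basis)   # [Matrix([[3], [1], [0]]), Matrix([[-1], [0], [1]])]
```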
 
  • #6
dyn said:
If I apply a general vector ##(a, b, c)^T## to the eigenvalue equation with eigenvalue 1, I end up with three identical equations, ##a - 3b + c = 0##. How do I then proceed to end up with the answer given, which is equivalent to ##(3x - y, x, y)^T##?
Elaborating on what andrewkirk said, relabel a, b, c as x, y, z, so that the equation above becomes x - 3y + z = 0.

Then
x = 3y + (-1)z
y = 1y + 0z
z = 0y + 1z
If you look at the right sides as a sum of two vectors, you get
##\begin{bmatrix} x \\ y \\ z \end{bmatrix} = y\begin{bmatrix} 3 \\ 1 \\ 0 \end{bmatrix} + z\begin{bmatrix} -1 \\ 0 \\ 1\end{bmatrix}##

Here y and z on the right side can be considered arbitrary constants.
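A quick numerical check, with the matrix from post #1, that every such combination really is an eigenvector with eigenvalue 1:

```python
import numpy as np

A = np.array([[2, -3, 1],
              [1, -2, 1],
              [1, -3, 2]], dtype=float)
b1 = np.array([3.0, 1.0, 0.0])
b2 = np.array([-1.0, 0.0, 1.0])

# pick a few arbitrary values of y and z and verify A·v = v
rng = np.random.default_rng(0)
for _ in range(5):
    y, z = rng.standard_normal(2)
    vec = y * b1 + z * b2
    print(np.allclose(A @ vec, vec))   # True each time
```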
 
  • #7
Thanks for your replies. So essentially, because I end up with three equations that are the same, I really have just one equation with three unknowns. So I take two of those unknowns to have arbitrary values and then express the remaining unknown in terms of the two arbitrary values.
 
  • #8
dyn said:
Thanks for your replies. So essentially, because I end up with three equations that are the same, I really have just one equation with three unknowns. So I take two of those unknowns to have arbitrary values and then express the remaining unknown in terms of the two arbitrary values.
Yes. In the work I showed, you can take y = 1 and z = 0, and get one solution, and you can take y = 0, z = 1, to get another solution. Since y and z are completely arbitrary, you get a double infinity of solutions.

Geometrically, the two vectors I showed determine a plane in ##\mathbb{R}^3##. Every point in this plane is some linear combination of those two vectors.
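That plane is the null space of ##A - I##; a one-line rank check with the matrix from post #1 confirms it is two-dimensional:

```python
import numpy as np

A = np.array([[2, -3, 1],
              [1, -2, 1],
              [1, -3, 2]], dtype=float)

# A - I has three identical rows, so its rank is 1 and its null space
# (the eigenspace for eigenvalue 1) has dimension 3 - 1 = 2
print(np.linalg.matrix_rank(A - np.eye(3)))   # 1
```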
 

1. What are eigenvectors for degenerate eigenvalues?

Eigenvectors for degenerate eigenvalues are eigenvectors associated with a repeated eigenvalue, i.e. one whose multiplicity is greater than one. Like any eigenvector, each satisfies ##A\vec v = \lambda\vec v##, where the scalar ##\lambda## is the eigenvalue; what is special is that a degenerate eigenvalue can have more than one linearly independent eigenvector.

2. How are eigenvectors for degenerate eigenvalues different from regular eigenvectors?

Eigenvectors for degenerate eigenvalues differ from regular eigenvectors in that they correspond to eigenvalues with multiplicity greater than one. This means there can be more than one linearly independent eigenvector corresponding to the same eigenvalue.

3. Why are eigenvectors for degenerate eigenvalues important?

Eigenvectors for degenerate eigenvalues are important because they provide insight into the structure and behavior of a matrix. They also play a crucial role in solving systems of differential equations and finding the dominant modes of a system.

4. How do you find eigenvectors for degenerate eigenvalues?

To find eigenvectors for degenerate eigenvalues, first determine the eigenvalues of the matrix. Then, for each eigenvalue ##\lambda##, solve the system ##(A - \lambda I)\vec v = \vec 0##; for a degenerate eigenvalue this system has more than one free parameter, so you obtain several linearly independent eigenvectors.
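As a concrete sketch of that procedure, using the matrix from this thread and SciPy's null_space to solve ##(A - \lambda I)\vec v = \vec 0## for each eigenvalue:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[2, -3, 1],
              [1, -2, 1],
              [1, -3, 2]], dtype=float)

for lam in (0.0, 1.0):
    # columns of `basis` form an orthonormal basis of the eigenspace
    basis = null_space(A - lam * np.eye(3))
    print(f"eigenvalue {lam}: eigenspace dimension {basis.shape[1]}")
```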

5. Can there be more than one set of eigenvectors for degenerate eigenvalues?

Yes. An eigenvalue with multiplicity greater than one has a whole eigenspace of corresponding eigenvectors, and any set of linearly independent vectors that spans this eigenspace is an equally valid choice of basis. Different choices of such a set span the same subspace.
