Eigenvectors for degenerate eigenvalues


Discussion Overview

The discussion revolves around the calculation of eigenvectors for a matrix with degenerate eigenvalues, specifically focusing on the eigenvalue 1. Participants explore methods of finding eigenvectors, the implications of orthogonality in quantum mechanics, and the relationship between Gaussian elimination and back substitution in solving for eigenvectors.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions the effectiveness of back substitution for finding eigenvectors associated with the repeated eigenvalue 1, suggesting that Gaussian elimination produces results where back substitution does not.
  • Another participant proposes a method to convert two linearly independent vectors into an orthogonal pair, indicating that orthogonality is a preference rather than a requirement in quantum mechanics.
  • A participant explains that the equation derived from the eigenvalue equation leads to a single equation with three unknowns, allowing for two of those unknowns to be treated as arbitrary values.
  • Further elaboration on the same equation shows how to express one variable in terms of the others, leading to a representation of the eigenvectors as linear combinations of two specific vectors.
  • Participants discuss the geometric interpretation of the solutions, noting that the two vectors span a plane in R3, indicating a double infinity of solutions due to the arbitrary nature of the chosen constants.

Areas of Agreement / Disagreement

Participants generally agree on the method of expressing eigenvectors in terms of arbitrary constants but do not reach a consensus on the necessity of orthogonality for eigenvectors in quantum mechanics. The effectiveness of back substitution versus Gaussian elimination remains a point of inquiry without a definitive resolution.

Contextual Notes

The discussion highlights the potential limitations of back substitution in certain contexts and the dependence on the specific structure of the equations derived from the eigenvalue problem. There is also an acknowledgment of the ambiguity in the requirement for orthogonality in the context of quantum mechanics.

dyn:
I am looking at some notes on Linear Algebra written for maths students, mainly to improve my Quantum Mechanics. I came across the following example: $$ \begin{pmatrix} 2 & -3 & 1 \\ 1 & -2 & 1 \\ 1 & -3 & 2 \end{pmatrix} $$
The example then gives the eigenvalues as 0 and 1 (doubly degenerate). It then calculates the eigenvectors using Gaussian elimination. This is where my problem arises: coming from a physics background, I tried to find the eigenvectors for the repeated eigenvalue 1 using back substitution, but it doesn't seem to produce a solution this way. Am I doing something wrong, or is it possible for back substitution not to work while Gaussian elimination works?
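The stated eigenvalues can be checked numerically; a minimal NumPy sketch (not part of the original notes):

```python
import numpy as np

# The matrix from the example
M = np.array([[2.0, -3.0, 1.0],
              [1.0, -2.0, 1.0],
              [1.0, -3.0, 2.0]])

# Eigenvalues should be 0 and 1, with 1 doubly degenerate
eigenvalues = np.linalg.eigvals(M)
print(np.sort(eigenvalues.real))  # approximately 0, 1, 1 up to floating-point error
```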
The answer given for the eigenvector is a linear combination of the 2 vectors ##(3, 1, 0)^T## and ##(-1, 0, 1)^T##. The Quantum Mechanics textbook I am using says that for degenerate eigenvalues one should choose 2 mutually orthogonal vectors, but the 2 vectors listed above are not orthogonal. Is the orthogonality just a preference in QM and not a requirement?
Thanks
 
You can turn them into an orthogonal pair by subtracting from one the projection of the other onto it.

Given two linearly independent vectors ##\vec u,\vec v##, the pair ##\vec u-\frac{\vec u\cdot \vec v}{\vec v\cdot\vec v}\vec v, \vec v## is orthogonal. You can check that by calculating ##(\vec u-\frac{\vec u\cdot \vec v}{\vec v\cdot\vec v}\vec v)\cdot \vec v = \vec u\cdot\vec v - \frac{\vec u\cdot \vec v}{\vec v\cdot\vec v}(\vec v\cdot\vec v) = 0##.
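This projection step can be sketched numerically with the two vectors quoted in the thread (a minimal NumPy example):

```python
import numpy as np

# The two (non-orthogonal) eigenvectors from the thread
u = np.array([3.0, 1.0, 0.0])
v = np.array([-1.0, 0.0, 1.0])

# Subtract from u its projection onto v: u' = u - (u.v / v.v) v
u_orth = u - (u @ v) / (v @ v) * v

print(u_orth @ v)  # prints 0.0: the pair (u_orth, v) is orthogonal
```

Here ##\vec u\cdot\vec v = -3## and ##\vec v\cdot\vec v = 2##, so the orthogonalised vector is ##(3/2, 1, 3/2)^T##.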
 
So choosing the eigenvectors to be orthogonal is just a matter of preference. Thanks. Any thoughts on why I can't calculate the eigenvectors by back substitution, but it can be done by Gaussian elimination?
 
If I apply a general vector ##(a, b, c)^T## to the eigenvalue equation with eigenvalue 1, I end up with 3 identical equations: ##a - 3b + c = 0##. How do I then proceed to end up with the given answer, which is equivalent to ##(3x - y, x, y)^T##?
 
The equation ##a - 3b + c = 0## can be written as ##a = 3b - c##, which just says that for any eigenvector with eigenvalue 1 whose 2nd and 3rd components are b and c, the first component is 3b - c.

Relabel a,b,c as x,y,z and you have the given answer.
 
dyn said:
If I apply a general vector ##(a, b, c)^T## to the eigenvalue equation with eigenvalue 1, I end up with 3 identical equations: ##a - 3b + c = 0##. How do I then proceed to end up with the given answer, which is equivalent to ##(3x - y, x, y)^T##?
Elaborating on what andrewkirk said, relabel the equation above as x - 3y + z = 0.

Then
x = 3y - z
y = y
z = z
If you look at the right sides as a sum of two vectors, you get
##\begin{bmatrix} x \\ y \\ z \end{bmatrix} = y\begin{bmatrix} 3 \\ 1 \\ 0 \end{bmatrix} + z\begin{bmatrix} -1 \\ 0 \\ 1\end{bmatrix}##

Here y and z on the right side can be considered arbitrary constants.
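This decomposition can be verified numerically (a minimal NumPy sketch, using the matrix from the opening post): any choice of the arbitrary constants y and z should give an eigenvector with eigenvalue 1.

```python
import numpy as np

M = np.array([[2.0, -3.0, 1.0],
              [1.0, -2.0, 1.0],
              [1.0, -3.0, 2.0]])

# Arbitrary constants y and z pick a vector in the eigenspace
y, z = 2.0, -1.0
vec = y * np.array([3.0, 1.0, 0.0]) + z * np.array([-1.0, 0.0, 1.0])

# M vec should equal 1 * vec, confirming eigenvalue 1
print(np.allclose(M @ vec, vec))  # True
```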
 
Thanks for your replies. So essentially, because I end up with 3 equations that are the same, I really have just one equation with 3 unknowns. So I take 2 of those unknowns to have arbitrary values and then express the remaining unknown in terms of the 2 arbitrary values.
 
dyn said:
Thanks for your replies. So essentially, because I end up with 3 equations that are the same, I really have just one equation with 3 unknowns. So I take 2 of those unknowns to have arbitrary values and then express the remaining unknown in terms of the 2 arbitrary values.
Yes. In the work I showed, you can take y = 1 and z = 0, and get one solution, and you can take y = 0, z = 1, to get another solution. Since y and z are completely arbitrary, you get a double infinity of solutions.

Geometrically, the two vectors I showed span a plane in ##\mathbb{R}^3##. Every point in this plane is some linear combination of those two vectors.
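The geometric picture can also be sketched numerically: the eigenspace is the plane ##x - 3y + z = 0##, whose normal is ##(1, -3, 1)^T##, and both spanning vectors should be orthogonal to that normal (a minimal NumPy check):

```python
import numpy as np

# The plane x - 3y + z = 0 has normal n = (1, -3, 1);
# both spanning vectors lie in that plane
n = np.array([1.0, -3.0, 1.0])
v1 = np.array([3.0, 1.0, 0.0])
v2 = np.array([-1.0, 0.0, 1.0])

print(n @ v1, n @ v2)  # prints 0.0 0.0
```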
 
