Eigenvectors of a 4x4 Matrix in Mathematica

AI Thread Summary
The discussion centers on calculating the eigenvectors of a 4x4 matrix in Mathematica without substituting in the explicit eigenvalues. The user initially attempts to use RowReduce on the matrix ##H_F## minus a symbolic multiple of the identity, but finds the output unhelpful. They explore the idea of calculating the inverse of ##(H_F - \lambda I)## but realize it complicates the process due to the polynomials in ##\lambda##. A suggestion is made to rewrite the matrix in a new basis to simplify the problem, and the conversation highlights the complexity of eigenvalue calculations, especially when considering different values for parameters like ##A_0##. Ultimately, the discussion emphasizes the challenge of obtaining a clean expression for the eigenvectors as functions of the eigenvalue in Mathematica.
DeathbyGreen
Hi,

I'm trying to calculate the eigenvectors of a 4x4 matrix, but I don't want the actual eigenvalues included in the solution; I simply want them represented by a variable. For example, I have the matrix:

$$
H_F = \left[\begin{array}{cccc}
\hbar\Omega & \hbar v_f k_- & 0 & 0 \\
\hbar v_f k_+ & \hbar\Omega & \frac{v_f e}{c}A_0 & 0 \\
0 & \frac{v_f e}{c}A_0 & 0 & \hbar v_f k_- \\
0 & 0 & \hbar v_f k_+ & 0
\end{array}\right]
$$

My attempt at a solution was plugging ##H_F-\epsilon I## (with ##I## the identity matrix) into RowReduce. However, this only gives me the identity matrix, which is not the answer I'm looking for. If I just use ##H_F## and call Eigenvectors[H_F], then I get a huge, essentially useless mess of variables. I would like the output to list the eigenvector as a function of a variable which represents the eigenvalue. Is there any way to do this? The code I was using is (with some variable substitutions for easier entry):

Code:
RowReduce[{{h*w - l, h*v*x, 0, 0}, {h*v*y, h*w - l, m, 0}, {0, m, -l, h*v*x}, {0, 0, h*v*y, -l}}]
 
Last edited:
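For what it's worth, the identity output from RowReduce is expected: when ##\epsilon## is a generic symbol, ##\det(H_F-\epsilon I)## is a nonzero polynomial, so the matrix is generically invertible and row-reduces to the identity. A minimal sketch of this point in SymPy (Python is used here only for illustration; the same reasoning applies to Mathematica's RowReduce), using the shorthand ##a,b,c,d## for the four distinct entries that appears later in the thread:

```python
import sympy as sp

a, b, c, d, lam = sp.symbols('a b c d lam')

# Shorthand: a = hbar*Omega, b = hbar*v_f*k_-, c = hbar*v_f*k_+, d = v_f*e*A_0/c
M = sp.Matrix([[a, b, 0, 0],
               [c, a, d, 0],
               [0, d, 0, b],
               [0, 0, c, 0]])

# det(M - lam*I) is a nonzero polynomial in lam, so for a generic
# symbol lam the matrix is invertible and row reduction gives I.
charpoly = (M - lam * sp.eye(4)).det()
rref, _ = (M - lam * sp.eye(4)).rref()
print(sp.factor(charpoly))
print(rref.applyfunc(sp.simplify) == sp.eye(4))   # True
```

Row reduction only produces a nontrivial null space once ##\epsilon## is constrained to an actual root of the characteristic polynomial.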
I guess we can assume that ##H_F## is an isomorphism, at least it looks like one. So one possibility is to simply calculate ##H_F^{-1}##.

As ##0## isn't an eigenvalue, ##H_F\,x = \lambda x## results in ##x_1 \sim x_2\, , \,x_3 \sim x_4## and with this ##x_2 \sim x_4## and ##x_1 \sim x_3##. In total this means ##x_1 \sim x_2 \sim x_3 \sim x_4##. The proportions might be a bit complicated, but not impossible to calculate. The matrix has only ##4## different entries.
 
Thank you for the response. Could you explain how calculating the inverse would help?
 
DeathbyGreen said:
Thank you for the response. Could you explain how calculating the inverse would help?
Sorry, that was wrong and stupid. I thought inverting the equation would do the trick, but it doesn't. One would need the inverse of ##(H_F-\lambda I)## which is indeed unpleasant considering the polynomials in ##\lambda##. So, sorry for this. But the direct calculation doesn't seem to be too complicated. I wrote it as
$$
\begin{bmatrix}a&b&0&0\\c&a&d&0\\0&d&0&b\\0&0&c&0\end{bmatrix}\cdot \begin{bmatrix}x_1\\x_2\\x_3\\x_4\end{bmatrix} = \begin{bmatrix}\lambda x_1\\\lambda x_2\\\lambda x_3\\\lambda x_4\end{bmatrix}
$$
which was easy to solve, especially if we may assume all variables are nonzero and the ##a,b,c,d## share common factors.
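That system can also be handed to a computer algebra system directly. Here is a sketch in SymPy (Python, purely for illustration; an analogous Solve call should work in Mathematica) that fixes ##x_4 = 1## and solves three of the four rows of the eigenvalue equation, leaving the eigenvector as an explicit function of ##\lambda##; the unused row then encodes the characteristic equation that ##\lambda## must satisfy:

```python
import sympy as sp

a, b, c, d, lam = sp.symbols('a b c d lam')
x1, x2, x3 = sp.symbols('x1 x2 x3')

M = sp.Matrix([[a, b, 0, 0],
               [c, a, d, 0],
               [0, d, 0, b],
               [0, 0, c, 0]])

# Fix the last component to 1 and solve rows 1, 3, 4 of (M - lam*I)v = 0
# for the remaining components. Row 2 is not used: it reproduces the
# characteristic equation, which pins down the admissible values of lam.
v = sp.Matrix([x1, x2, x3, 1])
residual = (M - lam * sp.eye(4)) * v
sol = sp.solve([residual[0], residual[2], residual[3]],
               [x1, x2, x3], dict=True)[0]
eigvec = v.subs(sol)   # components are rational functions of lam
print(eigvec.T)
```

Substituting any of the four actual eigenvalues for `lam` then yields the corresponding eigenvector. The normalization ##x_4 = 1## assumes ##x_4 \neq 0##, which holds generically here since ##\lambda = 0## is not a root of the characteristic polynomial.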
 
No worries! I appreciate you taking a look at it. Maybe it's best to just solve it by hand using the equation you posted. I was hoping there would be a simple way to plug it into Mathematica.
 
DeathbyGreen said:
No worries! I appreciate you taking a look at it. Maybe it's best to just solve it by hand using the equation you posted. I was hoping there would be a simple way to plug it into Mathematica.
I don't know Mathematica (anymore), so maybe you could do it in the notation with ##a,b,c,d##. It's at least shorter. But in this case, playing around with the program will probably take longer than the few equations will. The last row gets rid of ##x_4## immediately, so there are only three variables left. The same goes for the first row.
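Spelling that hint out (a quick check of the algebra, not from the original post): the fourth and first rows of the eigenvalue equation give, assuming ##\lambda \neq 0## and ##\lambda \neq a##,
$$
x_4 = \frac{c}{\lambda}\,x_3, \qquad x_1 = \frac{b}{\lambda - a}\,x_2,
$$
and substituting these into the second and third rows leaves a ##2\times 2## system in ##x_2, x_3##,
$$
\left(\frac{bc}{\lambda-a} + a - \lambda\right)x_2 + d\,x_3 = 0, \qquad d\,x_2 + \left(\frac{bc}{\lambda} - \lambda\right)x_3 = 0,
$$
whose solvability condition ##\left[(\lambda-a)^2 - bc\right]\left(\lambda^2 - bc\right) = d^2\,\lambda(\lambda-a)## is exactly the characteristic equation of the 4x4 matrix.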
 
DeathbyGreen said:
I would like the output to list the eigenvector as a function of a variable which represents the eigenvalue. Is there any way to do this? A code I was using is (with some variable substitutions for easier entry):
Strictly speaking, no, since there is a discrete set of eigenvectors corresponding to a discrete set of eigenvalues. Trying to write this as a function implies a continuous set of eigenvalues. But maybe I have misunderstood your question.
DeathbyGreen said:
If I just use ##H_F## and use Eigenvectors[H_F] then I get a huge, essentially useless mess of variables.
This is not surprising since the general element-wise expression for the eigenvectors and eigenvalues of a 4x4 matrix is very large.
 
Maybe I didn't explain it well enough. So I rewrite the 4x4 matrix in a new basis (formed from eigenvectors corresponding to degenerate eigenvalues), which gives a 2x2 matrix. I have solved for the eigenvalues of the 2x2 matrix. What I want to do is take those eigenvalues of the 2x2 and plug them into the 4x4 matrix eigenvector equation to get a two-state solution. I wanted to leave the eigenvalues represented as ##\epsilon## during the solution process to make the algebra easier, because even the 2x2 eigenvalues are pretty nasty.
 
DeathbyGreen said:
So I rewrite the 4x4 matrix in a new basis (formed from eigenvectors corresponding to degenerate eigenvalues) which is a 2x2 matrix.
Are you saying you found eigenvectors of a 2x2 matrix and constructed a 4x4 matrix from them?
 
No, I solved for the eigenvalues of the 2x2 and want to find the eigenvectors of the 4x4 with them:

1) Set ##A_0=0## in the 4x4 matrix and solve for the eigensystem.
2) Take two degenerate branches (degenerate at a k value of ##k=k_0=\frac{\Omega}{2v_F}##, with k in polar coordinates) and rewrite the 4x4 as a 2x2 with ##A_0\neq0##.
3) Solve for the eigenvalues of the 2x2.
4) Solve for the eigenvectors of the 4x4 matrix again by using the eigenvalues of the 2x2 with ##A_0\neq0##.
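For step 4, once the 2x2 eigenvalues are known numerically, one practical route is to substitute each value ##\epsilon## and extract a (near-)null vector of ##H_F - \epsilon I## from its SVD. A NumPy sketch with placeholder numbers (the values below merely stand in for the actual ##\hbar\Omega##, ##\hbar v_f k_\pm## and ##\frac{v_f e}{c}A_0## and are not from the original problem):

```python
import numpy as np

def eigvec_for(H, eps, tol=1e-8):
    """Return a unit vector v with (H - eps*I) v ~ 0, taken as the
    right singular vector for the smallest singular value."""
    A = H - eps * np.eye(H.shape[0])
    _, s, Vh = np.linalg.svd(A)
    if s[-1] > tol * s[0]:
        raise ValueError("eps is not close to an eigenvalue of H")
    return Vh[-1].conj()

# Placeholder numbers standing in for hbar*Omega, hbar*v_f*k_-, etc.
a, b, c, d = 1.0, 2.0, 3.0, 0.5
H = np.array([[a, b, 0, 0],
              [c, a, d, 0],
              [0, d, 0, b],
              [0, 0, c, 0]])

eps = np.linalg.eigvals(H)[0]            # stand-in for a 2x2 eigenvalue
v = eigvec_for(H.astype(complex), eps)
print(np.linalg.norm(H @ v - eps * v))   # numerically ~ 0
```

The smallest singular value being (numerically) zero is the signature that ##\epsilon## really is an eigenvalue of the 4x4 matrix; if it isn't small, the substituted value does not solve the 4x4 characteristic equation.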
 
Alright, but I don't think the eigenvalues will be the same for the cases ##A_{0}=0## and ##A_{0}\neq 0##. We know the eigenvalues can be determined by solving
$$\text{det}(H_{F}-\lambda I)=0$$
This determinant will be a lengthy expression, but it still depends on the value of ##A_{0}##, so I imagine the eigenvalues will also depend on ##A_{0}##.
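That dependence is easy to confirm symbolically: in the shorthand where ##d = \frac{v_f e}{c}A_0##, the characteristic polynomial of the 4x4 matrix explicitly contains ##d##. A SymPy sketch (Python, for illustration only):

```python
import sympy as sp

a, b, c, d, lam = sp.symbols('a b c d lam')
M = sp.Matrix([[a, b, 0, 0],
               [c, a, d, 0],
               [0, d, 0, b],
               [0, 0, c, 0]])

# det(H_F - lam*I); d = v_f*e*A_0/c multiplies a nonzero term,
# so the eigenvalues genuinely depend on A_0.
p = (M - lam * sp.eye(4)).det()
print(d in p.free_symbols)   # True
print(sp.factor(p))
```

Expanding the determinant by hand gives ##(\lambda^2-bc)\left[(\lambda-a)^2-bc\right] - d^2\lambda(\lambda-a)##, so the ##A_0=0## eigenvalues only survive at points where the ##d^2## term happens to vanish.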
 