Solving QM-like Problem (Shankar, Coupled Mass)

  • Context: Graduate
  • Thread starter: Astrum
  • Tags: Coupled Mass, Shankar
SUMMARY

This discussion focuses on solving a coupled-mass problem using the quantum-mechanics techniques developed in the mathematical introduction of Shankar's *Principles of Quantum Mechanics*. The equations derived include the state representation |x(t)⟩ and the matrix Ω, which is defined as $$\Omega = \begin{bmatrix} -\frac{2k}{m} & \frac{k}{m} \\ \frac{k}{m} & -\frac{2k}{m} \end{bmatrix}$$. Participants discuss the process of finding eigenvalues and eigenvectors of the matrix Ω, emphasizing the importance of the determinant equation $$\det(\Omega - \omega I) = 0$$. The conversation also highlights the projection of the initial conditions onto the orthogonal basis kets |I⟩ and |II⟩.

PREREQUISITES
  • Understanding of quantum mechanics concepts, specifically state vectors and inner products.
  • Familiarity with linear algebra, particularly eigenvalues and eigenvectors.
  • Knowledge of matrix operations and determinants.
  • Basic understanding of harmonic motion and coupled systems.
NEXT STEPS
  • Study the process of diagonalizing matrices, particularly in the context of quantum mechanics.
  • Learn about eigenvalue problems and their applications in physical systems.
  • Explore Shankar's "Principles of Quantum Mechanics" for deeper insights into the mathematical framework.
  • Practice solving coupled differential equations using matrix methods.
USEFUL FOR

Students and professionals in physics, particularly those focusing on quantum mechanics and linear algebra, as well as anyone interested in solving coupled mass problems using mathematical techniques.

Astrum
This is a coupled-mass problem solved with the techniques used in QM; it's from the mathematical introduction of Shankar.

This equation was obtained by using the solution ##x_I(t) = x_I(0)\cos(\omega_I t)## and plugging it into ##\left| x(t) \right\rangle = \left| I \right\rangle x_I(t) + \left| II \right\rangle x_{II}(t)##:

$$\left| x(t) \right\rangle = \left| I \right\rangle x_I(0)\cos(\omega_I t) + \left| II \right\rangle x_{II}(0)\cos(\omega_{II} t)$$

where ##\left| I \right\rangle## and ##\left| II \right\rangle## form an orthogonal basis. This turns into:

$$\left| x(t) \right\rangle = \left| I \right\rangle \langle I | x(0) \rangle \cos(\omega_I t) + \left| II \right\rangle \langle II | x(0) \rangle \cos(\omega_{II} t)$$

Where did these inner products come from?

Edit: ##\langle I | x(0) \rangle## is just the projection of ##x(0)## onto the ##\left| I \right\rangle## basis vector, right? So this is just rewriting the equation in terms of ##x(0)##?
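A quick numerical sanity check of this projection step, using NumPy. The initial displacement and the normal-mode kets below are illustrative choices on my part (the kets are the normalized normal modes this matrix turns out to have), not values taken from Shankar:

```python
import numpy as np

# Hypothetical initial displacement of the two masses (illustrative choice).
x0 = np.array([1.0, 0.0])

# Normalized normal-mode kets |I> and |II> for the symmetric coupled-mass
# problem; these follow from diagonalizing Omega later in the thread.
ket_I = np.array([1.0, 1.0]) / np.sqrt(2)
ket_II = np.array([1.0, -1.0]) / np.sqrt(2)

# <I|x(0)> and <II|x(0)> are just dot products in this real basis.
c_I = ket_I @ x0
c_II = ket_II @ x0

# Reassembling |I><I|x(0)> + |II><II|x(0)> recovers x(0) exactly, which is
# why the inner products appear: they are the expansion coefficients.
x0_rebuilt = c_I * ket_I + c_II * ket_II
print(np.allclose(x0_rebuilt, x0))  # True
```

So the inner products are just the components of ##x(0)## in the new basis.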

Since I think I've figured out my original question, I'd like to pose a new one. My linear algebra isn't very strong, and I'm having problems with the following.

##\left| \ddot{x}(t) \right \rangle = \Omega \left| x(t) \right \rangle##, $$\Omega = \begin{bmatrix} -\frac{2k}{m} & \frac{k}{m} \\ \frac{k}{m} & -\frac{2k}{m} \end{bmatrix} $$

We want to use the basis that diagonalizes ##\Omega##, so we need to find its eigenvectors.

##\Omega \left| I \right \rangle = - \omega ^2 \left| I \right \rangle ##

How does one go about finding the eigenvalues and eigenvectors? The general formula is ##\det(\Omega - \omega I)=0##.
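For a numerical cross-check of what that determinant condition produces here, a sketch with NumPy (##k = m = 1## is an illustrative choice; the symbolic result just carries the factor ##k/m##):

```python
import numpy as np

k, m = 1.0, 1.0  # illustrative values

Omega = np.array([[-2 * k / m,  k / m],
                  [ k / m,     -2 * k / m]])

# numpy solves det(Omega - lam*I) = 0 for us; in this problem each
# eigenvalue lam equals -omega^2, so omega = sqrt(-lam).
eigvals, eigvecs = np.linalg.eigh(Omega)  # eigh: Omega is symmetric
omegas = np.sqrt(-eigvals)

print(eigvals)  # [-3. -1.]  (times k/m in general)
print(omegas)   # sqrt(3k/m) and sqrt(k/m)
```

The two normal-mode frequencies are ##\omega_I = \sqrt{k/m}## and ##\omega_{II} = \sqrt{3k/m}##.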
 
Yeah, basically what Shankar does is represent the state of the system by |x(t)> and then decompose it onto the basis {|I>, |II>}, where he defines x1(t) := <I|x(t)> and x2(t) := <II|x(t)>. This is basically just the two-dimensional version of inserting the identity operator I = Ʃi|i><i|, where the |i>'s are the orthonormal basis kets. (Shankar equation 1.6.7, probably one of the most important you'll ever learn.) In that example you only have a two-dimensional space, so {|i>} = {|I>, |II>}.
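The identity-resolution trick is easy to verify numerically. A minimal sketch, using the two normal-mode kets as the orthonormal basis (any orthonormal basis of R^2 would do):

```python
import numpy as np

# An orthonormal basis of R^2: the normal-mode kets |I> and |II>.
ket_I = np.array([1.0, 1.0]) / np.sqrt(2)
ket_II = np.array([1.0, -1.0]) / np.sqrt(2)

# Summing the outer products |i><i| over an orthonormal basis
# reproduces the identity operator: I = sum_i |i><i|.
identity = np.outer(ket_I, ket_I) + np.outer(ket_II, ket_II)
print(np.allclose(identity, np.eye(2)))  # True
```

Sandwiching this identity between |x(t)> and nothing, i.e. |x(t)> = Ʃi|i><i|x(t)>, is exactly where the inner products in the first post come from.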
 
Jolb said:
Yeah, basically what Shankar does is represent the state of the system by |x(t)> and then decompose it onto the basis {|I>, |II>}, where he defines x1(t) := <I|x(t)> and x2(t) := <II|x(t)>. This is basically just the two-dimensional version of inserting the identity operator I = Ʃi|i><i|, where the |i>'s are the orthonormal basis kets. (Shankar equation 1.6.7, probably one of the most important you'll ever learn.) In that example you only have a two-dimensional space, so {|i>} = {|I>, |II>}.

Yes, ##\left| I \right\rangle \langle I | v \rangle## is the projection of ##v## onto ##\left| I \right\rangle##: a scalar, ##\langle I | v \rangle##, times the ##\left| I \right\rangle## basis ket.
 
Astrum said:
This is a coupled-mass problem solved with the techniques used in QM; it's from the mathematical introduction of Shankar.

This equation was obtained by using the solution ##x_I(t) = x_I(0)\cos(\omega_I t)## and plugging it into ##\left| x(t) \right\rangle = \left| I \right\rangle x_I(t) + \left| II \right\rangle x_{II}(t)##:

$$\left| x(t) \right\rangle = \left| I \right\rangle x_I(0)\cos(\omega_I t) + \left| II \right\rangle x_{II}(0)\cos(\omega_{II} t)$$

where ##\left| I \right\rangle## and ##\left| II \right\rangle## form an orthogonal basis. This turns into:

$$\left| x(t) \right\rangle = \left| I \right\rangle \langle I | x(0) \rangle \cos(\omega_I t) + \left| II \right\rangle \langle II | x(0) \rangle \cos(\omega_{II} t)$$

Where did these inner products come from?

Edit: ##\langle I | x(0) \rangle## is just the projection of ##x(0)## onto the ##\left| I \right\rangle## basis vector, right? So this is just rewriting the equation in terms of ##x(0)##?

Since I think I've figured out my original question, I'd like to pose a new one. My linear algebra isn't very strong, and I'm having problems with the following.

##\left| \ddot{x}(t) \right\rangle = \Omega \left| x(t) \right\rangle##, $$\Omega = \begin{bmatrix} -\frac{2k}{m} & \frac{k}{m} \\ \frac{k}{m} & -\frac{2k}{m} \end{bmatrix}$$

We want to use the basis that diagonalizes ##\Omega##, so we need to find its eigenvectors.

##\Omega \left| I \right\rangle = -\omega^2 \left| I \right\rangle##

How does one go about finding the eigenvalues and eigenvectors? The general formula is ##\det(\Omega - \omega I)##.
Well, the general formula is ##\det(\Omega - \omega I) = 0##.
The more straightforward way to attack this problem is to simply write down the definition of an eigenvector and then solve for its elements. [Once you have solved for the elements, plug each back in to find the corresponding eigenvalues.]

So for your 2x2 matrix ##\Omega##, you should set up the matrix equation
$$\Omega\begin{pmatrix}a \\ b\end{pmatrix}=\omega\begin{pmatrix}a \\ b\end{pmatrix}$$ for a constant ##\omega##. This is the definition of an eigenvector of ##\Omega##. [##\omega## is called the "eigenvalue", and in general there is one eigenvalue for each eigenvector.] If you plug in your matrix for ##\Omega## and do the matrix multiplication, you'll get two equations for ##a## and ##b##. Solve this system of equations and you should be able to find the eigenvectors. Solving this system of equations is equivalent to solving the determinant equation you mentioned earlier, but if you don't understand determinants very well, it is much more intuitive to do it in this straightforward manner.
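To see how the component equations and the determinant condition fit together, here is a small numerical sketch (##k = m = 1## is an illustrative choice, and I use the eigenvalue convention ##\lambda = -\omega^2## from earlier in the thread):

```python
import numpy as np

k, m = 1.0, 1.0  # illustrative values
Omega = np.array([[-2 * k / m,  k / m],
                  [ k / m,     -2 * k / m]])

# Writing Omega (a, b)^T = lam (a, b)^T gives two equations in a and b.
# A nontrivial (a, b) exists only when det(Omega - lam*I) = 0, which is
# exactly the determinant condition. The two eigenvalues turn out to be
# lam = -k/m and lam = -3k/m.
for lam in (-k / m, -3 * k / m):
    det = np.linalg.det(Omega - lam * np.eye(2))
    # First row: (-2k/m - lam) a + (k/m) b = 0  =>  b/a = (2k/m + lam)/(k/m)
    ratio = (2 * k / m + lam) / (k / m)
    print(lam, round(det, 12), ratio)
```

The ratio ##b/a## comes out ##+1## for ##\lambda = -k/m## and ##-1## for ##\lambda = -3k/m##, so the eigenvectors are proportional to ##(1, 1)## and ##(1, -1)##.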
 
Jolb said:
Well, the general formula is ##\det(\Omega - \omega I) = 0##.
The more straightforward way to attack this problem is to simply write down the definition of an eigenvector and then solve for its elements. [Once you have solved for the elements, plug each back in to find the corresponding eigenvalues.]

So for your 2x2 matrix ##\Omega##, you should set up the matrix equation
$$\Omega\begin{pmatrix}a \\ b\end{pmatrix}=\omega\begin{pmatrix}a \\ b\end{pmatrix}$$ for a constant ##\omega##. This is the definition of an eigenvector of ##\Omega##. [##\omega## is called the "eigenvalue", and in general there is one eigenvalue for each eigenvector.] If you plug in your matrix for ##\Omega## and do the matrix multiplication, you'll get two equations for ##a## and ##b##. Solve this system of equations and you should be able to find the eigenvectors. Solving this system of equations is equivalent to solving the determinant equation you mentioned earlier, but if you don't understand determinants very well, it is much more intuitive to do it in this straightforward manner.

Doing the matrix multiplication:

$$\begin{bmatrix} (-2k/m)a + (k/m)b \\ (k/m)a - (2k/m)b \end{bmatrix} = -\omega^2 \begin{bmatrix} a \\ b \end{bmatrix}$$

This gives us two equations with three unknowns, and there is only one eigenvalue here. This doesn't make sense to me.
 
Alright, well I'll try and show you how it's done. To make my life easier, I'll do the equivalent problem of finding the eigenvalues and eigenvectors of the matrix $$\begin{bmatrix}-2 & 1 \\ 1 & -2\end{bmatrix}$$

I'll write the eigenvalue equation as
$$\begin{bmatrix}-2 & 1 \\ 1 & -2\end{bmatrix}\begin{pmatrix}a \\ b\end{pmatrix} = c \begin{pmatrix} a \\ b\end{pmatrix}.$$
The bottom line of the matrix equation gives ##a - 2b = cb##. Thus ##a = (c+2)b##. So we have shown that the vector ##\begin{pmatrix}(c+2)b \\ b\end{pmatrix}## should be an eigenvector. So we can plug this back into the eigenvalue equation:
$$c\begin{pmatrix}(c+2)b \\ b\end{pmatrix}=\begin{bmatrix}-2 & 1 \\ 1 & -2\end{bmatrix}\begin{pmatrix}(c+2)b \\ b\end{pmatrix} =\begin{pmatrix}-2cb-3b\\ cb \end{pmatrix}$$

Reading across the top, we see the equation ##c(c+2)b = -2cb - 3b##, or ##0 = c^2 + 4c + 3 = (c+3)(c+1)##, so the two eigenvalues are ##c = -3## and ##c = -1##. Now we plug these two values back into the eigenvalue equation to find the eigenvectors corresponding to each eigenvalue. For ##c = -3## we have:
$$\begin{bmatrix}-2 & 1 \\ 1 & -2\end{bmatrix}\begin{pmatrix}a \\ b\end{pmatrix} = -3 \begin{pmatrix} a \\ b\end{pmatrix}$$ which implies ##b = -a##. This means any vector of the form ##\begin{pmatrix}a \\ -a \end{pmatrix}## is an eigenvector with eigenvalue ##-3##. We want our states to be normalized though, so we pick ##a = \frac{1}{\sqrt{2}}##.

See if you can find the other eigenvector (the one corresponding to ##c = -1##) for yourself, and then see if you can use your answers to solve the original problem with the k's and m's.
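The worked example above can be cross-checked numerically in a few lines:

```python
import numpy as np

# The matrix from the worked example (the k/m factor scaled out).
A = np.array([[-2.0,  1.0],
              [ 1.0, -2.0]])

eigvals, eigvecs = np.linalg.eigh(A)  # symmetric matrix, real spectrum
print(eigvals)  # [-3. -1.]

# Columns of eigvecs are normalized eigenvectors; the c = -3 mode is
# proportional to (1, -1)/sqrt(2), matching b = -a above (up to sign).
v = eigvecs[:, 0]
print(np.allclose(A @ v, -3 * v))  # True
```

This confirms the eigenvalues ##-3## and ##-1## and the normalized ##c = -3## eigenvector found by hand.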
 