# Time Evolution and Hamiltonian Problem

1. Nov 17, 2009

### phil ess

1. The problem statement, all variables and given/known data

Consider a physical system with a three-dimensional state space. In this space the Hamiltonian is represented by the matrix:

$$H = \hbar\omega \left( \begin{array}{ccc} 0 & 0 & 2 \\ 0 & 1 & 0 \\ 2 & 0 & 0 \end{array} \right)$$

The state of the system at t = 0 in coordinate representation is:

$$s(t=0) = \left( \begin{array}{c} \sqrt{2} \\ 1 \\ 1 \end{array} \right)$$

Find the state s(t ≠ 0) by doing the following steps:

i) Find the eigenvectors and eigenvalues of the Hamiltonian.
ii) Expand the initial state in eigenstates of the Hamiltonian.
iii) Use your knowledge of the time evolution of the eigenstates to find the state of the system s(t).

3. The attempt at a solution

i) I can get the eigenvalues. Setting the characteristic polynomial (with H written in units of $\hbar\omega$) to zero:

$$\chi(\lambda) = \left| \begin{array}{ccc} -\lambda & 0 & 2 \\ 0 & 1-\lambda & 0 \\ 2 & 0 & -\lambda \end{array} \right| = 0$$

which gives:

$$\lambda = 2,-2,1$$

and then the eigenvectors are:

$$\left( \begin{array}{c} 1 \\ 0 \\ 1 \end{array} \right) \qquad \left( \begin{array}{c} 1 \\ 0 \\ -1 \end{array} \right) \qquad \left( \begin{array}{c} 0 \\ 1 \\ 0 \end{array} \right)$$

Is this right?

ii) Now I'm not sure exactly how to expand the initial state in eigenstates of the Hamiltonian.

Any hints are appreciated!!
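(As a quick numerical sanity check of step i — a sketch in numpy, not part of the original thread; H is written in units of $\hbar\omega$:)

```python
import numpy as np

# Hamiltonian in units of hbar*omega, from the problem statement
H = np.array([[0, 0, 2],
              [0, 1, 0],
              [2, 0, 0]], dtype=float)

# eigh handles real symmetric (Hermitian) matrices; eigenvalues come back sorted
eigvals, eigvecs = np.linalg.eigh(H)
print(eigvals)  # -> [-2.  1.  2.]

# Check each column is really an eigenvector: H v should equal lambda v
for i in range(3):
    assert np.allclose(H @ eigvecs[:, i], eigvals[i] * eigvecs[:, i])
```

This confirms the eigenvalues 2, −2, 1 (in units of $\hbar\omega$) and that the vectors above are eigenvectors, up to normalization and sign.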

2. Nov 17, 2009

### turin

Basically, yes. But anyway, you really should know how to check for yourself. Just apply H to each of these vectors. If the result is proportional to the original, then it is an eigenvector. And, the proportionality factor is the eigenvalue. (There is one slight issue: eigenvectors are typically defined to be unit-normalized.)

Use the fact that the eigenvectors form a complete 3-D basis. (Well, first you should verify that this is true.) So, you can construct an identity matrix out of projection matrices, which you can in turn construct from the eigenvectors. (This is where the unit-normalization comes in handy.) Then, Iv=v, where I is the identity matrix and v is any vector. Are you familiar with bra-ket notation?
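(The expansion turin suggests can be sketched numerically — a numpy sketch assuming the normalized eigenvectors, not code from the thread; the coefficients are just dot products with the initial state:)

```python
import numpy as np

# Normalized eigenvectors of H, with eigenvalues +2, -2, +1 (units of hbar*omega)
v_p2 = np.array([1, 0, 1]) / np.sqrt(2)
v_m2 = np.array([1, 0, -1]) / np.sqrt(2)
v_p1 = np.array([0, 1, 0.0])

# Initial state from the problem statement
s0 = np.array([np.sqrt(2), 1, 1.0])

# Expansion coefficients c_n = v_n . s0 (real vectors, so no conjugation needed)
coeffs = [v @ s0 for v in (v_p2, v_m2, v_p1)]
print(coeffs)

# Completeness: summing c_n * v_n reconstructs the initial state exactly
reconstructed = sum(c * v for c, v in zip(coeffs, (v_p2, v_m2, v_p1)))
print(np.allclose(reconstructed, s0))  # -> True
```

Here the coefficients come out to $1 + 1/\sqrt{2}$, $1 - 1/\sqrt{2}$, and $1$, and summing the weighted eigenvectors gives back s(t=0), which verifies the basis is complete.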

3. Nov 18, 2009

### phil ess

Right ok so first I normalize the eigenvectors:

$$\left( \begin{array}{c} 1/\sqrt{2} \\ 0 \\ 1/\sqrt{2} \end{array} \right) \qquad \left( \begin{array}{c} 1/\sqrt{2} \\ 0 \\ -1/\sqrt{2} \end{array} \right) \qquad \left( \begin{array}{c} 0 \\ 1 \\ 0 \end{array} \right)$$

And now I want to construct projection matrices from these? As far as I know, a projection matrix has to be Hermitian and idempotent, right? Or does it only have to be Hermitian for an orthogonal projection matrix?

Well either way this is what I put together:

$$A = \left( \begin{array}{ccc} 1/\sqrt{2} & 0 & 1/\sqrt{2} \\ 0 & 1 & 0 \\ 1/\sqrt{2} & 0 & -1/\sqrt{2} \end{array} \right)$$

Which is Hermitian, but it certainly isn't idempotent, since $A^2 = I$.

Obviously I'm missing something important! Thanks for the help so far!

4. Nov 18, 2009

### turin

Use the fact that the eigenvectors are orthonormal (and again, you should verify this if you have not done so already). I'll call them v+1, v+2, and v-2. Then, for example:

v+1 ( v+1 · v+1 ) = v+1 (1) = v+1

v+1 ( v+1 · v+2 ) = v+1 (0) = 0

v+1 ( v+1 · v-2 ) = v+1 (0) = 0

So, v+1 ( v+1 · v ) projects v onto the v+1 "direction". You can write this as a 3x3 matrix acting on v. That matrix is the projection matrix for the eigenvector v+1. You can find two more matrices for the other two eigenvectors in the same way. (The reason that I asked you about bra-ket notation is that it has a very simple notational implementation: e.g. |+1><+1| is the projector for |+1>.) So, you should find 3 projection matrices. The sum of these three projection matrices will actually be the 3x3 identity matrix! This is a very important concept.
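(The |+1><+1| construction above is just an outer product. A numpy sketch, assuming the normalized eigenvectors from earlier in the thread:)

```python
import numpy as np

# Normalized eigenvectors (eigenvalues +2, -2, +1 in units of hbar*omega)
v_p2 = np.array([1, 0, 1]) / np.sqrt(2)
v_m2 = np.array([1, 0, -1]) / np.sqrt(2)
v_p1 = np.array([0, 1, 0.0])

# |v><v| as an outer product: a 3x3 matrix that projects onto v
P_p2 = np.outer(v_p2, v_p2)
P_m2 = np.outer(v_m2, v_m2)
P_p1 = np.outer(v_p1, v_p1)

# Each projector is Hermitian and idempotent: P @ P == P
print(np.allclose(P_p2 @ P_p2, P_p2))  # -> True

# Completeness: the three projectors sum to the 3x3 identity
print(np.allclose(P_p2 + P_m2 + P_p1, np.eye(3)))  # -> True
```

Note the difference from the matrix A in the earlier post: A had the eigenvectors as rows/columns of a single matrix, while each projector here is built from one eigenvector at a time, and only their *sum* is the identity.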

BTW, please let me know if the notation is difficult to read; if so, I will switch to LaTeX.

5. Nov 19, 2009

### phil ess

I understand what you're saying when you say "v+1 ( v+1 · v ) projects v onto the v+1 direction", but I don't know how to write this as a 3x3 matrix.

The way Ive been taught to make a projection matrix is using:

Proj_V x = A(AᵀA)⁻¹Aᵀ x

Where A is a matrix constructed from basis vectors of V, and the eigenvectors I have form a basis for ℝ³.

But when I do this I just get the identity matrix back, which obviously isn't right. Could you explain how you construct the projection matrix your way? Thanks for all the help so far; I'm still struggling with this.

EDIT: "The sum of these three projections matrices will actually be the 3x3 identity matrix! This is a very important concept"

Oh, is this why I got the identity matrix when I used the formula above?
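(That is exactly what happens: with all three eigenvectors as columns, col(A) = ℝ³ and the textbook formula projects onto the whole space. A numpy sketch, not from the thread:)

```python
import numpy as np

# A whose columns are the three orthonormal eigenvectors -> they span all of R^3
A = np.column_stack([
    np.array([1, 0, 1]) / np.sqrt(2),
    np.array([1, 0, -1]) / np.sqrt(2),
    np.array([0, 1, 0.0]),
])

# Textbook projection onto col(A); since col(A) = R^3, this is the identity
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(P, np.eye(3)))  # -> True

# Projecting onto a SINGLE eigenvector uses only that one column of A
a = A[:, [0]]  # first eigenvector, kept as a 3x1 matrix
P1 = a @ np.linalg.inv(a.T @ a) @ a.T

# ...and that reduces to the outer product |v><v| turin described
print(np.allclose(P1, np.outer(A[:, 0], A[:, 0])))  # -> True
```

So the same formula gives the individual projectors, as long as A contains only the eigenvector(s) you are projecting onto.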

6. Nov 19, 2009

### turin

Write those relations that I gave to you in component form. Hint: the dot product can be written in component form as:

v · w = ∑ⱼ vⱼwⱼ

and the action of a matrix on a vector can be written in component form as:

(Av)ᵢ = ∑ⱼ Aᵢⱼvⱼ

So, the row vector is like a row of a matrix ...

Possibly you only need to recognize that V = ℝ³ in order to resolve your confusion. (Just in case your browser doesn't support that symbol, that is V = R^3.) I'm not sure what exactly you are projecting onto there (V is some subspace?), or how you are constructing A "from the basis vectors of V". But, anyway, the method that I suggest is a much simpler concept, once you can figure it out.

Probably. It isn't clear to me what you've done so far.
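(The thread stops before step iii, but the remaining step follows directly from the expansion: each eigenstate just picks up a phase $e^{-iE_n t/\hbar}$. A numpy sketch, working in units where $\hbar\omega = 1$ — an assumption made only for this sketch:)

```python
import numpy as np

# Normalized eigenvectors and their energies in units of hbar*omega
vs = [np.array([1, 0, 1]) / np.sqrt(2),   # E = +2
      np.array([1, 0, -1]) / np.sqrt(2),  # E = -2
      np.array([0, 1, 0.0])]              # E = +1
Es = [2.0, -2.0, 1.0]

s0 = np.array([np.sqrt(2), 1, 1.0])  # initial state

def s_t(t):
    # s(t) = sum_n c_n exp(-i E_n t / hbar) v_n, with c_n = v_n . s0
    return sum((v @ s0) * np.exp(-1j * E * t) * v for v, E in zip(vs, Es))

print(np.allclose(s_t(0), s0))  # -> True: at t = 0 we recover the initial state
```

Since the coefficients only acquire unit-modulus phases, the norm of s(t) is conserved for all t, which is a useful consistency check on the final answer.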