# Help with solving for <f|H|f>

1. Oct 8, 2014

### talabax

H =
| d  0 |
| 0  f |

f =
| e^-ia cos(gt/h)     |
| e^-ia (-i sin(gt/h)) |

I am stuck here. I have found the eigenvectors to be, for d,
| 1 |
| 0 |

and for f,
| 0 |
| 1 |

But now what do I do with the eigenvectors? Do I take a linear combination and then normalize?

Then do
f | 1 | * f
  | 1 |

Would the probability be 1/2 for each eigenvector if I normalized the superposition?

Help, I'm so lost!


Last edited: Oct 8, 2014
2. Oct 8, 2014

### Einj

I don't know if I'm understanding your question (you should try to type in LaTeX). Anyway, here's what I understood. You have a Hamiltonian:
$$H=\left(\begin{array}{cc} d & 0 \\ 0 & f \end{array}\right)$$

and a state:
$$|\psi\rangle=\left(\begin{array}{c} e^{-ia\cos(gt/\hbar)} \\ e^{-ia(-i\sin(gt/\hbar))} \end{array} \right) =e^{-ia\cos(gt/\hbar)}|d\rangle+e^{-a\sin(gt/\hbar)}|f\rangle$$
where with $|d\rangle$ and $|f\rangle$ I mean the two eigenstates of the Hamiltonian having eigenvalues d and f. First of all, you have to normalize your state. You have:
$$\langle\psi|\psi\rangle=1+e^{-2a\sin(gt/\hbar)},$$
and so, the normalized state is:
$$|\psi\rangle=\frac{e^{-ia\cos(gt/\hbar)}|d\rangle+e^{-a\sin(gt/\hbar)}|f\rangle}{\sqrt{1+e^{-2a\sin(gt/\hbar)}}}.$$
Then, by definition of eigenvectors:
$$H|\psi\rangle=\frac{de^{-ia\cos(gt/\hbar)}|d\rangle+fe^{-a\sin(gt/\hbar)}|f\rangle}{\sqrt{1+e^{-2a\sin(gt/\hbar)}}},$$
so that:
$$\langle\psi|H|\psi\rangle=\frac{d+fe^{-2a\sin(gt/\hbar)}}{1+e^{-2a\sin(gt/\hbar)}}.$$
Is this what you were looking for?
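As a quick numerical sanity check of the algebra above (not part of the original thread): a short Python sketch, assuming numpy and picking arbitrary sample values for d, f, a, g, t and ħ, that compares the closed-form result with a direct matrix evaluation of ⟨ψ|H|ψ⟩.

```python
import numpy as np

# Arbitrary sample values (assumptions, just for the check)
d, f_val = 2.0, 5.0
a, g, t, hbar = 0.7, 1.3, 0.4, 1.0

H = np.diag([d, f_val]).astype(complex)

# State as read above: components e^{-ia cos(gt/hbar)} and e^{-a sin(gt/hbar)}
s = g * t / hbar
psi = np.array([np.exp(-1j * a * np.cos(s)), np.exp(-a * np.sin(s))])
psi = psi / np.linalg.norm(psi)          # normalize first

direct = np.vdot(psi, H @ psi).real      # <psi|H|psi> by direct matrix algebra

# Closed-form expression derived above
w = np.exp(-2 * a * np.sin(s))
closed = (d + f_val * w) / (1 + w)

print(abs(direct - closed))              # ~0: the two agree
```

Note that `np.vdot` conjugates its first argument, which is exactly the bra ⟨ψ|.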

4. Oct 8, 2014

### talabax

$$H = \begin{pmatrix} d & 0 \\ 0 & f \end{pmatrix} \qquad \psi = \begin{pmatrix} e^{-ia}\cos(gt/\hbar) \\ -ie^{-ia}\sin(gt/\hbar) \end{pmatrix}$$

5. Oct 8, 2014

### Einj

ok, does my solution work for you?

6. Oct 8, 2014

### talabax

I think so; it's the only help I can find at the moment. Do you have any links to help me solve these problems?

7. Oct 8, 2014

### Einj

Well, not really. This is simply how QM works; it's mostly linear algebra.

8. Oct 8, 2014

### Simon Bridge

The solution is arrived at by expanding the state vector in terms of the eigenvectors of the Hamiltonian, then applying the definition of "expectation value". We would not normally just do the problem for you, even though this is not a homework forum: you learn best by doing.

It ends up being a matter of just writing the expressions out one after the other and then reading off the result.
$\renewcommand{\ket}[1]{\left| #1 \right\rangle} \renewcommand{\bra}[1]{\left\langle #1 \right|} \renewcommand{\bket}[1]{\left\langle #1 \right\rangle}$

i.e. if |a> and |b> are eigenvectors of operator X, then X|a>=a|a> and X|b>=b|b>, and we can write the operator X in terms of the eigenvectors like this: $$X=\begin{pmatrix} a & 0\\ 0 & b\end{pmatrix}$$ ... Using that basis, you can see that $$\ket{a}=\begin{pmatrix}1\\0\end{pmatrix}\quad \ket{b}=\begin{pmatrix}0\\1\end{pmatrix}$$ ... are orthonormal.

An arbitrary state can be expanded as a sum over these basis kets.

Thus $\ket\psi = \psi_a\ket a + \psi_b \ket b = \begin{pmatrix} \psi_a\\ \psi_b \end{pmatrix}$

That way $\ket{X\psi} = X\ket\psi = a\psi_a\ket a + b\psi_b \ket b$

And the expectation value follows from the definition: $$\bket{X} = \bra{\psi}X\ket{\psi}$$ ... but this assumes that $\ket\psi$ is normalized already.
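The recipe above (expand over the eigenbasis, then apply the definition) can be checked numerically; a minimal Python sketch, assuming numpy, with sample eigenvalues and state coefficients that are not from the thread:

```python
import numpy as np

# X written in its own eigenbasis: eigenvalues a_val and b_val (sample values)
a_val, b_val = 1.5, -0.5
X = np.diag([a_val, b_val]).astype(complex)

# Arbitrary state |psi> = psi_a|a> + psi_b|b>, normalized first
psi = np.array([0.6 + 0.2j, 0.3 - 0.7j])
psi = psi / np.linalg.norm(psi)

# Definition: <X> = <psi|X|psi>
expectation = np.vdot(psi, X @ psi).real

# Eigenbasis expansion: <X> = a|psi_a|^2 + b|psi_b|^2
expanded = a_val * abs(psi[0])**2 + b_val * abs(psi[1])**2

print(abs(expectation - expanded))  # ~0: same number either way
```

The two expressions agree because |a⟩ and |b⟩ are orthonormal, so the cross terms ⟨a|X|b⟩ vanish.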

Last edited: Oct 8, 2014
9. Oct 8, 2014

### talabax

That makes tons of sense now! For a∣a⟩, is the first a the eigenvalue from X?

10. Oct 8, 2014

### Simon Bridge

By definition $X\ket a = a\ket a$ ... that is what "|a> is an eigenvector of X" means.
Since I made that definition at the start, I have to keep it right to the end. So:
$$X\big(\psi_a\ket a\big) = \psi_a X\ket a = \psi_a a \ket a = a\psi_a\ket a$$
I can do this because $\psi_a$ is not a vector (notice it is not in a ket?) and X can only operate on vectors.

Last edited: Oct 8, 2014
11. Oct 8, 2014

### talabax

ok, so if I had \begin{pmatrix} 3 & 0\\ 0 & 2 \end{pmatrix}

I would have 3ψa∣a⟩ + 2ψb∣b⟩
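This 2x2 example can be verified in a couple of lines; a sketch in Python with numpy, using made-up coefficients for ψa and ψb:

```python
import numpy as np

X = np.diag([3.0, 2.0])

psi_a, psi_b = 0.8, 0.6          # made-up coefficients
ket_a = np.array([1.0, 0.0])     # |a>
ket_b = np.array([0.0, 1.0])     # |b>

psi = psi_a * ket_a + psi_b * ket_b

lhs = X @ psi                                # apply X to the state
rhs = 3 * psi_a * ket_a + 2 * psi_b * ket_b  # eigenvalue-weighted expansion

print(np.allclose(lhs, rhs))  # True
```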

12. Oct 9, 2014

### Simon Bridge

That's the one :)
It works the same as the linear algebra you may remember.
All that stuff about vector spaces and basis sets.