# Compute e^(itH)

1. Sep 7, 2014

### bowlbase

1. The problem statement, all variables and given/known data
Compute $e^{itH}$ (using Wolfram is an option), where H is:

$H = \begin{pmatrix} 0 & -1 & 0 & 0 & 0& 0\\ -1 & 0 & -1 & 0 & 0& 0\\ 0 & -1 & 0 & -1 & 0& 0\\ 0 & 0 & -1 & 0 & -1& 0\\ 0 & 0 & 0 & -1 & 0& -1\\ 0 & 0 & 0 & 0 & -1& 0\\ \end{pmatrix}$

2. Relevant equations

3. The attempt at a solution

I tried putting this into Wolfram but I couldn't get it to work. So I then tried to get the eigenvector matrix and its transpose to do as suggested from this page. But when I did $S^{-1}HS$ I got another filthy matrix that I couldn't extract any information from.

All of the following steps in this problem seem easy enough; I just can't figure this part out.

Thanks for any help.

2. Sep 8, 2014

### Matterwave

If you get the eigenvectors of H and do $S^{-1}HS$, the resulting matrix should be diagonal and consist of its eigenvalues. So in fact all you have to do to get $e^{itH}$ after that is to take the exponential of each of the eigenvalues (times $it$), and then multiply by S again to rotate it back to the original basis.
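[Editor's note: the recipe above can be sketched numerically. This NumPy sketch is not from the thread; the time value and the series cross-check are illustrative choices.]

```python
import numpy as np

# Hamiltonian of the six-site chain from the problem statement:
# -1 on the first super- and sub-diagonal, zero elsewhere.
H = -(np.eye(6, k=1) + np.eye(6, k=-1))

t = 0.7  # arbitrary time, chosen only for this check

# H is real symmetric, so eigh returns real eigenvalues and an
# orthogonal matrix S whose columns are the eigenvectors.
lam, S = np.linalg.eigh(H)

# Diagonalize, exponentiate the eigenvalues, rotate back:
# e^{itH} = S e^{it Lambda} S^T
U = S @ np.diag(np.exp(1j * t * lam)) @ S.T

# Cross-check against a truncated Taylor series of e^{itH}
series = np.zeros((6, 6), dtype=complex)
term = np.eye(6, dtype=complex)
for n in range(30):
    series += term
    term = term @ (1j * t * H) / (n + 1)

assert np.allclose(U, series)
```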

3. Sep 8, 2014

### Staff: Mentor

Where did you get this problem? Is it math or physics related?

4. Sep 8, 2014

### ShayanJ

The link lacks a colon; just remove the http// part!
Anyway,
a good point you can use is that the trace of a matrix is independent of the basis you write the matrix in. The trace of your matrix is zero, so it must be zero in any other basis too, including the one that makes the matrix diagonal. And because the diagonal elements are then the eigenvalues and their order doesn't matter, the diagonal form of H should be $\begin{pmatrix} a & 0 & 0 & 0 & 0 & 0\\ 0 & -a & 0 & 0 & 0 & 0\\ 0 & 0 & b & 0 & 0 & 0\\ 0 & 0 & 0 & -b & 0 & 0\\ 0 & 0 & 0 & 0 & c & 0\\ 0 & 0 & 0 & 0 & 0 & -c \end{pmatrix}$.
So you only need to find three eigenvalues with distinct absolute values; then you can write down the diagonal form of H and simply exponentiate it.
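[Editor's note: the $\pm$ pairing of the eigenvalues is easy to confirm numerically. A NumPy sketch, not from the thread:]

```python
import numpy as np

# Six-site chain Hamiltonian: -1 on the first off-diagonals
H = -(np.eye(6, k=1) + np.eye(6, k=-1))

lam = np.sort(np.linalg.eigvalsh(H))

# The spectrum is symmetric about zero: {±a, ±b, ±c}
assert np.allclose(lam, -lam[::-1])

# Three distinct absolute values, matching the 1.802, 1.247,
# 0.445 quoted later in the thread
print(np.unique(np.round(np.abs(lam), 3)))  # -> [0.445 1.247 1.802]
```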

5. Sep 8, 2014

### bowlbase

This is for a physics problem, but I suppose I could have put this part in math as well. It is the Hamiltonian for an electron on a six-site chain.

I've gotten eigenvalues from online calculators; actually doing all of this by hand seems like a huge time sink. I don't get exact values from the calculators, and I think that may be why I wasn't getting a diagonal matrix before (all of the off-diagonal values were pretty close to zero).

This is what I have then:

The eigenvector matrix:
$S= \begin{pmatrix} -0.232 & 0.232 & 0.521 & 0.521 & -0.418 & 0.418\\ 0.418 & 0.418 & -0.232 & 0.232 & 0.521 & 0.521\\ -0.521 & 0.521 & -0.418 & -0.418 & -0.232 & 0.232\\ 0.521 & 0.521 & 0.418 & -0.418 & -0.232 & -0.232\\ -0.418 & 0.418 & 0.232 & 0.232 & 0.521 & -0.522\\ 0.232 & 0.232 & -0.521 & 0.521 & -0.418 & -0.418\\ \end{pmatrix}$

Multiplied against:

$H_{diagonal} = \begin{pmatrix} e^{-i1.802t} & 0 & 0 & 0 & 0& 0\\ 0 & e^{i1.802t} & 0 & 0 & 0& 0\\ 0 & 0 & e^{-i0.445t} & 0 & 0& 0\\ 0 & 0 & 0 & e^{i0.445t} & 0& 0\\ 0 & 0 & 0 & 0 & e^{-i1.247t}& 0\\ 0 & 0 & 0 & 0 & 0& e^{i1.247t}\\ \end{pmatrix}$

then multiplied by the inverse of S:

$S^{-1}= \begin{pmatrix} -0.232 & 0.418 & -0.521 & 0.521 & -0.418 & 0.232\\ 0.232 & 0.418 & 0.521 & 0.521 & 0.418 & 0.232\\ 0.521 & -0.232 & -0.418 & 0.418 & 0.232 & -0.521\\ 0.521 & 0.232 & -0.418 & -0.418 & 0.232 & 0.521\\ -0.418 & 0.521 & -0.232 & -0.232 & 0.521 & -0.418\\ 0.418 & 0.521 & 0.232 & -0.232 & -0.521 & -0.418\\ \end{pmatrix}$

And, after all that, I'm done?
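[Editor's note: the product $S\,e^{-i\Lambda t}\,S^{-1}$ can be sanity-checked without the rounded entries above, by building it from the exact eigendecomposition. A NumPy sketch, not from the thread; the time values are arbitrary:]

```python
import numpy as np

# Six-site chain Hamiltonian
H = -(np.eye(6, k=1) + np.eye(6, k=-1))
lam, S = np.linalg.eigh(H)

def U(t):
    """Time-evolution operator e^{-iHt} assembled as S e^{-i Lambda t} S^{-1}."""
    return S @ np.diag(np.exp(-1j * t * lam)) @ S.T

# Properties the assembled product must have:
assert np.allclose(U(0.0), np.eye(6))                    # U(0) = identity
assert np.allclose(U(0.3) @ U(0.3).conj().T, np.eye(6))  # unitary
assert np.allclose(U(0.3) @ U(0.5), U(0.8))              # group property
```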

6. Sep 8, 2014

### Matterwave

You should be done, if that was done correctly, yes. haha

7. Sep 8, 2014

### AlephZero

If you are trying to get an "exact" result rather than doing it numerically, the fact that $H^2$ is the diagonal matrix $(1,2,2,2,2,1)$ should simplify things quite a bit.

You could just write it out as an infinite series, group the odd and even powers together, and then sum the two series of diagonal matrices as in the first example in your link.

8. Sep 8, 2014

### bowlbase

So then I would just need to sum $\frac{(-it)^n}{n!}$ and $\frac{(-2it)^n}{n!}$ from $n = 0$ to infinity, giving me a diagonal matrix of just:
(I incorrectly wrote the exponential before; it should have been $e^{-iHt}$. I forgot the negative.)

$H_{diagonal} = \begin{pmatrix} e^{-it} & 0 & 0 & 0 & 0& 0\\ 0 & e^{-i2t} & 0 & 0 & 0& 0\\ 0 & 0 & e^{-i2t} & 0 & 0& 0\\ 0 & 0 & 0 & e^{-i2t} & 0& 0\\ 0 & 0 & 0 & 0 & e^{-i2t}& 0\\ 0 & 0 & 0 & 0 & 0& e^{-it}\\ \end{pmatrix}$

If that's the case, the $H^2$ property is something worth knowing. I don't recall this from linear algebra (I don't recall a lot, actually).

9. Sep 9, 2014

### AlephZero

That's the right idea but not quite the right answer. Let $D$ be the diagonal matrix $\{1,\sqrt{2},\dots, 1\}$ so $D^2 = H^2$.

$e^{-iHt} = \cos Ht - i\sin Ht$.

For the real part, $\cos Ht = 1 - \frac{H^2t^2}{2!} + \dots = 1 - \frac{D^2t^2}{2!} + \dots$ so the sums are cosine functions of the diagonals of $D$.

For the imaginary part, $\sin Ht = Ht - \frac{H^3t^3}{3!} + \dots = H\left(t - \frac{D^2t^3}{3!} + \dots\right)$.
To make that into a sine function of $D$, you have to tweak it into
$HD^{-1}\left(Dt - \frac{D^3t^3}{3!} + \dots\right)$.

10. Sep 9, 2014

### AlephZero

I don't think that property has a "special" name. I spotted it from the chess-board pattern of zeros and nonzeros in $H$. That means there will be a pattern of zero terms in $H^2$, whatever value the non-zero terms in $H$ have.

11. Sep 10, 2014

### vela

Staff Emeritus
Mathematica yields a different result for $H^2$:
$$H^2 = \begin{bmatrix} 1 & 0 & 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 1 & 0 & 0 \\ 1 & 0 & 2 & 0 & 1 & 0 \\ 0 & 1 & 0 & 2 & 0 & 1 \\ 0 & 0 & 1 & 0 & 2 & 0 \\ 0 & 0 & 0 & 1 & 0 & 1 \end{bmatrix}$$
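[Editor's note: vela's matrix is easy to confirm numerically; $H^2$ is not diagonal, though the chequerboard pattern of zeros does survive squaring. A NumPy sketch, not from the thread:]

```python
import numpy as np

# Six-site chain Hamiltonian
H = -(np.eye(6, k=1) + np.eye(6, k=-1))
H2 = H @ H

# Diagonal is (1,2,2,2,2,1), but there are also 1's two places
# off the diagonal, so H^2 is not a diagonal matrix.
print(H2.astype(int))
assert np.allclose(np.diag(H2), [1, 2, 2, 2, 2, 1])
assert H2[0, 2] == 1  # a non-zero off-diagonal entry
```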