# QM and Linear Algebra

Homework Helper
Just trying to understand a few concepts first. If H is a Hamiltonian operator, then H is characteristic of the system, and changes from system to system? Moreover, if you have some wavefunction f, then would <f|H|f> be the energy of the state corresponding to f? Suppose $\psi _k$ are solutions to the time independent Schrödinger equation:

$$H(\psi _k) = E_k \psi _k$$

Is it true that $\frac{\partial }{\partial t}\psi _k = 0$? Is it also true that:

$$H\left (\exp \left (\frac{-iE_k (t - t_0)}{\hbar }\right )\psi _k\right ) = \exp \left (\frac{-iE_k (t-t_0)}{\hbar }\right )H(\psi _k)$$

Also, the dynamics of the system are described by the wave function that satisfies:

$$H(\psi (t)) = i\hbar \frac{\partial }{\partial t}\psi (t)$$

and it's not as though that's a definition for H, right?

Next, is a Hamiltonian Hermitian if and only if its eigenfunctions span the Hilbert space? Is the previous sentence at least partially true, if not entirely?

Now, some problems:

-------------

Assume that H, A and B are Hermitian operators on a finite-dimensional Hilbert space.

1. Show that if [H,A] = 0, then A and H have a complete set of eigenfunctions in common, i.e. there exists a basis $\{\psi _{\alpha i}\}$ for the Hilbert space such that:

$$H\psi _{\alpha i} = E_{\alpha }\psi _{\alpha i},\ A\psi _{\alpha i} = a_i\psi _{\alpha i}$$

Since the operators are Hermitian, I think it suffices to show that for some choice of eigenfunctions, every eigenfunction of H is an eigenfunction of A. I don't really know how to do this. I've already figured that if f is an eigenfunction of A corresponding to value a, then:

HAf = Haf = aHf

Since HA = AH, it's also true that:

AHf = aHf, so Hf is an eigenfunction of A corresponding to the same eigenvalue as f. Can we pick the f so that Hf = Ef?
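As a sanity check on that step, here's a quick numerical sketch (NumPy, with random commuting Hermitian matrices standing in for H and A; the eigenvalues are arbitrary, not tied to any particular system): if [H, A] = 0 and Af = af, then A(Hf) = a(Hf).

```python
import numpy as np

rng = np.random.default_rng(0)

# Make H and A diagonal in the same random orthonormal basis, so that
# [H, A] = 0 by construction (the eigenvalues here are arbitrary).
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
H = Q @ np.diag([1.0, 1.0, 2.0, 3.0]) @ Q.conj().T
A = Q @ np.diag([5.0, 6.0, 6.0, 7.0]) @ Q.conj().T

assert np.allclose(H @ A, A @ H)          # [H, A] = 0

# Take an eigenfunction f of A with eigenvalue a...
a_vals, a_vecs = np.linalg.eigh(A)
f, a = a_vecs[:, 0], a_vals[0]

# ...then Hf is again an eigenvector of A with the same eigenvalue a,
# i.e. H maps the eigenspace of A for a into itself.
Hf = H @ f
assert np.allclose(A @ Hf, a * Hf)
```

Note that Hf need not be a multiple of f; the check only shows it stays inside the same eigenspace, which is exactly the sticking point.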

2. Show that if A and B are both symmetries of H and [A,B] = 0, then it is possible to construct a basis for the Hilbert space with a triple set of labels $\psi _{\alpha i j}$ such that:

$$H\psi _{\alpha i j} = E_{\alpha }\psi _{\alpha i j},\ A\psi _{\alpha i j} = a_i\psi _{\alpha i j},\ B\psi _{\alpha i j} = b_j\psi _{\alpha i j}$$

I'm guessing that the eigenfunctions of H span the Hilbert space (as I asked before, does this follow from the fact that it's Hermitian?). Since A and B are symmetries of H, [H,A] = [H,B] = 0, so from the previous problem we know that each pair of operators has a basis in common. I don't know, though, how to show that all three can share the same basis. I think if I knew how to do question 1, I might know how to do this one.

3. Show that if A and B are both symmetries of H but [A,B] != 0, then it is possible to construct a basis for the Hilbert space with two labelling indices $\{\psi _{\alpha i}\}$ such that:

$$H\psi _{\alpha i} = E_{\alpha }\psi _{\alpha i},\ A\psi _{\alpha i} = a_i\psi _{\alpha i},\ B\psi _{\alpha i} = \sum _j \psi _{\alpha j}M_{ji}(B)$$

where M(B) is a complex matrix.

.... No idea for this one.

Galileo
Homework Helper
AKG said:
Just trying to understand a few concepts first. If H is a Hamiltonian operator, then H is characteristic of the system, and changes from system to system?
Correct. The standard interpretation is that the wavefunction tells you everything you can know about the system and its evolution is given by the SE (here in position representation):
$$i\hbar \frac{\partial}{\partial t}\psi(\vec r,t)=\hat H \psi (\vec r,t)$$
So the Hamiltonian operator $\hat H$ is the thing that is important for the evolution of the system.

Moreover, if you have some wavefunction f, then would <f|H|f> be the energy of the state corresponding to f?
No, it is the EXPECTATION value of the energy. In general the expectation value of an observable Q is <f|Q|f>. Different measurements on the system in the same state |f> can give different values for the energy (unless |f> is an energy eigenstate).

Suppose $\psi _k$ are solutions to the time independent Schrödinger equation:

$$H(\psi _k) = E_k \psi _k$$

Is it true that $\frac{\partial }{\partial t}\psi _k = 0$?
From the S.E above, you know EXACTLY how the state will evolve. Plug it in and solve the differential equation. You'll find that $\psi_k(t)$ will only differ from $\psi_k(t_0)$ in a phase factor which has no physical significance.
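Concretely (a small numerical sketch, with ħ set to 1 and an arbitrary two-component state vector; the numbers are illustrative only): the proposed solution $e^{-iE_k(t-t_0)/\hbar}\psi_k$ satisfies $i\hbar\,\partial_t\psi = E_k\psi$, and the phase drops out of all probabilities.

```python
import numpy as np

hbar = 1.0
E = 2.5                        # an energy eigenvalue (units with hbar = 1)
psi0 = np.array([0.6, 0.8j])   # psi_k at t0, as an abstract state vector

def psi(t, t0=0.0):
    # Proposed solution: the initial state times a pure phase.
    return np.exp(-1j * E * (t - t0) / hbar) * psi0

# Check i*hbar dpsi/dt = E psi(t) with a central difference
# (this is the SE, since H psi_k = E psi_k for an eigenstate).
t, dt = 0.7, 1e-6
dpsi_dt = (psi(t + dt) - psi(t - dt)) / (2 * dt)
assert np.allclose(1j * hbar * dpsi_dt, E * psi(t), atol=1e-5)

# The phase has no physical significance: every |component|^2 is unchanged.
assert np.allclose(np.abs(psi(t)) ** 2, np.abs(psi(0.0)) ** 2)
```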

Also, the dynamics of the system are described by the wave function that satisfies:

$$H(\psi (t)) = i\hbar \frac{\partial }{\partial t}\psi (t)$$

and it's not as though that's a definition for H, right?
No. The S.E. tells you how the state evolves as a function of time if you know H.

Next, is a Hamiltonian Hermitian if and only if its eigenfunctions span the Hilbert space?
That's a weird viewpoint. The Hamiltonian is hermitian because it's an observable. It is necessary that its eigenfunctions span the space and that its eigenvalues are real. There are operators whose eigenstates span the entire space, but aren't hermitian.

AHf = aHf, so Hf is an eigenfunction of A corresponding to the same eigenvalue as f. Can we pick the f so that Hf = Ef?
Only if f is an eigenfunction of H, which is what you want to show. In general, though, you can't be sure of that, unless the eigenvalue a is nondegenerate. Try assuming that all eigenvalues are nondegenerate; then the proof is much easier. Then try to generalize the result.

Last edited:
Gokul43201
Staff Emeritus
Gold Member
I second Galileo's advice on #1. First assume non-degenerate eigenvalues and find what $H|a_i \rangle$ gives, where $|a_i \rangle 's$ are the eigenkets of A. Use the fact that $\langle a_j|[H,A]|a_i \rangle = 0$. Then try to see how you can handle degeneracy - is there a way to construct non-degenerate eigenkets that span the basis of B?

Hurkyl
Staff Emeritus
Gold Member
If you have some wavefunction f, then would <f|H|f> be the energy of the state corresponding to f?

Should be the expected energy, shouldn't it?

Is it also true that:

$$H\left (\exp \left (\frac{-iE_k (t - t_0)}{\hbar }\right )\psi _k\right ) = \exp \left (\frac{-iE_k (t-t_0)}{\hbar }\right )H(\psi _k)$$

Why not? After all, for each t,

$$\exp \left (\frac{-iE_k (t - t_0)}{\hbar }\right )$$

is just a number, and H doesn't involve any time derivatives.

and it's not as though that's a definition for H, right?

Correct. This is not a definition for H, it's the constraint for the time evolution of a system governed by that Hamiltonian.

AHf = aHf, so Hf is an eigenfunction of A corresponding to the same eigenvalue as f.

That tells you an awful lot, doesn't it?

Next, is a Hamiltonian Hermitian if and only if its eigenfunctions span the Hilbert space?

Operators can have a complete set of eigenvectors without being Hermitian, but Hermitian operators are guaranteed to have a complete set of eigenvectors. In fact, you should be able to form an orthonormal basis out of them. (At least in the finite dimensional case -- I'm less familiar with the infinite dimensional case)

Can we pick the f so that Hf = Ef?

Ef doesn't make sense, does it? f is simply an element of the Hilbert space in this problem, not a function of time! Anyway, you can only do that if you can come up with a reason why you should be able to do that. P.S. today's secret word is "eigenspace"

Last edited:
Homework Helper
Thanks. Okay, my notes say that if you have a wave function $\psi$, then you can expand the wave function at time t0 in terms of the eigenfunctions:

$$\psi (t_0) = \sum _k C_k (t_0)\psi _k$$

It then says that the wave function at time t is:

$$\psi (t) = \sum _k C_k(t_0)\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k$$

so I wanted to verify this. I should get that H applied to the right-hand side equals iħ(d/dt) applied to the right-hand side. I want:

$$H\left [\sum _k C_k(t_0)\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k \right ]= i\hbar \frac{\partial }{\partial t}\left [\sum _k C_k(t_0)\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k\right ]$$

$$\sum _k C_k(t_0)H\left [\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k \right ]= i\hbar \sum _k C_k(t_0)\left [\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\frac{\partial }{\partial t}\psi _k - \left (\frac{iE_k}{\hbar }\right )\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k \right ]$$

$$\sum _k C_k(t_0)H\left [\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k \right ]= \sum _k C_k(t_0)\left [\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )H(\psi _k) + \exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )H(\psi _k) \right ]$$

$$\sum _k C_k(t_0)H\left [\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k \right ]= 2\sum _k C_k(t_0)H\left [\exp \left (\frac{-iE_k(t-t_0)}{\hbar }\right )\psi _k \right ]$$

which of course can't be right. Where did I make a mistake?

Gokul43201
Staff Emeritus
Gold Member
AKG said:
Can we pick the f so that Hf = Ef?
Not to beat this to death, but you picked f as being an eigenfunction of A. If you also choose it to be an eigenfunction of H, where does that land you?

Besides, you have not shown that you can choose it this way, and that is, in fact, what you are required to prove.

Homework Helper
I already knew that Hf lies in the eigenspace of A corresponding to a, but I want to show that a basis for that eigenspace can be chosen such that Hf is a scalar multiple of one of the basis vectors, and not just a linear combination of all of them. Maybe it's as simple as this: since H maps the eigenspace corresponding to a into itself, if that subspace has dimension n, we can certainly choose n eigenfunctions of H that span it. And regardless of which functions we choose, since we're choosing from the eigenspace of A corresponding to a, they will also necessarily be eigenfunctions of A. Don't know how I didn't see that; guess I'm rusty on the linear algebra...
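That argument can be turned into a construction and checked numerically. A sketch (NumPy; H and A here are hypothetical commuting Hermitian matrices with a degenerate A-eigenvalue): for each eigenvalue a of A, restrict H to that eigenspace and diagonalize the restriction; the resulting vectors are eigenvectors of both operators.

```python
import numpy as np

rng = np.random.default_rng(2)

# Commuting Hermitian H and A, with a 3-fold degenerate eigenvalue of A.
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
H = Q @ np.diag([1.0, 3.0, 2.0, 4.0]) @ Q.conj().T
A = Q @ np.diag([6.0, 6.0, 6.0, 7.0]) @ Q.conj().T

a_vals, a_vecs = np.linalg.eigh(A)
columns = []
for a in np.unique(np.round(a_vals, 8)):
    # P: orthonormal basis of the eigenspace of A for eigenvalue a.
    P = a_vecs[:, np.isclose(a_vals, a)]
    # H maps this eigenspace into itself, so P^dagger H P is the
    # restriction of H to it; diagonalizing it picks the right basis.
    _, W = np.linalg.eigh(P.conj().T @ H @ P)
    columns.append(P @ W)
B = np.hstack(columns)

# Every column of B is now a simultaneous eigenvector of A and H.
D_A = B.conj().T @ A @ B
D_H = B.conj().T @ H @ B
assert np.allclose(D_A, np.diag(np.diag(D_A)))
assert np.allclose(D_H, np.diag(np.diag(D_H)))
```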

Gokul43201
Staff Emeritus
Gold Member
Is $\psi_k$ time-dependent?

Hurkyl
Staff Emeritus
Gold Member
Looks like you got the basic idea for #1; Hurray for geometric reasoning! (I would like to add that I think there are still details to work through, but now that you have the basic idea, wrapping up the proof should be easy)

I wouldn't say you're rusty at linear algebra -- you're probably just not used to switching between different modes of thinking... here, your algebraic intuition guided you to a proof that H maps each eigenspace onto itself, but your brain didn't recognize to switch into geometric mode to understand what you had just proven.

Homework Helper
Gokul43201 said:
Is $\psi_k$ time-dependent?
No. So that means its partial derivative with respect to t is 0?

Homework Helper
Hurkyl said:
Looks like you got the basic idea for #1; Hurray for geometric reasoning! (I would like to add that I think there are still details to work through, but now that you have the basic idea, wrapping up the proof should be easy)

I wouldn't say you're rusty at linear algebra -- you're probably just not used to switching between different modes of thinking... here, your algebraic intuition guided you to a proof that H maps each eigenspace onto itself, but your brain didn't recognize to switch into geometric mode to understand what you had just proven.
No, it's definitely rustiness with the linear algebra. The problem was that I had something like Gram-Schmidt in mind, thinking that I could choose a basis for the eigenspace of a and make alterations so that the basis vectors would be eigenvectors of H. My mind was set on the idea that some sneaky choice of vectors had to be made to get eigenvectors of H; I wasn't thinking that I could pretty much just pick any. No good reason why I thought that; like I said, just rusty.

Homework Helper
Just to clarify, the Hilbert space is what exactly? It's a vector space of functions over the complex numbers, right? And the functions in the Hilbert space are generally time-dependent?

Gokul43201
Staff Emeritus
Gold Member
AKG said:
No. So that means its partial derivative with respect to t is 0?
Doesn't it?

Homework Helper
Wait, but if $\psi _k$ is not time-dependent, so that $\frac{\partial }{\partial t}\psi _k = 0$, does that mean that $\psi _k$ does not satisfy the time-dependent Schrödinger equation? Actually, I guess that makes sense.

Homework Helper
Gokul43201 said:
Doesn't it?
Yeah, that seems right. I was misled when Galileo said:

From the S.E above, you know EXACTLY how the state will evolve. Plug it in and solve the differential equation. You'll find that $\psi _k (t)$ will only differ from $\psi _k (t_0)$ in a phase factor which has no physical significance.
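Okay, and with $\frac{\partial }{\partial t}\psi _k = 0$ the expansion from my notes does check out. A numerical sanity check (a NumPy sketch, with a small random Hermitian matrix standing in for H):

```python
import numpy as np

rng = np.random.default_rng(1)
hbar, t0 = 1.0, 0.0

# A small random Hermitian matrix standing in for the Hamiltonian.
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
H = (M + M.conj().T) / 2
E, V = np.linalg.eigh(H)        # columns of V: the time-INdependent psi_k

C = np.array([0.5, 0.3, 0.2])   # expansion coefficients C_k(t0)

def psi(t):
    # psi(t) = sum_k C_k(t0) exp(-i E_k (t - t0)/hbar) psi_k
    return V @ (C * np.exp(-1j * E * (t - t0) / hbar))

# Check the time-dependent SE: i*hbar dpsi/dt = H psi(t).
t, dt = 0.4, 1e-6
lhs = 1j * hbar * (psi(t + dt) - psi(t - dt)) / (2 * dt)
assert np.allclose(lhs, H @ psi(t), atol=1e-5)
```

Since ∂ψ_k/∂t = 0, the product rule contributes only the phase-factor term, which is where my extra H(ψ_k) term came from.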

Hurkyl
Staff Emeritus
Gold Member
Just to clarify, the Hilbert space is what exactly?

A Hilbert space is a complete inner product space. I bet that was entirely unenlightening. It means that a Hilbert space is a vector space equipped with an inner product.

The inner product defines a norm, and we can use that norm to define Cauchy sequences. "Complete" means that every Cauchy sequence converges.

That's all a Hilbert space is.

For example, any finite-dimensional complex vector space, equipped with any inner product, is a Hilbert space.

Another common example of a Hilbert space is the set of all square-integrable functions over R.

Your wave function ψ(t) is a time-dependent element of this Hilbert space. Or, more precisely, it's some function ψ : R → H.

Last edited:
Homework Helper
A function from R to H? What's H? Just to clarify, is $\psi$ the element of the Hilbert space, or is $\psi (t)$? What kind of number (or thing) is $\psi (t)$, that is, the value of $\psi$ at t, not the function $\psi$? To avoid confusion, could we stick to referring to the function $\psi$ simply as $\psi$, and use $\psi (t)$ only when speaking of the value of $\psi$ at t? Thanks.

Hurkyl
Staff Emeritus
Gold Member
I was using H to refer to the Hilbert space you're using.

A wave function ψ is a map R → H. So, for each t0, ψ(t0) is an element of your Hilbert space.

For the Hilbert space of square-integrable functions on R, it makes sense to write ψ as a function of two variables, x and t. Then, for each t0, the one-parameter function x → ψ(x, t0) is a member of the Hilbert space. (Assuming it's square-integrable, of course)

Actually, maybe the term "wavefunction" is often used both for an element of H and for a map R → H? That would certainly explain the source of confusion. (I was recently confused about this as well)

Last edited:
Homework Helper
So H is a space of functions on R³. If f is in H, then (I could be way off here):

$$\int _V |f|^2$$

gives the probability that something (what?) is in the region V? Also, what kind of quantity is f(x)? Is |f(x)|² the meaningful quantity, and f(x) more like a mathematical abstraction?

Also, my notes say: the dynamics of a system is described in QM by a time-evolving wave function f(t) which obeys the SE:

Hf(t) = ih(d/dt)f(t)

This is confusing. If f is a map from R to H, then f(t), and not f, takes values in H, so the left side could be written H(f(t)). f(t) itself is a function in H, and (f(t))(x) is the thing whose square is the probability density? On the right-hand side, however, if f(t) is either a function of x or the value itself (the value is the thing whose square is the probability), then (d/dt)(f(t)) doesn't make sense, so the right side should be read: ih(df/dt)(t).

Homework Helper
If anyone would oblige me, I'd like a rigorous mathematical definition of a wave function, and perhaps an unambiguous expression of the SE. Also, am I right that the Hilbert spaces in these examples are spaces whose elements are wave functions? And are there significant differences between wave functions and eigenfunctions? Are they both elements of the same Hilbert space? Well, they must be, because they are eigenfunctions of the Hamiltonian, which acts on the Hilbert space. But then where my book says:

$$\psi (t) = \sum _k C_k (t_0)e^{-iE_k(t - t_0)/\hbar }\psi _k$$

it should be written:

$$\psi (t) = \sum _k C_k (t_0)e^{-iE_k(t - t_0)/\hbar }\psi _k \mathbf{(t)}$$

but they just omit the (t) at the end since $\psi _k$ is constant with respect to time?

Hurkyl
Staff Emeritus
Gold Member
Okay, if Wikipedia is to be believed, a wavefunction is simply an element of a Hilbert space H of square-integrable complex-valued functions.

So, your (time-dependent) function ψ is not a wavefunction per se: it is a function whose values lie in H.

However, like most functions, it is generally kosher to treat it as an object of its range. For example, because we can do things like add vectors and take their dot product, we can also do those things to functions whose values are vectors.

Now, ψ_k is a wavefunction: it is an eigenvector of the Hamiltonian, H, which is a linear operator H → H. It would be appropriate to call an eigenvector an eigenfunction, because H is a space of functions.

Of course, by the above observation, H can also act on a function R → H, and so it would also make sense to speak of such things as being eigenfunctions of H.

Now, E (= iħ d/dt) is not an operator on H: it only acts on functions R → H. However, as observed, H also acts on such functions, so it makes sense to state the equation:

E ψ = H ψ

where ψ is a function R → H.

(P.S. Note that in all of this, I've assumed H is time-independent)

Homework Helper
Hurkyl said:
Now, E (= iħ d/dt) is not an operator on H: it only acts on functions R → H. However, as observed, H also acts on such functions, so it makes sense to state the equation:

E ψ = H ψ

where ψ is a function R → H.
Okay, so ψ is a function R → H and ψ(t) is an element of H, which is a function R³ → C. And to avoid confusion, "E ψ = H ψ" means that:

(E(ψ))(t) = H(ψ(t))

Hurkyl
Staff Emeritus
Gold Member
And to avoid confusion, "E ψ = H ψ" means that:

(E(ψ))(t) = H(ψ(t))

Yes, but one of the things I was trying to note is that we really can define an action of H on a function ψ : R → H by:

(Hψ)(t) := H(ψ(t))

Gokul43201
Staff Emeritus
Gold Member
AKG said:
So H is a space of functions on R³. If f is in H, then (I could be way off here):

$$\int _V |f|^2$$

gives the probability that something (what?) is in the region V? Also, what kind of quantity is f(x)? Is |f(x)|² the meaningful quantity, and f(x) more like a mathematical abstraction?
f(x) or $\psi (x)$, is what is commonly called the wavefunction - it is nothing but the inner product : $\langle x| \alpha \rangle$ ,where $| \alpha \rangle$, an arbitrary state ket of some particle, can be expanded in terms of the base kets $| x \rangle$ (the eigenkets of the position operator) of an infinite dimensional Hilbert space.

From closure:
$$\int _{- \infty} ^ {\infty} dx ~|x \rangle \langle x| = 1 ~~--(1)$$

So, we can expand $| \alpha \rangle$ as :

$$| \alpha \rangle = \int _{- \infty} ^ {\infty} dx~ |x \rangle \langle x|\alpha \rangle \equiv \int _{- \infty} ^ {\infty} dx~ \psi_{\alpha} (x)~|x \rangle ~~--(2)$$

This $\psi_{\alpha} (x)$, by construction, is nothing but an expansion coefficient (just like the discrete expansion coefficients that you come across if you expand a state ket in a discrete basis).

Also, if $| \alpha \rangle$ is normalized, then

$$\langle \alpha | \alpha \rangle = \int _{- \infty} ^ {\infty} dx ~\langle \alpha |x \rangle \langle x|\alpha \rangle = \int _{- \infty} ^ {\infty} dx ~\psi_{\alpha} (x) ~ \psi^*_{\alpha} (x) = 1 ~~--(3)$$

Here, $\langle \alpha |x \rangle \langle x|\alpha \rangle = | \langle x|\alpha \rangle|^2 \equiv | \psi_{\alpha} (x) |^2$ is the probability density function, and $| \psi_{\alpha} (x') |^2~dx'$ is the probability of finding the particle in an interval $dx'$ centered on $x'$.
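A discretized sanity check of (3) (a NumPy sketch; the normalized Gaussian standing in for $\psi_{\alpha} (x)$ is just a hypothetical example):

```python
import numpy as np

# Sample a normalized Gaussian "wavefunction" psi(x) = <x|alpha> on a grid.
x, dx = np.linspace(-10.0, 10.0, 2001, retstep=True)
psi = np.exp(-x**2 / 2) / np.pi**0.25     # so |psi(x)|^2 = exp(-x^2)/sqrt(pi)

# Eq. (3): <alpha|alpha> = integral of |psi(x)|^2 dx = 1 for a normalized ket.
norm = np.sum(np.abs(psi) ** 2) * dx
assert np.isclose(norm, 1.0, atol=1e-6)

# |psi(x')|^2 dx' as a probability density: the chance of finding the
# particle in |x| < 1 (for this Gaussian, erf(1), roughly 0.84).
mask = np.abs(x) < 1
p = np.sum(np.abs(psi[mask]) ** 2) * dx
assert 0.83 < p < 0.86
```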

Last edited:
Gokul43201
Staff Emeritus