

- Thread starter McLaren Rulez

- #1


If we have an orthonormal basis [itex]\{|x\rangle\}[/itex], how can we show that the relation

[tex]\sum_x|x\rangle\langle x| = I[/tex]

holds?

I see this in Quantum Mechanics but I'm not sure how to prove it. Thank you.

- #2

Fredrik

Staff Emeritus

Science Advisor

Gold Member


This is pretty difficult, so I will only tell you what the steps are; it would take much too long to write out the full proof. In all of these definitions and theorems, H is a Hilbert space. I will assume that H has a countable orthonormal basis, i.e. that H is separable. (This assumption is standard when we're dealing with single-particle quantum theories, but I've been told that it's too restrictive for quantum field theory.) I'm using the convention that the inner product is linear in the second variable.

**Definition:** An orthonormal basis of H is an orthonormal set that's not a proper subset of any other orthonormal set.

**Theorem:** Suppose that K is a closed convex subset of H. For each x in H, there's a unique x_{0} in K that's at the minimum distance from x. (In other words, this x_{0} satisfies d(x,x_{0})=d(x,K)).

**Theorem:** Suppose that M is a closed linear subspace of H. For each x in H, the following conditions on x_{0} in M are equivalent:

(a) x_{0} is the unique vector at the minimum distance from x.

(b) x-x_{0} is orthogonal to M.

**Definition:** The map [itex]x\mapsto x_0[/itex] is called the orthogonal projection onto the closed linear subspace M. Orthogonal projections are also called projection operators.

**Theorem:** If E={e_{1},...,e_{n}} is an orthonormal set, and P is the projection operator for the linear subspace spanned by the members of E, then for all x in H,

[tex]Px=\sum_{k=1}^n\langle e_k,x\rangle e_k.[/tex]

(This is proved by showing that x minus the sum on the right is orthogonal to the subspace, and then appealing to the previous theorem).
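A quick numerical sanity check of that projection formula (not part of the proof; the numpy setup, sizes, and variable names below are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthonormal set {e_1, e_2} in C^4: orthonormal columns from a QR
# decomposition of a random complex matrix.
A = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
E, _ = np.linalg.qr(A)  # the columns e_k = E[:, k] are orthonormal

x = rng.normal(size=4) + 1j * rng.normal(size=4)

# Px = sum_k <e_k, x> e_k; np.vdot conjugates its first argument,
# matching the "linear in the second variable" convention.
Px = sum(np.vdot(E[:, k], x) * E[:, k] for k in range(2))

# The residual x - Px should be orthogonal to each e_k,
# as the previous theorem requires.
print([abs(np.vdot(E[:, k], x - Px)) for k in range(2)])
```

The printed inner products should all be at the level of floating-point noise.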

**Theorem:** If E={e_{1},e_{2},...} is an orthonormal set, then for all x in H,

[tex]\sum_{k=1}^\infty|\langle e_k,x\rangle|^2\leq\|x\|^2.[/tex]

(The inequality above is called Bessel's inequality).
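Bessel's inequality is also easy to check numerically for a finite orthonormal set (again, the specific sizes and names here are my own illustration, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)

# Orthonormal set: the first 3 columns of a random 5x5 unitary matrix.
Q, _ = np.linalg.qr(rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5)))
E = Q[:, :3]

x = rng.normal(size=5) + 1j * rng.normal(size=5)

# Bessel: sum_k |<e_k, x>|^2 <= ||x||^2, with equality only when x
# lies in the span of the e_k.
lhs = sum(abs(np.vdot(E[:, k], x)) ** 2 for k in range(3))
print(lhs, np.linalg.norm(x) ** 2)
```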

**Theorem:** If E={e_{1},e_{2},...} is an orthonormal set, then for each x in H, the sequence of partial sums of the series [itex]\sum_{k=1}^\infty \langle e_k,x\rangle e_k[/itex] is a Cauchy sequence. (By the definition of "Hilbert space", i.e. completeness, this means that the series is convergent).

**Theorem:** If E={e_{1},e_{2},...} is an orthonormal basis, then for each x in H, [itex]x-\sum_{k=1}^\infty \langle e_k,x\rangle e_k[/itex] is orthogonal to E (and therefore =0).

You will need to use other results along the way, like the Pythagorean theorem for Hilbert spaces, and this theorem about series whose terms are real numbers:

**Theorem:** If [itex]\sum_{k=1}^\infty a_k[/itex] is a convergent series in [itex]\mathbb R[/itex], then [itex]\lim_m\sum_{k=m}^\infty a_k=0[/itex].
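To see what that last fact says concretely (my own example, with [itex]a_k=1/k^2[/itex]): the tail of a convergent series shrinks as the starting index m grows.

```python
# Tail of the convergent series sum 1/k^2: approximate
# sum_{k=m}^infty a_k by a long partial sum and watch it shrink with m.
def tail(m, terms=200000):
    return sum(1.0 / k**2 for k in range(m, m + terms))

print(tail(1), tail(100), tail(10000))
```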

This stuff is covered pretty well in Conway, but I don't recommend the rest of the book. It's ridiculously hard to read. Kreyszig would be a much better choice. (That's what people are telling me. I haven't read it myself).


- #3

Fredrik

Staff Emeritus

Science Advisor

Gold Member


I suspect that most people who think they want to know the answer to the question you asked will decide that they really don't when they see my reply above. Most people will settle for the corresponding theorem for finite-dimensional Hilbert spaces, which is much easier to prove. Let x be an arbitrary member of H. If {e_{1},...,e_{n}} is an orthonormal basis, then there exist complex numbers a_{1},...,a_{n} such that

[tex]x=\sum_{k=1}^n a_k e_k.[/tex]

This implies

[tex]\langle e_i,x\rangle=\sum_{k=1}^n a_k \langle e_i,e_k\rangle=a_i.[/tex]

Substituting [itex]a_k=\langle e_k,x\rangle[/itex] back into the expansion gives [itex]x=\sum_{k=1}^n\langle e_k,x\rangle e_k[/itex] for every x, which is exactly the statement [itex]\sum_{k=1}^n|e_k\rangle\langle e_k|=I[/itex].

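A numerical check of the finite-dimensional completeness relation (the basis construction below is my own illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# A full orthonormal basis of C^3: the columns of a random unitary matrix.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

# sum_k |e_k><e_k| as a sum of outer products e_k e_k^dagger.
P = sum(np.outer(U[:, k], U[:, k].conj()) for k in range(3))
print(np.allclose(P, np.eye(3)))  # prints True
```

Note that the sum equals the identity only because the set is a complete basis; with fewer columns it would be the projection onto their span, as in the earlier theorem.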


- #4


Having a basis [itex]|e_k\rangle[/itex], any vector can be expanded as

[tex]|x\rangle = \sum_k c_k|e_k\rangle.[/tex]

If the basis is orthonormal, then taking the inner product of the above with [itex]\langle e_j|[/itex] is easy (assuming no convergence issues):

[tex]\langle e_j|x\rangle = c_j.[/tex]

Substituting back into the first equation gives

[tex]|x\rangle = \sum_k |e_k\rangle\langle e_k|x\rangle.[/tex]

Since this is true for all |x> it must be that

[tex]\sum_k |e_k\rangle\langle e_k|[/tex]

is the identity operator for the Hilbert space.

- #5


Yes, I think I probably can't handle the first proof for the infinite dimensional case since I've just started on QM. But thank you for writing it out. Hopefully, I'll come back to it after a while and figure it out.
