# What does "completeness" mean in completeness relations


#### Frank Castle

From my humble (physicist) mathematics training, I have a vague notion of what a Hilbert space actually is mathematically, i.e. an inner product space that is complete, with completeness in this sense meaning that every Cauchy sequence of elements within this space converges to a limit that is itself an element of the space (I think this is right?!). This is a useful property as it enables one to do calculus in this space.

Now, in quantum mechanics Hilbert spaces play an important role in that they are the spaces in which the (pure) states of quantum mechanical systems "live". Given a set of orthonormal basis vectors, ##\lbrace\lvert\phi_{n}\rangle\rbrace## for such a Hilbert space, one can express a given state vector, ##\lvert\psi\rangle## as a linear combination of these basis states, $$\lvert\psi\rangle=\sum_{n}c_{n}\lvert\phi_{n}\rangle$$ since the basis states are orthonormal, i.e. ##\langle\phi_{n}\lvert\phi_{m}\rangle =\delta_{nm}## we find that ##c_{n}=\langle\phi_{n}\lvert\psi\rangle##, and hence $$\lvert\psi\rangle=\sum_{n}c_{n}\lvert\phi_{n}\rangle =\sum_{n}\langle\phi_{n}\lvert\psi\rangle\lvert\phi_{n}\rangle =\left(\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\lvert\right)\lvert\psi\rangle$$ which implies that $$\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\lvert =\mathbf{1}$$ This is referred to as a completeness relation, but I'm unsure what this is referring to? I've also read that the basis must be complete. Is this referring to the notion of completeness associated with limits of sequences, or is there something else I'm missing? And also, apart from being implied in the "derivation" I did above, why does completeness of a given basis require that the sum of outer products of each basis vector with itself equals the identity?
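As a numerical sanity check on the derivation above, here is a short sketch (assuming Python with numpy; the orthonormal basis is just a random one generated via a QR decomposition, not anything specific to the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random orthonormal basis of C^4 via QR decomposition.
# The columns of Q are the basis vectors |phi_n>.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Q, _ = np.linalg.qr(A)

# An arbitrary state |psi>.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)

# Coefficients c_n = <phi_n|psi>.
c = Q.conj().T @ psi

# Reconstruction: |psi> = sum_n c_n |phi_n>.
psi_rec = Q @ c
assert np.allclose(psi_rec, psi)

# Resolution of the identity: sum_n |phi_n><phi_n| = 1.
resolution = sum(np.outer(Q[:, n], Q[:, n].conj()) for n in range(4))
assert np.allclose(resolution, np.eye(4))
```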

In a Hilbert space, a set of states $|\phi_n\rangle$ is said to be "complete" if every state can be written as a linear combination of those. The expression $\sum_n |\phi_n \rangle \langle \phi_n| = 1$ concisely expresses the fact that the set $|\phi_n\rangle$ is complete and orthonormal.


Does this notion of completeness have anything to do with the notion of completeness in terms of Cauchy sequences? i.e. Is the basis complete in the sense that a given Cauchy sequence of basis vectors converges to a given vector in the Hilbert space?

Sorry if I'm being stupid here, but I'm still unsure as to why the outer product of the basis vectors expresses completeness? Is it that any operator acting on states in the Hilbert space can be represented in terms of outer products of this basis, and in particular the identity operator can be represented by the basis such that $$\mathbf{1}=\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\rvert$$ And then, given that $$\mathbf{1}\lvert\psi\rangle =\lvert\psi\rangle$$ for any state vector in the Hilbert space, the fact that ##\mathbf{1}## can be represented as an outer product of this basis means that it is complete (since the identity operator acts on all state vectors in the Hilbert space), and its action on an arbitrary vector projects it onto the given basis?!

Does this notion of completeness have anything to do with the notion of completeness in terms of Cauchy sequences? i.e. Is the basis complete in the sense that a given Cauchy sequence of basis vectors converges to a given vector in the Hilbert space?

I'm pretty sure it is the case that Hilbert space is complete in the sense of Cauchy sequences, but that seems unrelated to the question of completeness of a set of basis states. When people call $\sum_n |\phi_n\rangle \langle \phi_n | = 1$ a "completeness relation", I don't think it has anything directly to do with Cauchy sequences.

Sorry if I'm being stupid here, but I'm still unsure as to why the outer product of the basis vectors expresses completeness?

Maybe it would help to consider some really simple cases. For example, the Hilbert space of 2-component column matrices.

Any two-component column matrix $\Psi = \begin{pmatrix} \alpha \\ \beta \end{pmatrix}$ can be expressed as a linear combination of two basis elements: $U = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $D = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$. You can write:

$\Psi = \alpha U + \beta D$

And it's also true that $\alpha = U^\dagger \Psi$ and $\beta = D^\dagger \Psi$. So for any two-component column matrix $\Psi$, we can write:

$\Psi = U (U^\dagger \Psi) + D (D^\dagger \Psi) = [U U^\dagger + D D^\dagger] \Psi$

Instead of $U$ and $D$, the same is true for the pair of states:

$U' = \begin{pmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{pmatrix}$ and $D' = \begin{pmatrix} \frac{1}{\sqrt{2}} \\ \frac{-1}{\sqrt{2}} \end{pmatrix}$. For these two matrices, it is also true that:

$\Psi = U' (U'^\dagger \Psi) + D' (D'^\dagger \Psi) = [U' U'^\dagger + D' D'^\dagger] \Psi$

For any two matrices $A$ and $B$, they are complete and orthonormal if and only if $A A^\dagger + B B^\dagger = I$ (where $I$ is the two-by-two unit matrix).
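A quick sketch of this claim (assuming numpy; the vector $V$ below is a made-up non-orthonormal example used only to show the relation failing):

```python
import numpy as np

U = np.array([1, 0], dtype=complex)
D = np.array([0, 1], dtype=complex)
Up = np.array([1, 1], dtype=complex) / np.sqrt(2)
Dp = np.array([1, -1], dtype=complex) / np.sqrt(2)

def outer_sum(a, b):
    # A A^dagger + B B^dagger for column vectors a, b.
    return np.outer(a, a.conj()) + np.outer(b, b.conj())

# Both orthonormal pairs resolve the identity.
assert np.allclose(outer_sum(U, D), np.eye(2))
assert np.allclose(outer_sum(Up, Dp), np.eye(2))

# A non-orthonormal pair fails the relation.
V = np.array([1, 1], dtype=complex) / np.sqrt(2)
assert not np.allclose(outer_sum(U, V), np.eye(2))
```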

Is it that any operator acting on states in the Hilbert space can be represented in terms of outer products of this basis

That is true: Any nicely-behaved operator (I'm not sure what the restrictions are) $\hat{Q}$ can be written as:

$\sum_{n m} Q_{nm} |\psi_n\rangle \langle \psi_m|$

where $|\psi_n\rangle$ is a complete, orthonormal set of states, and where $Q_{nm} = \langle \psi_n|\hat{Q}|\psi_m\rangle$
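A minimal numerical sketch of this representation (assuming numpy; the basis and the operator are randomly generated, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3

# Random orthonormal basis (columns of B) and a random operator Qhat.
B, _ = np.linalg.qr(rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d)))
Qhat = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))

# Matrix elements Q_nm = <psi_n| Qhat |psi_m>.
Qnm = B.conj().T @ Qhat @ B

# Rebuild Qhat = sum_{n,m} Q_nm |psi_n><psi_m|.
rebuilt = sum(Qnm[n, m] * np.outer(B[:, n], B[:, m].conj())
              for n in range(d) for m in range(d))
assert np.allclose(rebuilt, Qhat)
```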

and in particular the identity operator can be represented by the basis such that $$\mathbf{1}=\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\rvert$$ And then, given that $$\mathbf{1}\lvert\psi\rangle =\lvert\psi\rangle$$ for any state vector in the Hilbert space, the fact that ##\mathbf{1}## can be represented as an outer product of this basis means that it is complete (since the identity operator acts on all state vectors in the Hilbert space), and its action on an arbitrary vector projects it onto the given basis?!

That's all true, but I think that the most basic concept is a complete, orthonormal set of states. $\sum_n |\psi_n\rangle \langle \psi_n | = 1$ implies that the set $|\psi_n\rangle$ is complete and orthonormal.

For any two matrices ##A## and ##B##, they are complete and orthonormal if and only if ##A A^\dagger + B B^\dagger = I## (where ##I## is the two-by-two unit matrix).

Is it possible to generalise this? I'm probably just being stubborn minded, but I can see why this would be true in this specific example, but I can't seem to get my head around the conceptual meaning of it?!

Is the method of analysis conceptually something like this?

One assumes that ##\lbrace\lvert\phi_{n}\rangle\rbrace## is a complete, orthonormal basis for the Hilbert space. If this is true, then we can express an arbitrary vector ##\lvert\psi\rangle## in this Hilbert space as a (unique) linear combination of this basis, i.e. $$\lvert\psi\rangle =\sum_{n}c_{n}\lvert\phi_{n}\rangle$$ Using the inner-product and the fact that the vectors ##\lbrace\lvert\phi_{n}\rangle\rbrace## are said to be orthonormal, i.e. ##\langle\phi_{n}\vert\phi_{m}\rangle=\delta_{nm}##, we have that $$\langle\phi_{n}\vert\psi\rangle =\langle\phi_{n}\vert\left(\sum_{m}c_{m}\lvert\phi_{m}\rangle\right)=\sum_{m}c_{m}\langle\phi_{n}\vert\phi_{m}\rangle =c_{n}$$ It follows from this that we can write $$\lvert\psi\rangle =\sum_{n}c_{n}\lvert\phi_{n}\rangle =\sum_{n}\left(\langle\phi_{n}\vert\psi\rangle\right)\lvert\phi_{n}\rangle = \left(\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\rvert\right)\lvert\psi\rangle$$ This therefore implies that ##\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\rvert =\mathbf{1}##.
Hence, by assuming that ##\lbrace\lvert\phi_{n}\rangle\rbrace## is a complete, orthonormal basis for the Hilbert space, we find that in order for this to be true, it must be that the set of vectors ##\lbrace\lvert\phi_{n}\rangle\rbrace## satisfies the property ##\sum_{n}\lvert\phi_{n}\rangle\langle\phi_{n}\rvert =\mathbf{1}##.

Also, slightly off-topic, but what is the qualitative difference between an outer product and a tensor product? My understanding is that an outer product is just a particular case of a tensor product acting on two vectors. Is it simply a matter of implicit definition? By this I mean that an outer product clearly maps two vectors in a given Hilbert space to an operator acting on that Hilbert space, for example $$(\lvert\psi\rangle ,\lvert\phi\rangle )\mapsto\hat{A}=\lvert\psi\rangle\langle\phi\rvert$$ Whereas a tensor product of two vectors, each residing in its own Hilbert space (i.e. they reside in different Hilbert spaces), maps two vectors to a vector in the tensor product of the two Hilbert spaces. For example, $$(\lvert\psi\rangle ,\lvert\phi\rangle )\mapsto\lvert\psi,\phi\rangle =\lvert\psi\rangle \otimes\lvert\phi\rangle$$

Is it possible to generalise this? I'm probably just being stubborn minded, but I can see why this would be true in this specific example, but I can't seem to get my head around the conceptual meaning of it?!

Is the method of analysis conceptually something like this?
[stuff deleted]

That's all correct. I'm not sure if that explains what "completeness" means, but it shows that it certainly doesn't have any (direct) connection with Cauchy sequences.

Also, slightly off-topic, but what is the qualitative difference between an outer product and a tensor product? My understanding is that an outer product is just a particular case of a tensor product acting on two vectors. Is it simply a matter of implicit definition? By this I mean that an outer product clearly maps two vectors in a given Hilbert space to an operator acting on that Hilbert space, for example $$(\lvert\psi\rangle ,\lvert\phi\rangle )\mapsto\hat{A}=\lvert\psi\rangle\langle\phi\rvert$$ Whereas a tensor product of two vectors, each residing in its own Hilbert space (i.e. they reside in different Hilbert spaces), maps two vectors to a vector in the tensor product of the two Hilbert spaces. For example, $$(\lvert\psi\rangle ,\lvert\phi\rangle )\mapsto\lvert\psi,\phi\rangle =\lvert\psi\rangle \otimes\lvert\phi\rangle$$

I think for most purposes, tensor product and outer product mean the same thing. But $|\psi\rangle \langle \phi|$ isn't the tensor product of $|\psi\rangle$ and $|\phi\rangle$; it is the tensor product of $|\psi\rangle$ and $(|\phi \rangle)^\dagger$.

I'm not sure if that explains what "completeness" means

Is there a way to give a qualitative description of completeness that matches up with this mathematical statement? Completeness means that the basis spans the entire vector space, such that every vector in the vector space can be expressed as a linear combination of this basis. Can't one infer that from the approach that I took, since one assumes completeness and then derives the condition that must be satisfied in order for the basis to be complete?

Another thing that confuses me is that I was under the impression that by definition a basis spans the vector space (at least this is true in linear algebra), is this not always true (hence why we need the completeness relation)?

I think for most purposes, tensor product and outer product mean the same thing.

Ah ok. Are there any cases in which they differ? Is it simply a matter of context as to whether the tensor product is a mapping from two Hilbert spaces to a product Hilbert space, or whether it is a mapping of a Hilbert space to the set of linear operators acting on that Hilbert space?

But ##|\psi\rangle \langle \phi|## isn't the tensor product of ##|\psi\rangle## and ##|\phi\rangle##; it is the tensor product of ##|\psi\rangle## and ##(|\phi\rangle)^\dagger##.

Aren't ##\mathcal{H}\otimes\mathcal{H}^{\ast}## and ##\mathcal{H}\otimes\mathcal{H}## isomorphic though?

Is there a way to give a qualitative description of completeness that matches up with this mathematical statement? Completeness means that the basis spans the entire vector space, such that every vector in the vector space can be expressed as a linear combination of this basis. Can't one infer that from the approach that I took, since one assumes completeness and then derives the condition that must be satisfied in order for the basis to be complete?

I think those are equivalent. The two assumptions that (1) every ##|\psi\rangle## can be written in the form ##\sum_n C_n |\phi_n \rangle##, and (2) ##\langle \phi_n | \phi_m \rangle = \delta_{nm}##, are exactly equivalent to the assumption ##\sum_n |\phi_n\rangle \langle \phi_n | = 1##.

Another thing that confuses me is that I was under the impression that by definition a basis spans the vector space (at least this is true in linear algebra), is this not always true (hence why we need the completeness relation)?

The "completeness relation" does imply that the set spans the vector space.

Ah ok. Are there any cases in which they differ? Is it simply a matter of context as to whether the tensor product is a mapping from two Hilbert spaces to a product Hilbert space, or whether it is a mapping of a Hilbert space to the set of linear operators acting on that Hilbert space?

I don't know of a case where "outer product" and "tensor product" are used differently.

Aren't ##\mathcal{H}\otimes\mathcal{H}^{\ast}## and ##\mathcal{H}\otimes\mathcal{H}## isomorphic though?

Yes, they are. But part of the definition of a tensor product is bilinearity: $(\alpha A) \otimes (\beta B) = \alpha \beta (A \otimes B)$. But for the binary operator

$F(|\psi\rangle, |\phi \rangle) = |\psi \rangle \langle \phi |$

it is not linear in the second argument:

$F(|\psi\rangle, |\alpha \phi \rangle) = \alpha^* |\psi \rangle \langle \phi |$
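This antilinearity in the second slot is easy to check numerically (a sketch assuming numpy; the vectors and the scalar ##\alpha## are arbitrary made-up values):

```python
import numpy as np

rng = np.random.default_rng(2)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
phi = rng.normal(size=2) + 1j * rng.normal(size=2)
alpha = 0.3 + 0.7j

def F(a, b):
    # F(|psi>, |phi>) = |psi><phi|
    return np.outer(a, b.conj())

# Linear in the first argument ...
assert np.allclose(F(alpha * psi, phi), alpha * F(psi, phi))
# ... but antilinear in the second: the scalar comes out conjugated.
assert np.allclose(F(psi, alpha * phi), alpha.conjugate() * F(psi, phi))
```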

The two assumptions that (1) every ##|\psi\rangle## can be written in the form ##\sum_n C_n |\phi_n \rangle##, and (2) ##\langle \phi_n | \phi_m \rangle = \delta_{nm}##

I don't know why, but I find this approach more intuitive: assuming those two properties and then deriving the equivalent condition from them.

The "completeness relation" does imply that the set spans the vector space.

What I meant by this was that usually the assumption is implicit, i.e. one does not usually even define a completeness relation in standard linear algebra (at least as far as I'm aware), it is just taken that if a set of vectors form a basis then they span the vector space.

What I meant by this was that usually the assumption is implicit, i.e. one does not usually even define a completeness relation in standard linear algebra (at least as far as I'm aware), it is just taken that if a set of vectors form a basis then they span the vector space.

But "forming an orthonormal basis" and "obeying the completeness relation" are the same thing. The latter is a mathematically concise way to say the former.

But "forming an orthonormal basis" and "obeying the completeness relation" are the same thing. The latter is a mathematically concise way to say the former.

Can one formulate it as an if and only if statement, i.e. a set of vectors is orthonormal and complete (and thus forms a basis for the vector space) if and only if it satisfies the completeness relation?

In a Hilbert space, a set of states $|\phi_n\rangle$ is said to be "complete" if every state can be written as a linear combination of those. The expression $\sum_n |\phi_n \rangle \langle \phi_n| = 1$ concisely expresses the fact that the set $|\phi_n\rangle$ is complete and orthonormal.
It doesn't say by itself that the vectors are orthonormal. This you have to assume in addition, i.e., you have a complete set of orthonormalized vectors in a Hilbert space, if and only if ##\langle \phi_j|\phi_k \rangle=\delta_{jk}## and ##\sum_n |\phi_n \rangle \langle \phi_n|=\hat{1}##.
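To illustrate this point numerically (a sketch assuming numpy, with a made-up example): three vectors in the plane at 120° to each other, scaled by ##\sqrt{2/3}##, satisfy ##\sum_n |\phi_n \rangle \langle \phi_n|=\hat{1}## without being normalized or orthogonal.

```python
import numpy as np

# Three real vectors at 120 degrees, scaled by sqrt(2/3).
phis = [np.sqrt(2 / 3) * np.array([np.cos(2 * np.pi * k / 3),
                                   np.sin(2 * np.pi * k / 3)])
        for k in range(3)]

# They satisfy the completeness relation ...
S = sum(np.outer(p, p) for p in phis)
assert np.allclose(S, np.eye(2))

# ... but they are neither normalized nor orthogonal.
assert not np.isclose(phis[0] @ phis[0], 1.0)
assert not np.isclose(phis[0] @ phis[1], 0.0)
```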

A set of vectors may obey the completeness relation (the projection operators sum to the identity) even if they are not orthonormal. The set of coherent states is an example because of a group-theoretical reason. Edit: Caveat: it spans the Hilbert space but it is not a basis since the coherent states are not linearly independent.

Briefly, if ##|0 \rangle## denotes some state and one has a representation ##U(g)## of a group ##G## and call the set of states ##|g \rangle = U(g)|0\rangle## coherent states, then I think whenever one has a suitable measure ##dg## to integrate over (mathematically sophisticated folks can correct me on this), $$V := \int dg |g\rangle \langle g| = \int dg U(g)|0\rangle \langle 0| U^{\dagger}(g)$$ commutes with ##U(g)## for every ##g##. If ##U(g)## is an irreducible representation, Schur's lemma implies that ##V## is proportional to the identity. All of this will work for a finite group too. I suspect one can easily find some symmetric POVM elements as examples.
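A finite-group version of this argument can be checked directly (a sketch assuming numpy; the Pauli matrices act irreducibly on ##\mathbb{C}^2##, so by Schur's lemma the discrete "group average" of ##|0\rangle\langle 0|## must be proportional to the identity):

```python
import numpy as np

# The Pauli matrices form (projectively) a finite group acting
# irreducibly on C^2.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
group = [I2, X, Y, Z]

ket0 = np.array([1, 0], dtype=complex)  # the fiducial state |0>

# Discrete analogue of V = integral dg U(g)|0><0|U(g)^dagger.
V = sum(U @ np.outer(ket0, ket0.conj()) @ U.conj().T for U in group)

# Schur's lemma: V is proportional to the identity (here 2 * I).
assert np.allclose(V, 2 * np.eye(2))
```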

As an aside, this makes me think of the other direction. Does being a basis imply the completeness relation? For orthonormal bases, yes, though not for the above group-theoretical reason. For non-orthogonal bases, no.

I looked around a bit more (on Wikipedia and http://www.jelena.ece.cmu.edu/repository/journals/07_SPMag_KovacevicCa.pdf) and found that tight frames might be what you need if you want to prove an if-and-only-if statement about the completeness relation.

A set of vectors ##|\phi_n\rangle## is a frame of a vector space if and only if it satisfies the condition $$a \|u\|^2 \le \sum_n | \langle \phi_n |u\rangle |^2 \le b\|u\|^2$$ for any vector ##|u\rangle## with norm ##\|u\|##, where ##a## and ##b## are positive numbers. This implies that the frame spans the vector space: if it did not, we could choose a vector outside of the span and make the sum zero, contradicting the assumption that ##a## is positive.

A frame doesn't have to consist of orthogonal vectors, or be a basis. But to reconstruct a vector from the expansion coefficients ##\langle \phi_n|u\rangle##, in general you will need a different frame ##| \tilde{\phi}_n \rangle##, called a dual frame, such that $$|u\rangle = \sum_n \langle \phi_n |u\rangle |\tilde{\phi}_n \rangle$$

A frame is tight when ##a=b##. Rewriting the condition as $$\sum_n |\langle \phi_n |u\rangle |^2 = \langle u| \left(\sum_n |\phi_n \rangle \langle \phi_n |\right) |u\rangle = a\|u\|^2$$ shows that $$\sum_n |\phi_n \rangle \langle \phi_n| = a \hat{1}$$ and that a tight frame is dual to itself, like an orthonormal basis.
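A standard concrete illustration (not from the paper above, just a common textbook example) is the "Mercedes-Benz" frame: three unit vectors at 120° in ##\mathbb{R}^2## form a tight frame with ##a = 3/2##, and the self-dual reconstruction works as described. A sketch assuming numpy:

```python
import numpy as np

# Three unit vectors at 120 degrees ("Mercedes-Benz" frame) in R^2.
phis = [np.array([np.cos(2 * np.pi * k / 3), np.sin(2 * np.pi * k / 3)])
        for k in range(3)]

# Tight-frame condition: sum_n |phi_n><phi_n| = a * identity, a = 3/2.
S = sum(np.outer(p, p) for p in phis)
a = 3 / 2
assert np.allclose(S, a * np.eye(2))

# Self-dual (up to 1/a) reconstruction of an arbitrary vector.
u = np.array([0.4, -1.7])
u_rec = sum((p @ u) * p for p in phis) / a
assert np.allclose(u_rec, u)
```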
