# The ensemble interpretation of QM

The ensemble interpretation of QM essentially ignores the wave function collapse, and apparently decoherence as well, by stating that only the statistical distributions within ensembles of systems matter (each system being in just one state).

My questions are:

Does the preparation of macroscopic superposed states ("cats") in Bose-Einstein condensates (BECs) effectively negate this interpretation?

As I understand it, the macroscopic superposition of states in a BEC is a single macroscopic quantum system (usually described as a two-state system), although it may consist of tens of thousands of atoms. Is this correct?


f95toli
I don't think that interpretation makes any sense any more, although I guess it was valid in the early days of QM, when most experiments were indeed performed on ensembles.
If I am not mistaken, it was the "interpretation" that Schrödinger believed in (at least there is a famous quote where he seems to support the idea).

There are plenty of experiments where we are operating on a single "entity", not only in the sense that there is a single wavefunction (like in a condensate) but quite literally. An obvious example would be a single ion in an ion trap; I don't quite see how ensembles enter into a situation like that.
Other examples include many qubit systems where there really is a SINGLE system that can be in either of two states (solid state systems such as Josephson junctions being an extreme example).

Thanks f95toli. I wasn't sure if the Ensemble Interpretation had been falsified in other ways besides BECs. At least this was an interpretation that could be falsified. Einstein liked it too.

strangerep
The ensemble interpretation of QM [...]
In modern times, it's more appropriately named the "statistical interpretation". The BEC state you mentioned is then not really different (not fundamentally, anyway) from some other state. One describes prepared systems as mixed states (possibly constructed via tensor products of simpler systems).

For the modern interpretation, I found Ballentine's textbook "Quantum Mechanics -- A Modern Development" (2008 edition or later) quite excellent. It is truly enlightening to see how the statistical interpretation in its full generality addresses all sorts of experiments so well, without all the nonsense that too often plagues QM.

HTH.

Interesting. I'll try to find it when I can. (I'm spending the summer in a somewhat remote location.) However, if I understand it, the statistical interpretation assumes each particle has just one state, and that what we call a superposition is really just an ensemble of distinct particles. This seems quite a stretch, given the experimental evidence for superpositions, not to mention how fundamental the concept is to QM.

JK423
The ensemble interpretation of QM essentially ignores the wave function collapse, and apparently decoherence as well, by stating that only the statistical distributions within ensembles of systems matter (each system being in just one state).
The ensemble interpretation doesn't ignore the wave-function collapse, as far as I know.
For example, you can have an ensemble where each of the particles is in a superposition, e.g. described by the state vector |A> = a|1> + b|2>.
When we make a measurement to find out which state the particle is in, |A> collapses to |1> or |2>.
You can also have another ensemble where each particle is in only one state, either |1> or |2>, with the corresponding probabilities |a|² and |b|².
These two ensembles are not equivalent.
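The inequivalence of the two ensembles can be checked numerically. A minimal sketch (the equal amplitudes a = b = 1/√2 are my own choice for concreteness): measuring in the {|1>, |2>} basis cannot distinguish them, but measuring in a rotated basis can.

```python
import numpy as np

# Hypothetical amplitudes for |A> = a|1> + b|2>, chosen equal for illustration
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
A = np.array([a, b])

# Ensemble 1: every system in the superposition |A> (pure state)
rho_pure = np.outer(A, A.conj())

# Ensemble 2: fraction |a|^2 in |1>, fraction |b|^2 in |2> (classical mixture)
rho_mix = np.diag([abs(a) ** 2, abs(b) ** 2])

# Measuring in the {|1>, |2>} basis gives identical statistics for both...
probs_pure = np.real(np.diag(rho_pure))
probs_mix = np.real(np.diag(rho_mix))
print(probs_pure, probs_mix)  # both [0.5, 0.5]

# ...but measuring in the rotated basis |+> = (|1> + |2>)/sqrt(2) distinguishes them
plus = np.array([1, 1]) / np.sqrt(2)
p_plus_pure = np.real(plus.conj() @ rho_pure @ plus)  # 1.0: interference present
p_plus_mix = np.real(plus.conj() @ rho_mix @ plus)    # 0.5: no interference
print(p_plus_pure, p_plus_mix)
```

The off-diagonal elements of rho_pure carry the interference that the mixture lacks; that is the observable difference between the two ensembles.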

Fredrik
Does the preparation of macroscopic superposed states ("cats") in Bose-Einstein condensates (BECs) effectively negate this interpretation?
Not at all. Why would it?

I assume that the "interpretation" you're asking about is the one that says that quantum mechanics is just an algorithm that tells us how to compute the probabilities of possible results of experiments given the result of other experiments. I don't think of this as an interpretation. I prefer to think of it as the realization that QM doesn't need an interpretation.

Everyone agrees that QM is such an algorithm, but a lot of people (most of them in fact) also believe that QM must be a description of the way the universe works. They are the ones who keep trying to interpret QM. All the interpretations are attempts to tell us what exactly QM is supposed to be describing.

So what does any of this have to do with ensembles? I'll tell you. If QM is "just an algorithm", then state vectors don't represent objective properties of physical systems, but they do represent objective (statistical) properties of ensembles of identically prepared systems. Even if you send one particle at a time through your measurement device, you're going to have to deal with an ensemble. You have to repeat the experiment a large number of times to test a prediction, and the particles that participate in all these different experiments can (and should) be thought of as an ensemble.

For the modern interpretation, I found Ballentine's textbook
"Quantum Mechanics -- A Modern Development" (2008 edition or later)
quite excellent.
I did too. I also have to recommend "Lectures on quantum theory: mathematical and structural foundations" by Chris Isham (I seem to be doing that in 25% of my posts now).

(It's probably too basic for you (Strangerep), but it's perfect for an undergraduate who has taken a QM class or two. It's a much easier read than Ballentine).

Interesting. I'll try to find it when I can. (I'm spending the summer in a somewhat remote location.)
You can read most of Ballentine's book at Google Books. Isham's book is there too, but they have made it difficult to access as many pages as you'd like.

However, if I understand it, the statistical interpretation assumes each particle has just one state, and that the what we call a superposition is really just an ensemble of distinct particles. This seems quite a stretch given the experimental evidence for superpositions not to mention how fundamental the concept is to QM.
This is a misunderstanding on your part. A very big one actually (but that's OK).

Hm, now I'm starting to think that the interpretation you had in mind is actually one of the hidden-variable interpretations that have been completely crushed by Bell inequalities.

George Jones
I'd also have to recommend "Lectures on quantum theory: mathematical and structural foundations" by Chris Isham (I seem to be doing that in 25% of my posts now).
I, too, love this book.

Demystifier
I don't think that interpretation makes any sense any more, although I guess it was valid in the early days of QM, when most experiments were indeed performed on ensembles.
If I am not mistaken, it was the "interpretation" that Schrödinger believed in (at least there is a famous quote where he seems to support the idea).

There are plenty of experiments where we are operating on a single "entity", not only in the sense that there is a single wavefunction (like in a condensate) but quite literally. An obvious example would be a single ion in an ion trap; I don't quite see how ensembles enter into a situation like that.
Other examples include many qubit systems where there really is a SINGLE system that can be in either of two states (solid state systems such as Josephson junctions being an extreme example).
I think you misunderstood the notion of an ensemble. For example, even in experiments with a single ion in an ion trap, one repeats such an experiment many times and each time one obtains a different result (depending on what exactly is measured). All these results from many independent experiments represent one ensemble. The distribution of these results is predicted by QM. QM predicts nothing about individual experiments (except when the state is an eigenstate of the measured observable), but only about the whole ensemble of many experiments.
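Demystifier's point, that QM predicts only the statistics of many repeated runs, can be illustrated with a toy simulation (the state and its Born probabilities below are arbitrary choices of mine): each run measures one freshly prepared system and yields a single outcome, and only the accumulated frequencies approach the quantum prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-level system prepared in |psi> = sqrt(0.3)|0> + sqrt(0.7)|1>
# Born-rule probabilities |<0|psi>|^2 and |<1|psi>|^2:
probs = np.array([0.3, 0.7])

# Each "experiment" measures ONE freshly prepared system and gives one outcome.
# QM says nothing about an individual run; only the ensemble statistics converge.
n_runs = 100_000
outcomes = rng.choice([0, 1], size=n_runs, p=probs)

freq = np.bincount(outcomes, minlength=2) / n_runs
print(freq)  # approaches [0.3, 0.7] as n_runs grows
```

The ensemble here is not a box of particles sitting somewhere; it is the collection of all these independent single-system runs, exactly as described above.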

This is a misunderstanding on your part. A very big one actually (but that's OK).

Hm, now I'm starting to think that the interpretation you had in mind is actually one of the hidden-variable interpretations that have been completely crushed by Bell inequalities.
No, I'm thinking about the statistical (ensemble) interpretation (SI), but you're correct that I don't understand it (although these responses have been helpful). However, my impression was that SI avoided the weirdness of wave-function collapse and parallel universes. Are you saying that SI preserves superposition in a one-particle system, or that, as you say and I said originally in post 1, it simply ignores quantum reality and deals only with objective results? In that case it's the same as "shut up and calculate", is it not?

I think the issue of whether quantum probabilities are only manifested in ensembles, or whether they are somehow present in each isolated event, has more to do with interpretations of probability than with interpretations of QM. Maybe this article will help:

http://plato.stanford.edu/entries/probability-interpret/#ProInt

Fredrik
However, my impression was that SI avoided the weirdness of wave-function collapse and parallel universes. Are you saying that SI preserves superposition in a one-particle system, or that, as you say and I said originally in post 1, it simply ignores quantum reality and deals only with objective results? In that case it's the same as "shut up and calculate", is it not?
When you calculate the probabilities of the possible results of a specific experiment, you will still have to take the state before the measurement to be a superposition, if the system has been prepared that way. So you're not getting rid of superpositions.

I still think of a measurement as an interaction that, through the process of decoherence, entangles the eigenstates of some operator with macroscopically distinguishable states of a system that for practical purposes can be considered classical. So we're not getting rid of decoherence either. I don't think decoherence has solved the "measurement problem", but it has given us the best definition we have of what a measurement is.

Isham had some very interesting comments about wave-function collapse in his book. Suppose a silver atom prepared in the spin state |s> = (|+> + |->)/√2 is sent through a Stern-Gerlach apparatus without any kind of detector to tell us where the atom has ended up. The interaction with the apparatus entangles the spin states with position eigenstates, so that the time evolution of the |position>|spin> state is

|center>|s> → (|left>|+> + |right>|->)/√2

What Isham pointed out is that if we now put a measuring device (e.g. another Stern-Gerlach apparatus with a different orientation, and with particle detectors at the positions of its outgoing beams) in the position where the right "beam" comes out, we're going to have to take the state before the measurement to be |right>|->. We still haven't measured anything, but we have "collapsed" the wavefunction simply by putting the measuring device on the right. This illustrates the importance of distinguishing between state preparation and measurement, and shows that the "collapse" isn't a physical process at all (in this "interpretation"). It's just a selection of what to measure.
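The Stern-Gerlach example above can be made concrete with a small linear-algebra sketch (the basis ordering and explicit normalization are my own choices): conditioning on the right beam is just a projection followed by renormalization, and it yields exactly |right>|->.

```python
import numpy as np

# Basis ordering for |position>|spin>: |left,+>, |left,->, |right,+>, |right,->
left_plus, left_minus, right_plus, right_minus = np.eye(4)

# State after the Stern-Gerlach magnet (no detector yet):
# (|left>|+> + |right>|->)/sqrt(2)
psi = (left_plus + right_minus) / np.sqrt(2)

# Placing the second apparatus in the right beam selects the "right" position.
# This is a state *preparation*, not a physical process: project and renormalize.
P_right = np.diag([0, 0, 1, 1])  # projector onto |right> (x) spin space
conditioned = P_right @ psi
conditioned /= np.linalg.norm(conditioned)

print(conditioned)  # equals |right>|->, i.e. [0, 0, 0, 1]
```

Nothing physical happened to the atom in this calculation; the "collapse" is just the choice to describe the subensemble of atoms that reach the second apparatus.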

The difference between this and "shut up and calculate" is mainly a difference of attitude. They seem to be thinking a) "we don't need to know what the correct interpretation is to be able to use QM", and b) "finding a good interpretation is difficult and takes too much time". We're saying that a) "we have realized that QM doesn't need an interpretation" (which is a fairly deep philosophical insight), and b) "we don't think QM can be interpreted". (We have arguments for this, but no proof at this time). At least that's what I'm saying. I don't really know if Ballentine and Isham would agree with everything I'm saying.

The phrase "shut up and calculate" is supposed to be an answer to questions like "What does quantum mechanics really describe?". It's an answer you might give if you think the question is stupid, or not at all interesting. I don't think it's a bad question. I think it's based on some misconceptions, but you won't realize that until you've thought about it, and that question is a good starting point.

This "interpretation" doesn't rule out many worlds. It's saying that a state vector represents objective properties of an ensemble of identically prepared systems, and it wouldn't be crazy to think of the ensemble as consisting of many identical systems in different worlds. This touches on what Civilized said. Now we're actually talking about interpretations of probability.

Many thanks Fredrik for this comprehensive reply, and for all the other replies. I've downloaded chapter 2 of Ballentine and chapter 5 of Isham, as well as the link provided by Civilized. I hadn't heard about the Ensemble Interpretation until reading some background material regarding decoherence in BECs recently. I'm glad to hear it's still respectable, since I never liked metaphysical constructs to explain science. It may well be impossible for us to understand quantum reality as such. I have another topic up where I ask about the potential for BECs to provide a medium for actually (in some sense) observing superpositions and the decoherence process.

strangerep
[...] if I understand it, the statistical interpretation assumes each particle has just one state, and that what we call a superposition is really just an ensemble of distinct particles.
A single copy of a system is not necessarily associated with a pure state. The state operator (I say "operator" since this concept encompasses mixed states as well, and is thus more general) is more like the quantum version of a probability distribution. I now find it more helpful to think in terms of such (mixed) state operators rather than superpositions.

strangerep
I also have to recommend "Lectures on quantum theory: mathematical and structural foundations" by Chris Isham (I seem to be doing that in 25% of my posts now).

(It's probably too basic for you (Strangerep), but it's perfect for an undergraduate who has taken a QM class or two. It's a much easier read than Ballentine).
Actually, I did indeed obtain and study Isham some years ago, and it helped with a number of things I was grappling with at that time.

I don't really know if Ballentine and Isham would agree with
everything I'm saying.
Isham still puts a certain amount of emphasis on collapse, IIRC.
(Correct me if I'm misremembering -- it's been a while.)

But... from the previous posts... I get the feeling you haven't read much of Ballentine yet.

Ballentine (IIUC) essentially rejects collapse, and justifies this well with a lot of carefully analyzed examples. But his approach is also not merely "shut up and calculate". I think of Ballentine more as "stop being silly, analyze lots of different cases in detail, and adopt the minimal interpretation that's consistent with all of them and their associated experimental verifications."

In this respect, I now consider Ballentine's approach superior to Isham's (though Isham's textbook is still worth reading). I wish there were a series of textbooks taking the Ballentine approach from initial undergrad level upward.
Fredrik
But... from the previous posts... I get the feeling you haven't read much of Ballentine yet.
...
Ballentine (IIUC) essentially rejects collapse, and justifies this well with a lot of carefully analyzed examples.
Yes, you're right. I have still only read a pretty small part of it, but one of the parts I've read is 9.1-9.3, which is where he "essentially rejects collapse, and justifies this well". Hm, maybe you got the impression that I haven't read those sections because I said "we have arguments for this, but no proof at this time", while Ballentine's argument certainly looks like a proof. I guess I'm just feeling a little reluctant to call it a proof, considering that there are many smart people who still prefer a realist interpretation. If it's that easy to dismiss realism, then why haven't they?

Regarding our previous discussions, I have found a good book on functional analysis and started to read it: "Functional analysis: spectral theory", by V.S. Sunder. It's easier to read than the other one I got (Conway), and covers all the spectral theorems (and stuff like representations of C*-algebras, the GNS construction, etc.) in half the number of pages. Unfortunately I'm so dumb that I'll be stuck studying the appendices about topological spaces, measures and integration for a long time before I get to the main text. (I already know some of it, but it's still taking a lot of time).

Isham still puts a certain amount of emphasis on collapse, IIRC.
(Correct me if I'm misremembering -- it's been a while.)
Isham makes an effort to always explain how both of the main schools of thought (which he describes as "realist" and "anti-realist") think about the topic at hand. I'm getting a strong impression that he's in the "anti-realist" camp himself, but I don't think he ever says that he thinks the realists are wrong, like Ballentine does. He does however point out some problems with the realist view here and there.

In this respect, I now consider Ballentine's approach superior to Isham's (though Isham's textbook is still worth reading).
I guess I do too, since I find Ballentine's argument convincing. The main reason I'm recommending Isham is that it's a much easier read, especially if you like to read a book cover to cover. But sections 9.1-9.3 in Ballentine are pretty easy too.

A single copy of a system is not necessarily associated with a pure state. The state operator (I say "operator" since this concept encompasses mixed states as well, and is thus more general) is more like the quantum version of a probability distribution. I now find it more helpful to think in terms of such (mixed) state operators rather than superpositions.
I suspect that this "statistical interpretation" is the main reason why a lot of people these days prefer to say that states are represented by statistical operators rather than by state vectors. In this interpretation, state vectors represent ensembles too (the "pure" ensemble that consists of the identically prepared systems that get measured when you do the same experiment many times, each time on a single system).

If both of these mathematical objects (state vectors and statistical operators) represent ensembles, and the statistical operators represent the most general type of ensemble, then we might as well say that these operators are the true "states" of our theory. (Comment for other people who read this: Note that there's an obvious bijection between the state vectors and a subset of the statistical operators. $|\psi\rangle$ corresponds to $|\psi\rangle\langle\psi|$).
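The bijection mentioned above is easy to check numerically. A quick sketch (the example vectors are arbitrary choices of mine): a pure-state operator $|\psi\rangle\langle\psi|$ is a rank-1 projector with purity Tr(ρ²) = 1, while a genuinely mixed ensemble has purity strictly below 1.

```python
import numpy as np

def to_density(psi):
    """Map a state vector |psi> to its statistical operator |psi><psi|."""
    psi = psi / np.linalg.norm(psi)
    return np.outer(psi, psi.conj())

psi = np.array([1, 1j]) / np.sqrt(2)
rho = to_density(psi)

# A pure-state operator is a rank-1 projector: rho^2 = rho and Tr(rho) = 1
assert np.allclose(rho @ rho, rho)
assert np.isclose(np.trace(rho).real, 1.0)

# A genuinely mixed ensemble (here: equal parts |0> and |1>) has purity < 1
rho_mixed = 0.5 * to_density(np.array([1, 0])) + 0.5 * to_density(np.array([0, 1]))
print(np.trace(rho @ rho).real)            # 1.0 (pure)
print(np.trace(rho_mixed @ rho_mixed).real)  # 0.5 (maximally mixed qubit)
```

Since the pure states sit inside the set of statistical operators as the rank-1 projectors, nothing is lost by taking the operators as the fundamental "states", which is exactly the point made above.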
