# Ballentine Quantum Mechanics

1. Sep 23, 2014

### dyn

Hi. I have posted a few questions in the Quantum section and been advised to get the Ballentine book. I know there is a new edition coming out soon, so I was waiting to get that, and I already have a few other quantum books. Recently I have seen a few posts talking about controversial issues in Ballentine. My question is: is Ballentine a good book to learn from, considering its controversial side? Where it is considered controversial, is it merely controversial, or do some people consider it incorrect? And where would that leave me if I used that knowledge in an exam?

2. Sep 23, 2014

### dextercioby

It's controversial because it rejects the dogma of the Copenhagen (typical textbook) interpretation (people also take issue with his treatment of the quantum Zeno effect, afaik). It's simply your choice: follow 99% of the books, or follow Ballentine.

3. Sep 24, 2014

### vanhees71

I don't see where Ballentine is controversial, but I'm biased, since I personally think that the minimal statistical interpretation (sometimes also called the ensemble interpretation) is the only scientifically necessary and confirmed interpretation of the quantum-theoretical formalism. In my opinion there is no need to read more into the formalism than that. In particular, there's no need for the collapse hypothesis or for the assumption that the wave function (which anyway is only well defined in non-relativistic QT) is a physical field in the classical sense. Even less is there a need for metaphysical speculations like many worlds, QBism, or de Broglie-Bohm mechanics. Ballentine gives very clear reasons for the minimal statistical interpretation.

Concerning the details wrt. the quantum Zeno effect, I'll have to look at the book again before I can make a statement. To my knowledge this is a well-understood and experimentally demonstrated effect: through the interaction between the quantum object and a measurement apparatus, you keep an unstable (or metastable) quantum state alive longer than when you let the quantum object evolve without interaction with that apparatus. As far as I understand, there's no more to wonder about here than about any other preparation process.
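The effect itself is easy to reproduce on paper. A minimal sketch (the two-level system, the Rabi frequency, and the function name are my own illustration, not from Ballentine): a system driven out of |0> survives in |0> after time t with probability cos^2(Omega*t/2), so measuring it n times at equal intervals over a total time T gives a survival probability that tends to 1 as n grows.

```python
import numpy as np

def survival_probability(omega, T, n):
    """Probability that a two-level system driven at Rabi frequency
    `omega` is still found in its initial state after total time T,
    when it is projectively measured n times at equal intervals."""
    return np.cos(omega * T / (2 * n)) ** (2 * n)

# One measurement at T = pi/omega: the system has fully flipped.
# Frequent measurements freeze it in the initial state (Zeno effect).
for n in (1, 10, 100, 1000):
    print(n, survival_probability(1.0, np.pi, n))
```

As n increases the survival probability climbs toward 1, which is exactly the "keeping the state alive longer" described above.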

If you use Bohr's version of Copenhagen, it's very close to the minimal interpretation. The only thing I'd abandon is the idea that there is a quantum and a classical realm with some cut between them; the problem is that this cut cannot be objectively defined. To my understanding, the classical behavior of macroscopic objects under usual everyday circumstances is an emergent phenomenon that can be understood from quantum theory via a coarse-graining procedure: if you look at a macroscopic object on macroscopic scales, what you see is a spatio-temporal average over many microscopic degrees of freedom, and these averaged macroscopic observables behave classically to a high degree of accuracy. From this point of view you have no trouble understanding that macroscopic objects can be prepared in states where they show quantum behavior (recently quantum entanglement was demonstrated between the vibrational states of macroscopic diamonds at a macroscopic distance, and this even at room temperature; other well-known quantum phenomena are superfluidity and superconductivity).

Last but not least, I want to stress that there's no place for dogmas in the natural sciences. You have models or theories which make predictions about the outcome of observations in nature, given the information you have about the object of investigation, and these predictions can be objectively and reproducibly put to the test. Everything else has no place in science. Quantum theory in the minimal statistical interpretation is among the best-tested models ever in this sense. That does not mean it is dogmatically true: there may be observations in the future that clearly and reproducibly contradict the predictions of quantum theory, and then we have to look for a better theory. That's how science works.

4. Sep 24, 2014

### atyy

A minimal interpretation without a Heisenberg cut is probably not correct. Landau and Lifshitz have it, as does Weinberg's description of Copenhagen. The cut is subjective, and can be placed in several places.

Ballentine doesn't explicitly reject the cut, but by rejecting collapse and stressing only unitary evolution, his point of view would be consistent with rejecting a cut. If a cut is rejected, then one can make physical sense of the wave function of the universe, and the problems of Many-Worlds would be solved.

Here http://arxiv.org/abs/1007.3977 is an example of how in each calculation in a minimal interpretation, there has to be a notional cut. In that example, the events and the probabilities of events are invariant in the sense of classical special relativity, and lie on the "macroscopic", "real" side of the cut. The wave function and unitary evolution and collapse are just calculational tools to calculate the probabilities of events. The wave function is not necessarily real and lies on the "quantum" side of the cut.

My recommendation is to begin with the quantum mechanics of Dirac, Landau and Lifshitz, Weinberg, Haag, or more standard texts like Griffiths, Zettili, Shankar, Sakurai, Le Bellac, Cohen-Tannoudji, Diu and Laloe, and Nielsen and Chuang. Ballentine is obviously recommended by many excellent physicists, but it should not be read as a first book.

Last edited: Sep 24, 2014
5. Sep 24, 2014

### vanhees71

We agree about the last paragraph, although I don't know all the books you list. For sure, Sakurai, Cohen-Tannoudji, and Weinberg are excellent books, as is Dirac's, though the latter sometimes uses unconventional notation. I'm not sure about Griffiths: questions sometimes come up in this forum (particularly the homework forum) which seem to indicate that Griffiths's book confuses its readers.

I'll have to study the preprint cited above first. But at first glance I don't understand why the author calls the registration of a photon on a photographic plate (or, more modern, a CCD) a "collapse". It is a registration of a photon, but not what is usually understood as collapse of the quantum state, namely that the state of the object ends up in an eigenstate of the self-adjoint operator representing the measured observable. This example shows very clearly how nonsensical that assertion about collapse really is: here it would mean that the photon state collapses to an eigenstate of the position operator. Even for a massive particle, where a position operator is defined, no such state exists, because the position operator has a purely continuous spectrum, and no normalizable eigenstate exists. For a photon, which is a massless particle of spin 1, there is no position operator in the strict sense at all. The only thing you can sensibly define is the probability for detecting the photon with a certain resolution, given by the resolution of the photographic plate or the pixel density of your CCD. Also, at the end you don't have a photon left at all: it is absorbed by the detector.

Also, where is the "cut" here? For sure, we treat the photographic plate in a classical way, saying that a silver-salt crystal gets "blackened" through the photochemical reaction with the light (or with a single photon in the described quantum-eraser experiment). The delayed-choice experiment also shows how nonsensical the collapse hypothesis is if you define it as a real process (I think we agree on interpreting the wave function/quantum state as a descriptor of our probabilistic knowledge of future observations, given the information about the system due to its preparation in this state, and not as a "real" quantity in the sense of a classical field or something like it). Does the collapse occur at the moment the photon is absorbed by the detector, or only when I, the observer, take notice of the registered position of this absorption process? Does that mean that by choosing a subensemble via the "delayed choice", i.e., whether I want which-way information or want to see interference, the collapse occurs and I change the past by making this choice? I'd answer "no" to all these questions! The result of the measurement is fixed at the "moment" the photon is absorbed by the detector (a reaction of the photon with the detector which takes a finite time, albeit very short on a macroscopic scale). The possibility to make a delayed choice is due to the entanglement of the two photons' polarizations, with the well-known 100% correlations between the single-photon polarization states although the latter are maximally indetermined (the reduced single-photon polarization state is the maximum-entropy statistical operator $1/2 \mathbb{1}$). This correlation is not due to the measurement of the polarization on one photon but due to the preparation of the two photons at the very beginning. So in the minimal interpretation, where I do not argue with a collapse, no issue with causality occurs.

I guess that's what the author of the paper is trying to state in different words, using the (imho very misleading) "collapse language".
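The two facts invoked above, perfect pair correlations together with a maximally mixed single-photon state, can be checked in a few lines. A sketch, assuming the prepared pair is the polarization Bell state (|HH> + |VV>)/sqrt(2) with basis ordering {HH, HV, VH, VW}; the state and ordering are my stand-ins, not taken from the experiment discussed:

```python
import numpy as np

# Polarization Bell state (|HH> + |VV>)/sqrt(2) in the basis
# {HH, HV, VH, VV}: a stand-in for the prepared photon pair.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Partial trace over photon 2 gives the reduced state of photon 1.
rho1 = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))
print(rho1)   # 0.5 * identity: the maximally mixed (1/2) * 1 state

# Joint probabilities in the H/V basis: perfect correlation,
# P(HH) = P(VV) = 1/2 while P(HV) = P(VH) = 0.
probs = np.abs(psi) ** 2
print(probs)
```

The single photon alone carries no polarization information (its reduced state is (1/2)*identity), yet the joint outcomes are 100% correlated, and both follow from the preparation alone.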

6. Sep 24, 2014

### atyy

Yes, my concern with Ballentine is: why does he differ from Landau and Lifshitz, Sakurai, Cohen-Tannoudji, and Weinberg? My understanding is that where Ballentine differs significantly from all these other texts (not just the occasional silly error that even Feynman made in his wonderful lectures on classical electrodynamics), it is Ballentine that is wrong and the other texts, like Landau and Lifshitz, that are right.

The basic idea is that if the measurement is simultaneous in one frame, it will be successive in another frame. In a frame in which the measurements are successive, the entangled state is changed to an unentangled state after the detection of one of the particles.

It is true that the projection postulate cannot deal with continuous variables. It is also true that the state immediately after a measurement need not be an eigenstate of the measured observable. However, the solution is not to reject the projection postulate but to generalize it. A generalization is given in http://arxiv.org/abs/0706.3526 (Eq 2 and 3). In this view, if one does not make successive measurements, then state reduction is not required, and one only needs an observable (POVM). If one does make successive measurements, then one defines a rule of state reduction by defining a quantum instrument, which also defines an observable. An instrument defines a unique observable, but an observable does not define a unique instrument http://arxiv.org/abs/0810.3536 (section 6.2.2).
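That last distinction can be made concrete. A minimal sketch (the qubit state, the projective POVM, and the "measure-and-reprepare" instrument are my own illustrations, not taken from the cited papers): the two instruments below are compatible with the same POVM and hence give identical outcome probabilities, but they assign different post-measurement states.

```python
import numpy as np

# A qubit state and the two projectors of a sharp measurement;
# here the POVM elements happen to be projective.
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

# The POVM alone fixes the outcome probabilities...
probs = [np.trace(E @ rho).real for E in P]

# ...but the post-measurement state needs an instrument.
# (1) Lueders instrument: rho -> P rho P / p
luders = [Pi @ rho @ Pi / p for Pi, p in zip(P, probs)]
# (2) measure-and-reprepare: always output |0><0|, whatever rho was
reprepare = [np.diag([1.0, 0.0]) for _ in P]

print(probs)   # identical for both instruments
```

Both instruments reproduce the same observable (the same outcome statistics), yet their state-reduction rules disagree for outcome 1, which is the sense in which an observable does not fix a unique instrument.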

For the case where the photon is destroyed, strictly speaking, one should use a second quantized description so that the state after the measurement is the state with one photon less http://arxiv.org/abs/1110.6815 (R2 on p13).

The "exact" place where the cut is put is problematic and fuzzy, and it is subjective, being in different places for different observers and different experiments. But what Copenhagen does is acknowledge the problem upfront and say that, in any case, we know when a macroscopic registration has happened. Of course, Copenhagen also does not acknowledge the language of "delayed choice" as formally correct. The important thing is to acknowledge the cut upfront and say that it is a problem in principle but has never been one in practice. After we acknowledge the problem, we just shut up and calculate, and we keep on using this minimal interpretation until observations falsify quantum theory. If we don't acknowledge the cut upfront, then we are forced to consider unitary evolution of the universe, Many Worlds, etc. That is why putting in a subjective cut is part of the minimal interpretation, and why Landau and Lifshitz put it right at the start of their book.

Last edited: Sep 24, 2014