
I started this thread to continue the following discussion from the thread:

https://www.physicsforums.com/showthread.php?p=2203566#post2203566



alxm said: The 'collapse' of the wave-function is just a weird Copenhagen-interpretation way of looking at things that makes the false assumption that a measurement is performed independently of the system being measured.

In reality two interacting systems cannot be separated, so I see no reason to believe wave functions *ever* truly 'collapse' in the Copenhagen sense.

jambaugh said: I believe you are misinterpreting the CI. In CI the collapse of the wave function is not qualitatively different from the classical analogue of updating a classical probability distribution given new information about the system. For example, prior to the drawing, the distribution over all tickets in a simple lottery is uniform. After the drawing it "collapses" to 100% for the winning ticket and 0 for the rest.

It is unfortunate that the term "collapse" is used. If you replace "collapse of the wavefunction" with "update of the wavefunction" in all texts you then get the correct application of the CI.

Now you are welcome to disagree with CI but please don't misrepresent it.
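That lottery "collapse" is nothing more than conditioning a classical distribution on new information. A minimal sketch, assuming a hypothetical 10-ticket lottery (the ticket count and winning ticket are made up for illustration):

```python
# Classical "collapse": updating a probability distribution on new information.
# Hypothetical 10-ticket lottery, uniform prior over which ticket wins.
n_tickets = 10
prior = [1.0 / n_tickets] * n_tickets   # uniform before the drawing

# The drawing announces that ticket 7 won; the distribution "collapses".
winner = 7
posterior = [1.0 if i == winner else 0.0 for i in range(n_tickets)]

# Nothing physical happened to the tickets; only our description changed.
print(posterior[winner])  # 1.0
```

The update is discontinuous, but no one takes that as a physical process acting on the tickets.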

alxm said: I understand what you're saying, but I don't see how this contradicts anything I said.

You're repeating the same underlying assumption, phrased differently. The point was: you *can't have* information about the system independently of the system. Say you have a system that's a superposition of two states: [tex]|\psi>_{measured} = |0>_{measured} + |1>_{measured}[/tex]. You're saying that you perform a 'measurement' and the state becomes either |0> or |1>. *How* do you measure a system? By interacting with it.

The result of such an interaction, when you model it *entirely quantum-mechanically*, is an entangled state between the 'measuring' and 'measured' systems. You don't really gain any information from interacting at the quantum level. Which is why the Copenhagen interpretation *assumes* classical measurement. That assumption is obviously false. In which case you have to ask where this 'collapse' supposedly comes from. That isn't to say it doesn't work, I already said it does. I'm saying it's simply not possible for it to be a true picture of what's going on, since the assumption it's based on is known to be false.
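The claim that a fully unitary model of the interaction yields entanglement rather than a definite record can be checked in the smallest case: a qubit in an equal superposition coupled to a two-level "pointer" by a CNOT-style interaction. This is a toy model assumed here for illustration, not a model of any particular apparatus:

```python
import numpy as np

# System qubit in an equal superposition; pointer starts in |0>.
system = np.array([1.0, 1.0]) / np.sqrt(2.0)
pointer = np.array([1.0, 0.0])
joint_in = np.kron(system, pointer)          # product (unentangled) state

# CNOT: the pointer flips iff the system is in |1> -- a toy "measurement"
# interaction, modeled entirely unitarily. Basis order |s p>: index = 2s + p.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
joint_out = cnot @ joint_in                  # (|00> + |11>)/sqrt(2), a Bell state

# Entanglement check: trace out the pointer; the system alone is mixed.
rho = np.outer(joint_out, joint_out)         # pure joint density matrix
rho_sys = np.array([[rho[0, 0] + rho[1, 1], rho[0, 2] + rho[1, 3]],
                    [rho[2, 0] + rho[3, 1], rho[2, 2] + rho[3, 3]]])
purity = np.trace(rho_sys @ rho_sys)
print(purity)  # ~0.5 < 1: maximally mixed, so system and pointer are entangled
```

The unitary step produces correlation but no single outcome, which is exactly the situation both sides of this exchange agree on; the disagreement is over what the subsequent "collapse" means.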

What I'm talking about is essentially what Steven Weinberg is talking about in the quote on WP's "Copenhagen interpretation" page: http://en.wikipedia.org/wiki/Copenhagen_interpretation

But you and Weinberg are both making a mistake in ascribing ontological status to both the collapse of the wave function and to entanglement within the Copenhagen interpretation (CI). It is a subtle mistake, but it is there in the distinction being made between a **classical** and a **quantum mechanical** description of the measuring device.

Yes, the measuring device and observer are fundamentally more accurately described quantum mechanically, but there is no **physical** barrier between quantum and classical description across which information must traverse.

Yes, the act of measurement is an **interaction** and creates entanglement between measuring device and measured system. But *entanglement* itself is not a physical property of a system but rather a property of the system's description, in particular of how the system is subdivided into component subsystems. This is not to say entanglement is an illusion; it has real physical consequences, but these are again qualitatively no different from classical correlation.

Now as to Weinberg's critique of CI, the quote cited:

Steven Weinberg in "Einstein's Mistakes", Physics Today, November 2005, page 31, said:

All this familiar story is true, but it leaves out an irony. Bohr's version of quantum mechanics was deeply flawed, but not for the reason Einstein thought. The Copenhagen interpretation describes what happens when an observer makes a measurement, but the observer and the act of measurement are themselves treated classically. This is surely wrong: Physicists and their apparatus must be governed by the same quantum mechanical rules that govern everything else in the universe. But these rules are expressed in terms of a wave function (or, more precisely, a state vector) that evolves in a perfectly deterministic way. So where do the probabilistic rules of the Copenhagen interpretation come from?

Considerable progress has been made in recent years toward the resolution of the problem, which I cannot go into here. It is enough to say that neither Bohr nor Einstein had focused on the real problem with quantum mechanics. The Copenhagen rules clearly work, so they have to be accepted. But this leaves the task of explaining them by applying the deterministic equation for the evolution of the wave function, the Schrödinger equation, to observers and their apparatus.

The problem of thinking in terms of classical measurements of a quantum system becomes particularly acute in the field of quantum cosmology, where the quantum system is the universe.

Where he says "this is surely wrong..." he makes the mistake of identifying the laws governing the behavior of the physical system with the choice of the **description** of the system. The classical description of the Moon is not "wrong" because the Moon fundamentally obeys quantum mechanical rules; it is merely less than maximal. The critical issue, however, is that we desire to speak in absolute terms about a specific outcome of a measurement. This dictates a classical description of the record of that measurement. There must be a classical/quantum cut between the piece of paper on which one writes down the result and the actual system for which this measurement is recorded.

Where you and he state that the act of measurement is treated classically in CI, "which is obviously false", the statement itself is false. It is the **record** of the measurement which is treated classically (or, at worst, the gross variables of the position and velocity of the measuring device, where applicable). The act (those variables of the measuring device which interact with the system) is left without detailed description, except that it must obey the rules predicted by QM, i.e. one must be able to re-measure the system immediately and obtain the same result for the act to be called a "measurement".

When he speaks of the wave-function evolving in a perfectly deterministic way and asks where the quantum probabilities come from, he is again subtly confusing the wave-function with the physical system. The probabilities were there from the beginning; they are what the wave-function is modeling. It is not qualitatively different from the deterministic evolution of a *classical probability distribution in phase space*. One obviously doesn't puzzle over whence those probabilities arise. Why should one in the QM case?

As to his problem with CI in quantum cosmology, I assert that it is a problem with quantum cosmology, not with CI, in that it is meaningless to describe a wave-function for the "universe as a whole" beyond a very trivial one on a one-dimensional mode space ("it exists!"). All other measurements must be comparative and thus applied to a part of the universe w.r.t. the rest of the universe. But that's a whole other thread in itself.
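The phase-space analogy can be made concrete with a toy discrete "phase space": a classical distribution pushed forward by perfectly deterministic dynamics (here a cyclic shift over four states, with made-up initial probabilities). The evolution is deterministic; the probabilities are inputs supplied at the start, not products of the dynamics:

```python
# Deterministic evolution of a classical probability distribution.
# Dynamics: each state i moves to state (i + 1) % n -- fully deterministic.
n = 4
p = [0.1, 0.2, 0.3, 0.4]          # some initial classical distribution

def step(dist):
    """One deterministic time step: probability mass is just carried along."""
    return [dist[(i - 1) % n] for i in range(n)]

p_t = p
for _ in range(3):
    p_t = step(p_t)

print(p_t)  # [0.2, 0.3, 0.4, 0.1]
```

No one asks where these probabilities "come from" when the map is deterministic; they express our initial ignorance of the state, which the dynamics merely transports.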

Finally, as to a quantum treatment of the measurement process. First remember that measurement is a dissipative process. It is fundamentally an act of amplification and requires an entropy dump (heat sink). The meta-system of system + measuring device must then be described within QM using density operators, wherein both classical and quantum probabilities may be expressed. You must also describe the coupling to the entropy dump, but you fundamentally cannot describe the entropy dump itself; it is by definition beyond observation. At best one introduces a noisy component to the Hamiltonian of the system to model such a coupling.
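One way to see what such a noisy Hamiltonian term does is a pure-dephasing sketch; the random-phase-kick model below is an assumption chosen for illustration, not a claim about any specific apparatus. Averaging a qubit's density matrix over a random relative phase destroys the off-diagonal coherences and leaves only the classical diagonal probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Qubit in an equal superposition: off-diagonal coherences are maximal.
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho0 = np.outer(psi, psi.conj())

# Noisy coupling to an unobservable entropy dump, modeled (assumption) as a
# random phase kick theta on |1>; average the density matrix over the noise.
n_samples = 20000
rho_avg = np.zeros((2, 2), dtype=complex)
for theta in rng.uniform(0.0, 2.0 * np.pi, n_samples):
    u = np.diag([1.0, np.exp(1j * theta)])
    rho_avg += u @ rho0 @ u.conj().T
rho_avg /= n_samples

# Diagonal (classical) probabilities survive; coherences average toward zero.
print(np.round(rho_avg.real, 2))  # diagonals ~0.5, off-diagonals ~0
```

The resulting density operator is the one whose diagonal entries carry the same classical probabilities the CI prescription reads off directly.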

So you set up your act of measurement... the meta-system evolves and quickly decoheres into a composite system with classical correlation between the measuring device, carrying its record of the possible outcomes with a classical set of probabilities, and the various "collapsed" wave-functions for the measured system. All the entanglement gets shifted into the entropy dump.

If you wish, you can model part of the entropy dump, but when you do the partial trace over it to get the reduced density operator for the meta-system, you get the same situation.
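A minimal version of that partial trace, with a single qubit as a deliberately crude stand-in for the entropy dump: tracing the environment out of the post-interaction state leaves a system+pointer density operator that is diagonal, i.e. only classically correlated:

```python
import numpy as np

# Basis ordering |s p e>: index = 4*s + 2*p + e (system, pointer, environment).
# As a stand-in for the post-measurement state, take (|000> + |111>)/sqrt(2):
# system and pointer are correlated, and the phase information has leaked
# into the environment qubit e.
psi = np.zeros(8)
psi[0] = psi[7] = 1.0 / np.sqrt(2.0)
rho = np.outer(psi, psi)

# Partial trace over the environment qubit e (the last, fastest-varying index).
rho_sp = np.zeros((4, 4))
for a in range(4):
    for b in range(4):
        rho_sp[a, b] = rho[2 * a, 2 * b] + rho[2 * a + 1, 2 * b + 1]

# The reduced density operator is diagonal: a classical 50/50 mixture of
# "system 0, pointer reads 0" and "system 1, pointer reads 1", with no
# surviving coherence between the two outcomes.
print(np.round(rho_sp, 3))
```

The off-diagonal elements that would signal quantum interference between the two records are exactly zero here; they reside in the traced-out environment.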

At this point you choose one outcome in exactly the same sense as a classical outcome of a classical random variable. The result is the selection of one measurement outcome, with the corresponding "collapsed" wave function describing the future behavior of the system. The CI prescription just skips the busywork, since we only care about our description of the system in hand.

This is the measurement process, and you can't alter it significantly without invalidating the device as a **measuring device**.

Within CI there is no **measurement problem**. It only becomes a problem when you reify the wave function, as in one of the alternative interpretations.

[PS: I ought to write all this up in formal terms with proofs, or citations to proofs, of the assertions I am making. I have the summer off and that sounds like a good project! I can never find all the pieces in one place.]
