Can a conscious observer collapse the probability wave?

In summary, there is debate about whether a conscious observer is necessary to collapse the wave function in quantum mechanics, but there is no experimental evidence that consciousness plays any special role. It is the recording of which-path information that determines whether interference is lost, and human memory is not a reliable recording device, so a conscious observer would not be an effective means of collapsing the wave function. In experiments, an interference pattern is expected when the which-path information is not recorded, and no interference pattern when it is.
  • #71
The basic question is simple: does a truly isolated system, regardless of size, really evolve via the Schroedinger equation, or doesn't it? There is no way out-- this question must be answered, and it makes no difference if one takes an instrumentalist or realist view, the question persists.
 
  • #72
The answer is yes - the total system - environment, system being measured, and measuring apparatus - does evolve by the Schrodinger equation. However, via decoherence, phase information leaks to the environment, transforming the pure state of the observed subsystem into a mixed state. The mixed state can be interpreted as the system being in an eigenstate of the measurement apparatus - but only probabilities can be assigned - we do not know which one. The arbitrariness of the pure states a mixed state can be decomposed into is removed by the definiteness of the possible states of the measurement apparatus.
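A toy numerical sketch of this pure-to-mixed transition (illustrative only; a single system qubit entangled with a two-level stand-in for the environment, all details assumed for the example):

Code:
import numpy as np

# System qubit perfectly entangled with a two-level "environment".
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)   # system (x) environment

rho_total = np.outer(psi, psi.conj())                  # pure state of the total system
rho_sys = rho_total.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out the environment

print(np.round(rho_sys, 3))                            # diag(0.5, 0.5): coherences are gone
print(np.allclose(rho_sys @ rho_sys, rho_sys))         # False: the reduced state is mixed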

http://arxiv.org/pdf/quant-ph/0312059v4.pdf
The reduced density matrix looks like a mixed state density matrix because, if one actually measured an observable of the system, one would expect to get a definite outcome with a certain probability; in terms of measurement statistics, this is equivalent to the situation in which the system is in one of the states from the set of possible outcomes from the beginning, that is, before the measurement. As Pessoa (1998, p. 432) puts it, “taking a partial trace amounts to the statistical version of the projection postulate.”

This does not resolve the measurement problem because it does not explain how a particular outcome is selected. But for all practical purposes it does, because there is no way to observationally distinguish the two situations - one where the system really is in a definite state prior to observation and one where it merely behaves as if it were, with only probabilities predictable in either case.

Regarding the reality of a system state, it is not in general possible to have an observable that tells you what state a system is in - for pure states you can, but for mixed states you can't. This suggests to me it's like probabilities - not something real but rather a codification of knowledge about the system. It does not prove it is not real either - it simply seems more reasonable not to assume it - but opinions are like bums - everyone has one - that does not make it right. The problem of a real system state collapsing via measurement is solved by decoherence.

Thanks
Bill
 
  • #73
Ken G said:
The basic question is simple: does a truly isolated system, regardless of size, really evolve via the Schroedinger equation, or doesn't it? There is no way out-- this question must be answered, and it makes no difference if one takes an instrumentalist or realist view, the question persists.

Is this really controversial? That an isolated system evolves according to the SE has been the implicit assumption in a vast number of models and agrees with every experiment I know of; the better you isolate your system the more it behaves like an ideal QM system.

Furthermore, nowadays we've reached a point where when a system does NOT evolve according to the SE we often know why, i.e. we understand the interactions with the environment quite well (which doesn't necessarily mean that we know how to reduce them). It is this understanding which has allowed us to e.g. push the coherence time of solid state qubits from tens of nanoseconds ten years ago, to hundreds of microseconds today.
 
  • #74
f95toli said:
Is this really controversial? That an isolated system evolves according to the SE has been the implicit assumption in a vast number of models and agrees with every experiment I know of; the better you isolate your system the more it behaves like an ideal QM system. Furthermore, nowadays we've reached a point where when a system does NOT evolve according to the SE we often know why, i.e. we understand the interactions with the environment quite well (which doesn't necessarily mean that we know how to reduce them). It is this understanding which has allowed us to e.g. push the coherence time of solid state qubits from tens of nanoseconds ten years ago, to hundreds of microseconds today.

Exactly. To me it's simply the modern view as detailed in the paper by Schlosshauer I linked to. Really scratching my head why it's not more or less the generally accepted wisdom and why discussions still go on about it. One almost gets the feeling some want it to be more complicated than it really is.

I have read that when Wigner first heard, from some early papers by Zurek, how decoherence addressed the measurement problem, he immediately recognised that it removed the necessity for ideas he had been partial to, such as consciousness causing collapse. Since then we have deepened our understanding but the basic message seems to be the same - the measurement problem has now largely been solved. Issues do remain and research is ongoing, but as far as I can see the more 'mystical' ideas such as consciousness causing collapse no longer have traction.

Thanks
Bill
 
  • #75
Decoherence does not solve the measurement problem. It neither solves the reduction to a single observed state nor does it explain the Born rule. Claims that it does are based on a misinterpretation of the meaning of the density operator constructed by tracing over the environment.

See my blog at http://aquantumoftheory.wordpress.com for how the measurement problem can be approached in a more coherent way.
 
  • #76
Jazzdude said:
Decoherence does not solve the measurement problem. It neither solves the reduction to a single observed state nor does it explain the Born rule. Claims that it does are based on a misinterpretation of the meaning of the density operator constructed by tracing over the environment.

I don't think it explains the Born rule - but I believe Gleason's theorem does - unless you really want to embrace contextuality. I have been carefully studying Schlosshauer's book on decoherence and feel confident the quote I gave is correct. If not, there has been some hard-to-spot error that a lot of people missed - possible of course - but that would not be my initial reaction. It most certainly does not explain how a particular state is singled out, but it does explain how the system is in an eigenstate prior to observation.

Mind giving us a cut down version of exactly where the math of tracing over the environment fails?

Thanks
Bill
 
Last edited:
  • #77
bhobba said:
I will look it up but please answer me a simple question. Given the mixed state 1/2 |a><a| + 1/2 |b><b|, what is the corresponding observable that will tell us it is in that state? And if you can't come up with one, why do you think it's real?

Added Later:

Looked it up - could not find any article using it as evidence a state is real. Exactly why do you believe it proves it?

Thanks
Bill
What you claimed was a “mixed state” is really a projection operator. Projection operators aren't states at all. Projection operators can be described by "defective" matrices and states can be described as vectors. By defective, I mean that the projection operator doesn't have as many linearly independent eigenvectors as it has eigenvalues. In any case, what you wrote can't be a state. I think that I know what you meant, though.
I assume that what you meant is the two photon state “|a>|a>+|b>|b>” which isn’t a mixed state either. However, it is at least a state. I think the question that you were trying to ask is what corresponding observable will tell us if two particles are actually in that entangled state.
If this is what you are asking, then you really want to know how to construct a Bell state analyzer. I will address that question. If I misunderstood your question, then no harm done.
The expression that you intended to write describes a two boson entangled state where two bosons are in the same single photon state. There are at least three other entangled states with different expressions. These are called boson Bell states. Hypothetically, one can determine whether a two photon state is in one of the Bell states or in a mixed state.
For completeness, I will write down the four Bell states. This way, we can discuss the experiments easier.
The letter “a” will represent the horizontal polarization vector and “b” will represent the vertical polarization vector.
A=|a>|a>
B=|b>|b>
C=|a>|a>+|b>|b>
D=|a>|b>+|b>|a>
These are called the Bell states. The Bell state that you presented is C.
One can build an environment where the four states are separately stationary. Stationary means the probability of being in this state is independent of time and trial number. A mixed state would not be stationary. The probability of paired photons being in any one of the four states changes with time in a mixed state.
The precise definition of horizontal and vertical varies with the geometry of the measuring instrument. However, given an ideal apparatus these states are unambiguous. A mixed state with two bosons would be a superposition of at least two of these four states.
Any two photon quantum state can be expressed as,
E=wA+xB+yC+zD.
Determining w, x, y and z would involve making coincidence measurements with polarizers and mirrors. If any one of these parameters equals 1, and the others 0, then E is identified with one of those states. The more two-photon coincidences detected, the greater the precision of the measured parameters. A mixed state would involve any of these four parameters being between 0 and 1, non-inclusive. I will give some references concerning the experimental determination of the state of a two-photon system. Some of the articles provide a schematic of the apparatus they used, and the experimental protocol is also described.
A Bell state analyzer is a device for determining the state of a two photon system. Descriptions of the apparatus are shown in each article. Diagrams of the apparatus are shown in the next two articles.
http://arxiv.org/pdf/quant-ph/0410244v2.pdf
“Experimental Realization of a Photonic Bell-State Analyzer
Efficient teleportation is a crucial step for quantum computation and quantum networking. In the case of qubits, four different entangled Bell states have to be distinguished. We have realized a probabilistic, but in principle deterministic, Bell-state analyzer for two photonic quantum bits by the use of a non-destructive controlled-NOT (CNOT) gate based on entirely linear optical elements. This gate was capable of distinguishing between all of the Bell states with higher than 75% fidelity without any noise subtraction due to utilizing quantum interference effects.”

http://www.univie.ac.at/qfp/publications3/pdffiles/1996-04.pdf
“We present the experimental demonstration of a Bell-state analyzer employing two-photon interference effects. Photon pairs produced by parametric down-conversion allowed us to generate momentum-entangled Bell states and to demonstrate the properties of this device. The performance obtained indicates its readiness for use with quantum communication schemes and in experiments on the foundations of quantum mechanics.”

Here is some theory. By theory, I mean a hypothetical description of the experiment.
http://en.wikipedia.org/wiki/Bell_test_experiments
“Bell test experiments or Bell's inequality experiments are designed to demonstrate the real world existence of certain theoretical consequences of the phenomenon of entanglement in quantum mechanics which could not possibly occur according to a classical picture of the world, characterised by the notion of local realism. Under local realism, correlations between outcomes of different measurements performed on separated physical systems have to satisfy certain constraints, called Bell inequalities.”

A theoretical discussion on the Bell states is given here.
http://en.wikipedia.org/wiki/Bell_state
“The Bell states are a concept in quantum information science and represent the simplest possible examples of entanglement. They are named after John S. Bell, as they are the subject of his famous Bell inequality.”
 
Last edited:
  • #78
Ken G said:
The basic question is simple: does a truly isolated system, regardless of size, really evolve via the Schroedinger equation, or doesn't it? There is no way out-- this question must be answered, and it makes no difference if one takes an instrumentalist or realist view, the question persists.
According to decoherence theory, the isolated system containing the environmental system and the probed system really does evolve by the Schroedinger equation. The "randomness" of the measured results corresponds to unknown phases in the environmental system. There is an assumption here that there are far more unknown phases in the environmental system than in the measured system. Thus, the environment is considered complex.
One question that I haven't entirely satisfied in my own mind is why you can't consider the unknown phases as "hidden variables". The answer, to the degree that I understand it, is that the unknown phases in the decoherence model do not have the properties of the "hidden variables" defined in Bell's theorem. When Bell proved that local "hidden variables" cannot reproduce quantum mechanics, he carefully defined "hidden variable" in a mathematically formal way. However, the phases of the waves in decoherence theory are "variables" and they are "hidden" in the broadest meaning of the words.
I am not sure, so I would like someone else to comment. Maybe somebody could answer my questions.
1) Why can't the unknown phases in the environment of the probed system be considered "hidden variables"?
2) Why isn't "decoherence theory" ever called a "hidden variable" theory?
 
  • #79
Darwin123 said:
What you claimed was a “mixed state” is really a projection operator.
No, because squaring it doesn't yield the same operator. It is a weighted sum of projection operators, which is a special form of a mixed state operator. You don't seem to be familiar with the density matrix formalism which is essential for talking about decoherence in modern terms.
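A quick numerical check of this point (a and b here are just a hypothetical orthonormal basis):

Code:
import numpy as np

a = np.array([1, 0], dtype=complex)
b = np.array([0, 1], dtype=complex)
rho = 0.5 * np.outer(a, a.conj()) + 0.5 * np.outer(b, b.conj())     # the state bhobba wrote down

print(np.allclose(rho @ rho, rho))                                  # False: not a projection operator
print(np.trace(rho).real, np.all(np.linalg.eigvalsh(rho) >= 0))     # 1.0 True: a valid density operator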

Darwin123 said:
Why isn't "decoherence theory" ever called a "hidden variable" theory?
Because the environment isn't in a pure state either, and knowing its state doesn't help you to further specify the state of the system.
 
Last edited:
  • #80
We need to be a bit careful about when we talk about "decoherence theory". It is important to understand that this is NOT an interpretation (although elements of it can of course be used to formulate interpretations if you are interested, which I am not)
Hence, I don't think anyone claims that it solves all philosophical problems with QM. However, what it DOES do is to give us quantitative ways of modelling decoherence of quantum systems.
Or, in other words, its predictions match experimental data.

Everyone I know (myself included) who tries to make their systems behave "more quantum mechanically" (i.e. increase coherence times and so on) has as a working assumption that the reason we can't keep the cat half-dead (so to speak) for as long as we want is that we are not yet good enough at controlling the interaction with the environment.
Furthermore, the fact that we often model this (say using a Lindbladian) with some phenomenological times T1, T2, T2* etc does not mean that we are appealing to some unknown mechanisms. We quite often have quite a good idea of what is limiting us: in solid state systems it would be e.g. two-level fluctuators in the substrate, itinerant photons because of inadequate filtering etc; in ion-microtraps heating of the metallization etc.
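For concreteness, here is a minimal sketch of the kind of phenomenological modelling being described: pure dephasing of a single qubit in Lindblad form, with an assumed dephasing time T_phi standing in for a measured T2 (all numbers are made up for illustration):

Code:
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
T_phi = 10.0                                              # assumed dephasing time, arbitrary units
rho = 0.5 * np.array([[1, 1], [1, 1]], dtype=complex)     # qubit prepared in |+><+|

dt, steps = 0.01, 2000
for _ in range(steps):
    # pure dephasing only: d(rho)/dt = (sz rho sz - rho) / (2 T_phi), no coherent term
    rho = rho + dt * (sz @ rho @ sz - rho) / (2 * T_phi)

print(np.round(rho, 3))   # off-diagonals decay as exp(-t/T_phi); populations stay at 0.5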

There are LOTS of us out there working to solve these problems, and the vast majority of us have very little interest in the "philosophy of QM": we just want our devices to work better and "decoherence theory" gives us a route for improvement.
 
  • #81
f95toli said:
There are LOTS of us out there working to solve these problems, and the vast majority of us have very little interest in the "philosophy of QM": we just want our devices to work better and "decoherence theory" gives us a route for improvement.

That sounds like good philosophy. I don't think the "Copenhagen interpretation" gives us a route for improvement.
 
  • #82
bhobba said:
I don't think it explains the Born rule - but I believe Gleason's theorem does - unless you really want to embrace contextuality. I have been carefully studying Schlosshauer's book on decoherence and feel confident the quote I gave is correct. If not, there has been some hard-to-spot error that a lot of people missed - possible of course - but that would not be my initial reaction. It most certainly does not explain how a particular state is singled out, but it does explain how the system is in an eigenstate prior to observation.

Mind giving us a cut down version of exactly where the math of tracing over the environment fails?

First of all, I have read Schlosshauer's publications too, and Wallace's and Zurek's and all the others. And I still disagree with what you say essentially, and I'm not the only one who does.

First of all, Gleason's theorem doesn't really help in deriving the Born rule. It just asserts that the Born rule is the only rule that fulfills some more or less sensible constraints. But it does not explain at all how a measurement is realized, where the randomness would come from and why the states are reduced.

Decoherence also does not explain any of that. It only explains that the ability to interfere is lost after a suitable interaction with the environment. And it specifically says nothing about systems being in eigenstates, unless you make additional, and questionable, assumptions.

There is nothing wrong with tracing over the environment, but you have to be careful with the interpretation of the result of this operation. A state that is reduced by tracing over the environment is the best option for representing the remaining information in the subsystem; it is not an ensemble. And it is also not indistinguishable from an ensemble in the first place. Only after you introduce the measurement postulate can you sensibly construct density operators that represent properties of ensembles and then argue that, with the measurement postulate, a density operator from tracing and one from an ensemble construction are indistinguishable under a measurement. Any attempt to use this to make a statement about decoherence as an essential ingredient of the measurement process will therefore result in a circular argument. This is known and has been discussed often enough. Some experts in the field don't want to see it, just like others stick to their own explanation. There are good arguments against all proposed solutions of the measurement problem and no agreement at all. So your question of why the discussion continues has a simple answer: because the problem has not been solved yet.
 
  • #83
kith said:
No, because squaring it doesn't yield the same operator. It is a weighted sum of projection operators, which is a special form of a mixed state operator. You don't seem to be familiar with the density matrix formalism which is essential for talking about decoherence in modern terms.


Because the environment isn't in a pure state either, and knowing its state doesn't help you to further specify the state of the system.
I was thinking in terms of the Schroedinger picture rather than the Heisenberg picture.
In the Schroedinger picture, it is the vector that evolves in time. The vector represents a physical system that is "rotating" in time. The rotation represents changes in the physical system.
The operator in the Schroedinger picture is stationary. The operator represents an operation on the coordinates. One could say that it represents a set of mathematical axes rather than a physical system. In the Schroedinger picture, a projection operator could represent a measurement.
The density matrix that you are talking about would result from the projection operator being sandwiched between the world vector and the world vector. The members of the density matrix are the matrix elements. The numerical values of these matrix elements are invariant to which picture (Schroedinger or Heisenberg) is being used. The matrix elements are considered physical. The probabilities are actually determined by the matrix elements.
The operator is not precisely the same as the density matrix. In any case, I was wrong to call you wrong. You are using a Heisenberg picture, not a Schroedinger picture. Everyone was talking about the Schroedinger equation, so I assumed that you would be using the Schroedinger picture.
In any case, I think that more and more of the system is being included in the vector rather than the operators. I mean by this that physicists are including more and more of the experimental apparatus in the Schroedinger equation. So they now have better "rules" for deciding what is the measurement and what is the system.
That is all that I meant by "decoherence theory." If the trend continues for including parts of the experimental apparatus in the wave equation, then eventually they may get to the point where both apparatus and system are analyzed by some form of Schroedinger equation.
Or the physicists won't go the entire way. Then "decoherence theory" will "only" be a route to improving the measurements.
In the example that I gave in another post, the evolution of a C60 molecule was modeled with the thermal radiation that it gives off. In previous decades, the thermal radiation would merely have been considered a part of the "classical" apparatus. So I think that mostly answers your question. In principle, the composite system including environment and subsystem satisfies Schroedinger's equation.
Everything is all waves, and there are no particles!
 
  • #84
Darwin123 said:
I was thinking in terms of the Schroedinger picture rather than the Heisenberg picture.
...
In any case, I was wrong to call you wrong. You are using a Heisenberg picture, not a Schroedinger picture. Everyone was talking about the Schroedinger equation, so I assumed that you would be using the Schroedinger picture.

You are entirely on the wrong track. This has nothing to do with Schroedinger or Heisenberg picture. Density operators are a generalization of state vectors to allow for the description of the time evolution of subsystems (tensor factor spaces) of a unitarily evolving system. They are also used to represent classical ensembles of quantum states in a way that is compatible with the measurement postulate.
 
  • #85
Darwin123 said:
What you claimed was a “mixed state” is really a projection operator. Projection operators aren't states at all.

By definition a state is a positive operator of trace 1. A rank-one projection operator is such an operator and is a special type known as a pure state. The rest are called mixed states and it can be proved they are convex sums of pure states, ie of the form Σ a_i |b_i><b_i| with a_i ≥ 0 summing to 1. As mentioned before the average of an observable R is Tr(pR) where p is the system state. If p is the pure state |a><a| then Tr(|a><a| |a><a|) = 1, ie we have an observable, namely |a><a|, that will tell us with 100% certainty whether a system is in that pure state, and its outcome is the same pure state. In that sense you can consider a pure state as real. But the same does not apply to a mixed state, as you will see if you work through the math of say 1/2 |a><a| + 1/2 |b><b|.
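Working through that math numerically (a and b again a hypothetical orthonormal basis):

Code:
import numpy as np

a = np.array([1, 0], dtype=complex)
b = np.array([0, 1], dtype=complex)
P_a = np.outer(a, a.conj())

rho_pure = P_a                                           # the pure state |a><a|
rho_mixed = 0.5 * P_a + 0.5 * np.outer(b, b.conj())      # 1/2 |a><a| + 1/2 |b><b|

print(np.trace(rho_pure @ P_a).real)                     # 1.0: measuring |a><a| certifies the pure state
# For the mixed state no rank-one projector ever gives 1; any |v><v| gives 0.5 here:
v = np.cos(0.3) * a + np.sin(0.3) * b
print(np.trace(rho_mixed @ np.outer(v, v.conj())).real)  # 0.5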

Thanks
Bill
 
Last edited:
  • #86
Jazzdude said:
First of all, I have read Schlosshauer's publications too, and Wallace' and Zurek's and all the others. And I still disagree with what you say essentially, and I'm not the only one who does.

Having discussed the issue here and elsewhere there are those that do disagree with me for sure - but there are plenty that don't.

Jazzdude said:
First of all, Gleason's theorem doesn't really help in deriving the Born rule. It just asserts that the Born rule is the only rule that fulfills some more or less sensible constraints. But it does not explain at all how a measurement is realized, where the randomness would come from and why the states are reduced.

Gleason's theorem shows, provided you make sensible assumptions, that the usual trace formula follows. Of course it does not explain the mechanism of collapse but I do believe it does explain why randomness enters the theory. Determinism is actually contained in a probabilistic theory - as the special case where the only probabilities allowed are 0 or 1. That assumption though is inconsistent with the trace formula - that of course is the Kochen-Specker theorem, but it follows quite easily from Gleason.

Jazzdude said:
Decoherence also does not explain any of that. It only explains that the ability to interfere is lost after a suitable interaction with the environment. And it specifically says nothing about systems being in eigenstates, unless you make additional, and questionable, assumptions.

I can't follow you here. From the paper posted before:
'Interaction with the environment typically leads to a rapid vanishing of the off-diagonal terms in the local density matrix describing the probability distribution for the outcomes of measurements on the system. This effect has become known as environment-induced decoherence, and it has also frequently been claimed to imply at least a partial solution to the measurement problem.'

Since the off-diagonal elements are for all practical purposes zero, it is a mixed state of eigenstates of the measurement apparatus. By the usual interpretation of a mixed state that means it is in an eigenstate but we do not know which one - only probabilities. It has been pointed out, correctly, that a mixed state is not uniquely decomposable into pure states, so that is not a correct interpretation. However it is now entangled with the measurement apparatus whose eigenstates ensure it can be so decomposed.

Now, from what you write, is your issue with the above that since it invokes the Born rule it is somehow circular? I have read that before but can't really follow it. I do not believe decoherence explains the Born rule - I have read the envariance argument and do not agree with it - but base my explanation of it on Gleason. What Gleason does is constrain the possible models the formalism of QM allows. It tells us nothing about how an observation collapses a state. But it does tell us what any model consistent with QM must contain and it is that I take as true in decoherence.

Thanks
Bill
 
Last edited:
  • #87
f95toli said:
We need to be a bit careful about when we talk about "decoherence theory". It is important to understand that this is NOT an interpretation (although elements of it can of course be used to formulate interpretations if you are interested, which I am not). Hence, I don't think anyone claims that it solves all philosophical problems with QM. However, what it DOES do is to give us quantitative ways of modelling decoherence of quantum systems. Or, in other words, its predictions match experimental data.

Of course. It most definitely does NOT solve all the philosophical problems with QM - but it does resolve some of them, such as Schrodinger's cat: with decoherence taken into account it definitely is alive or dead, not in some weird superposition.

Thanks
Bill
 
  • #88
bhobba said:
Gleason's theorem shows, provided you make sensible assumptions, that the usual trace formula follows. Of course it does not explain the mechanism of collapse but I do believe it does explain why randomness enters the theory. Determinism is actually contained in a probabilistic theory - as the special case where the only probabilities allowed are 0 or 1.

It's very unclear why any of the assumptions of Gleason's theorem are mandatory in quantum theory, even why we should assign probabilities to subspaces at all. Your argument is therefore based on assumptions that I don't share. But even if I did, the lack of a mechanism that actually introduces the randomness spoils the result. Deterministic mechanisms don't just turn random because we introduce an ad-hoc probability measure.

'... This effect has become known as environment-induced decoherence, and it has also frequently been claimed to imply at least a partial solution to the measurement problem.'

Yes, claimed, but never sufficiently backed up.

Since the off-diagonal elements are for all practical purposes zero, it is a mixed state of eigenstates of the measurement apparatus. By the usual interpretation of a mixed state that means it is in an eigenstate but we do not know which one - only probabilities.

That's precisely where your argument goes wrong. A subsystem description of a single state has the same mathematical form as a mixed (or ensemble) state, but it is still a single state of a single system. Interpreting it as an ensemble of whatever constituents with classical probabilities is just wrong. Now some argue that it is not an ensemble, but that it is indistinguishable from an ensemble. And this argument only holds if you already assume the full measurement postulate. So it does not contribute anything at all to solving the measurement problem.


It has been pointed out, correctly, that a mixed state is not uniquely decomposable into pure states, so that is not a correct interpretation. However it is now entangled with the measurement apparatus whose eigenstates ensure it can be so decomposed.

That's the least of the problems; however, even with entanglement the branches are not uniquely determined. The preferred basis problem is also not solved entirely, even though good general arguments do exist.


I do not believe decoherence explains the Born rule - I have read the envariance argument and do not agree with it - but base my explanation of it on Gleason. What Gleason does is constrain the possible models the formalism of QM allows. It tells us nothing about how an observation collapses a state. But it does tell us what any model consistent with QM must contain and it is that I take as true in decoherence.

Even then you must agree that the measurement problem is not solved. And just for the record, envariance and the similar decision theory based arguments do not work either. They contain assumptions that are practically equivalent to stating the Born rule.
 
  • #89
f95toli said:
We need to be a bit careful about when we talk about "decoherence theory". It is important to understand that this is NOT an interpretation (although elements of it can of course be used to formulate interpretations if you are interested, which I am not)

Well said, it is not an interpretation, it is an experimental fact of nature. The old quantum idea before decoherence was that it took a human being to collapse the wave function. The Born rule is just that: the statistical probability of such a state collapsing to give a real result, the stuff we call observables, represented by Hermitian matrices. But since man has not been around that long, it stands to reason that things have to collapse in natural ways, and decoherence, first proven by Alain Aspect, is the bridge across this dichotomy.
 
  • #90
Jazzdude said:
It's very unclear why any of the assumptions of Gleason's theorem are mandatory in quantum theory, even why we should assign probabilities to subspaces at all. Your argument is therefore based on assumptions that I don't share. But even if I did, the lack of a mechanism that actually introduces the randomness spoils the result. Deterministic mechanisms don't just turn random because we introduce an ad-hoc probability measure.

I fail to see your point. There are two types of models - stochastic (ie fundamentally random) and deterministic. If it were deterministic then you would be able to define a probability measure taking only the values 0 and 1 - which you can't do if Gleason's theorem holds. Do you really believe in contextuality and theories that have it, like BM? But yes, that is an assumption I make and adhere to.

Jazzdude said:
That's precisely where your argument goes wrong. A subsystem description of a single state has the same mathematical form as a mixed (or ensemble) state, but it is still a single state of a single system. Interpreting it as an ensemble of whatever constituents with classical probabilities is just wrong. Now some argue that it is not an ensemble, but that it is indistinguishable from an ensemble. And this argument only holds if you already assume the full measurement postulate. So it does not contribute anything at all to solving the measurement problem.

Since I accept the measurement postulate as a given, that is not an issue. The advantage of the fact that it is now a mixed state is that the interpretation is different. A post I saw about it on this forum expressed it pretty well:
https://www.physicsforums.com/showthread.php?t=260622
'Seriously, a mixed state is an ensemble description. In fact, one of the peculiar things about the interplay between mixed state statistics and quantum statistics is that considering particles in a "mixed state" is indistinguishable from considering them in a randomly drawn pure state if that random drawing gives a statistically equivalent description as the mixed state. Worse, there are *different* ensembles of *different* pure states which are all observationally indistinguishable from the "mixed state". What describes a mixed state, or all of these ensembles, is the density matrix rho.'

However in decoherence, as I mentioned, since the mixed state arises from entanglement with the measurement apparatus, whose possible states are definite, one particular ensemble is singled out.

Jazzdude said:
And just for the record, envariance and the similar decision theory based arguments do not work either. They contain assumptions that are practically equivalent to stating the Born rule.

Totally agree.

Thanks
Bill
 
  • #91
Meselwulf said:
The old quantum idea before decoherence was that it took a human being to collapse the wave function.

Actually very few believed that - only Wigner, Von Neumann and their cohort. Wigner later abandoned it however.

Thanks
Bill
 
  • #92
bhobba said:
I fail to see your point. There are two types of models - stochastic (ie fundamentally random) and deterministic. If it were deterministic then you would be able to define a probability measure taking only the values 0 and 1 - which you can't do if Gleason's theorem holds. Do you really believe in contextuality and theories that have it, like BM? But yes, that is an assumption I make and adhere to.

Your determinism argument only makes sense if you want to assign probabilities to subspaces at all. Why should we? We know it makes sense because we observe it, but that's not a good reason for assuming it. Doing so introduces exactly what we really want to understand.


Since I accept the measurement postulate as a given, that is not an issue. The advantage of the fact that it is now a mixed state is that the interpretation is different.

If you accept the measurement postulate then you cannot derive anything relevant to solving the measurement problem from using it. Because solving the measurement problem (even partly) means to explain the origin of the measurement postulate.

'Seriously, a mixed state is an ensemble description. In fact, one of the peculiar things about the interplay between mixed state statistics and quantum statistics is that considering particles in a "mixed state" is indistinguishable from considering them in a randomly drawn pure state if that random drawing gives a statistically equivalent description as the mixed state. Worse, there are *different* ensembles of *different* pure states which are all observationally indistinguishable from the "mixed state". What describes a mixed state, or all of these ensembles, is the density matrix rho.'

Like I said, stating that a single reduced state described by a density operator is indistinguishable from an actual ensemble (no matter which realization) requires using the measurement postulate. So it does not help at all for saying anything about how measurement works. Decoherence does not solve the measurement problem, not even remotely, not with the Gleason theorem, not with MWI, just not at all.
 
  • #93
bhobba said:
Actually very few believed that - only Wigner, Von Neumann and their cohort. Wigner later abandoned it however.

Thanks
Bill

I always was of the opinion that to collapse the wave function it was essential to be wearing glasses with heavy frames, a skinny black tie, and a white lab coat.
 
  • #94
f95toli said:
There are LOTS of us out there working to solve these problems, and the vast majority of us have very little interest in the "philosophy of QM": we just want our devices to work better and "decoherence theory" gives us a route for improvement.
Good post! However, I never quite got what people mean when they talk about "decoherence theory". I know the theory of open quantum systems and how decoherence arises there. Is this equivalent to "decoherence theory" or is there more to it? If yes, what are the axioms of "decoherence theory"?
 
  • #95
Jazzdude said:
Your determinism argument only makes sense if you want to assign probabilities to subspaces at all. Why should we? We know it makes sense because we observe it, but that's not a good reason for assuming it. Doing so introduces exactly what we really want to understand.

It's from the postulate that observables are Hermitian operators whose eigenvalues are the possible outcomes. The spectral theorem then implies, since the actual eigenvalues are obviously unimportant, that the projection operators of the decomposition give the probability of getting each outcome. That is easy to see if you consider a function of the observable that gives its expectation. Although it is a stronger assumption than the one made by Gleason's theorem, you can in fact derive the standard trace formula from the simple assumption that expectations are additive, as Von Neumann did in his proof against hidden variable theories. In fact that's the precise assumption Bell homed in on in his refutation - it's not necessarily true of hidden variable theories.
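Spelled out, the chain of relations being referred to is (standard notation, assumed here rather than quoted from any of the references):

$$R = \sum_i r_i P_i \;\;(\text{spectral theorem}), \qquad \Pr(r_i) = \operatorname{Tr}(\rho P_i), \qquad \langle R \rangle = \sum_i r_i \operatorname{Tr}(\rho P_i) = \operatorname{Tr}(\rho R).$$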

Jazzdude said:
If you accept the measurement postulate then you cannot derive anything relevant to solving the measurement problem from using it. Because solving the measurement problem (even partly) means to explain the origin of the measurement postulate.

I don't get it - I really don't. The problem with observing a pure state is that it discontinuously changes to an unpredictable state, and the system cannot be assumed to have been in that state prior to observation. But, like the link I gave on mixed states said: 'a "mixed state" is indistinguishable from considering them in a randomly drawn pure state if that random drawing gives a statistically equivalent description as the mixed state.' Both are part of the measurement postulate, but the second situation does not have the problems of the first, such as in Schrodinger's cat where the cat can be alive and dead at the same time prior to observation. Being in a mixed state it is either alive or dead. It does not solve all the problems - but it does solve some of them.

Thanks
Bill
 
  • #96
Jazzdude, do you think the dBB interpretation solves the measurement problem?

For me, the measurement problem is mainly to explain collapse / the appearance of collapse and not necessarily to explain the Born rule. If we require an explanation for every probabilistic element of QM, we are implicitly assuming that the theory is deterministic.
 
  • #97
bhobba said:
I don't get it - I really don't.
I think this is a semantic issue. I use the following definitions:

measurement problem: explain collapse
measurement postulate: collapse + Born rule

If I get him right, Jazzdude wants to explain the measurement postulate while you want to solve the measurement problem.

/edit: I forgot the "observables are self-adjoint operators and outcomes are eigenvalues" part in the measurement postulate. This is probably not under doubt by Jazzdude.
 
Last edited:
  • #98
kith said:
Good post! However, I never quite got what people mean when they talk about "decoherence theory". I know the theory of open quantum systems and how decoherence arises there. Is this equivalent to "decoherence theory" or is there more to it? If yes, what are the axioms of "decoherence theory"?

It makes use of the standard postulates of QM - nothing new is required.

Thanks
Bill
 
  • #99
bhobba said:
It makes use of the standard postulates of QM - nothing new is required.
I think so, too. The question is why people talk about decoherence theory in the first place, and what it includes.
 
  • #100
kith said:
I think this is a semantic issue. I use the following definitions:

measurement problem: explain collapse
measurement postulate: collapse + Born rule

If I get him right, Jazzdude wants to explain the measurement postulate while you want to solve the measurement problem.

Maybe.

To me the measurement postulate is E(R) = Tr(pR) where p is the state. I assume it's true. The measurement problem for a pure state follows from the postulate in that it's easy to see that if p is a pure state it will in general discontinuously change to another pure state. However if p is a mixed state of the outcomes of an observation then the interpretation of the postulate is different - it can be assumed to be in one of those states prior to observation with a certain probability. Because decoherence converts a pure state to a mixed state there is no discontinuous change of the state - it can be assumed to be in that state prior to observation. Because of that, as the link I gave said, 'taking a partial trace amounts to the statistical version of the projection postulate.'
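In symbols, assuming orthonormal apparatus (pointer) states |a_i> correlated with system states |s_i> (a sketch, not a quote from any of the references):

$$|\psi\rangle = \sum_i c_i\,|s_i\rangle|a_i\rangle \;\Rightarrow\; \rho_S = \operatorname{Tr}_A\,|\psi\rangle\langle\psi| = \sum_i |c_i|^2\,|s_i\rangle\langle s_i|,$$

which has exactly the form of a classical mixture over the pointer states with probabilities |c_i|^2.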

If that doesn't do it I am afraid I will leave it to someone else - I am sort of pooped.

Thanks
Bill
 
  • #101
I think there's some fundamental confusion about what exactly solving the measurement problem means. It means that you have to answer the questions of what a measurement is, why possible measurement results are given by the spectra of hermitian operators, where the indeterminism comes from, why we observe a collapsed state, and why we observe the statistics as given by the Born rule.

In other words, you have to give reasons for all the measurement related statements in the postulates of canonical quantum theory. This comes down to deriving the measurement postulate and all associated structure from something simpler, ideally even from nothing but unitary quantum theory.

Specifically, I am not allowed to assume that observables are given by hermitian operators whose spectrum defines the possible outcomes, I'm not allowed to assume that density operators describe ensembles, etc.

Kith, I don't think that dBB solves the measurement problem, I don't think that any established theory does.
 
  • #102
Jazzdude said:
I think there's some fundamental confusion about what exactly solving the measurement problem means. It means that you have to answer the questions of what a measurement is, why possible measurement results are given by the spectra of hermitian operators, where the indeterminism comes from, why we observe a collapsed state, and why we observe the statistics as given by the Born rule.

Ahhhh. Yes - most definitely. With that view I agree with what you write. I have my own answers to such questions, and decoherence is just one small part of them. Indeed such a combination is known as an interpretation - my interpretation is the ensemble interpretation combined with decoherence.

Jazzdude said:
Kith, I don't think that dBB solves the measurement problem, I don't think that any established theory does.

Nor do I - my view doesn't solve all the issues - I simply like it because the problems it doesn't solve I find acceptable. As I often say all current interpretations suck - you simply choose the one that sucks the least to you.

Thanks
Bill
 
Last edited:
  • #103
bhobba said:
Nor do I - my view doesn't solve all the issues - I simply like it because the problems it doesn't solve I find acceptable. As I often say all current interpretations suck - you simply choose the one that sucks the least to you.



Problem is: decoherence + ensemble interpretation doesn't solve a single thing.
You got all the usual paradoxes and unanswered questions...
 
  • #104
Quantumental said:
Problem is: decoherence + ensemble interpretation doesn't solve a single thing. You got all the usual paradoxes and unanswered questions...

Obviously since I hold to it I don't agree. But you are not the only one to hold that view - indeed there are those who believe that the ensemble interpretation (with or without decoherence) is simply a restating of the math and should not even be given the title of an actual interpretation.

Thanks
Bill
 
  • #105
Jazzdude said:
I think there's some fundamental confusion about what exactly solving the measurement problem means.
The term is not as clearly defined as you suggest and I don't think you are representing the mainstream view. Schlosshauer for example defines the measurement problem as the combination of "the problem of definite outcomes" and "the problem of the preferred basis" which is only a small part of your definition.

Jazzdude said:
Specifically, I am not allowed to assume that observables are given by hermitian operators whose spectrum defines the possible outcomes
Why not? What would be an "allowed" assumption for the observables? Why are functions on the phase space "allowed" and self-adjoint operators on the Hilbert space are not?

Jazzdude said:
In other words, you have to give reasons for all the measurement related statements in the postulates of canonical quantum theory.
We cannot give reasons for all measurement related statements in any scientific theory, because the theory has to say how the mathematical objects relate to experimentally observable quantities. I can only think of two reasons to question the validity of the postulates of a theory:
(1) the theory is not consistent
(2) there exists a simpler theory which makes the same predictions

(1) is arguably true for the postulates of orthodox QM, but the only contradiction is between unitary evolution and collapse. So if we are able to explain collapse (and many interpretations accomplish this), the inconsistencies go away. (2) may be true, but as long as we haven't found this simpler theory, we cannot claim that the current theory needs an explanation.
 
