Why is the pilot-wave theory controversial? Is it?

  • Thread starter: nonequilibrium
  • Tags: Theory
Summary
The pilot-wave theory, also known as the de Broglie-Bohm interpretation of quantum mechanics, is considered controversial due to its deterministic nature, which challenges the widely accepted Copenhagen interpretation. While some physicists view it as a valid alternative, others question its scientific legitimacy, particularly regarding its implications for observability and the nature of measurements. The theory posits that particles have definite positions and trajectories, which contrasts with the probabilistic outcomes of standard quantum mechanics. Critics argue that the theory's reliance on hidden variables complicates its acceptance within the scientific community. Overall, the debate centers on the validity and implications of different interpretations of quantum mechanics.
  • #61


bhobba said:
By definition a mixed state is an ensemble of states.
No, that's not the definition of a mixed state, as explained even in the Schlosshauer book.

bhobba said:
A mixed state is exactly the same as randomly selecting one of the pure states from this ensemble of states.
No it isn't.

bhobba said:
Every single book I have ever read on QM, and believe me I have read a few, has defined mixed states that way.
You said you are reading the Schlosshauer book. Then please read Sec. 2.4.4 of it. Here is a quote from that section (bolding is mine):
"In Sect. 2.4.2 above we discussed how the notion of a mixed state is based on
a classical probability concept. Accordingly, one also says that a mixed-state
density matrix (2.20) represents an ignorance-interpretable (proper) mixture
of pure states [47–49], in order to express the fact that a mixed-state density
matrix of the form (2.20) can, to some extent, be interpreted as a classical
probability distribution of pure quantum states. However, this is only
true if we actually know that the system has indeed been prepared in one
of the states, but we simply do not possess more specific information
about which of these states has been prepared. On the other hand, if we are
simply confronted with the density matrix (2.20) but are given no further
information (e.g., about the preparation procedure), we cannot infer that the
system actually is in one of the states.
This is so because any nonpure
density matrix can be written in many different ways, which shows that
any partition into a particular ensemble of quantum states is arbitrary. In
other words, the mixed-state density matrix alone does not suffice to uniquely
reconstruct a classical probability distribution of pure states.
"

You also said that your research area is quantum information. In that case I would recommend that you read the textbook
B. Schumacher, M. Westmoreland: Quantum Processes, Systems and Information,
which is an exceptionally good general introduction to QM (including the meaning of mixed states) with emphasis on applications to quantum information.

Other highly recommended books on QM, with a CORRECT explanation of mixed states, are:
- L. Ballentine: Quantum Mechanics - A Modern Development
- B. d'Espagnat: Conceptual Foundations of Quantum Mechanics

bhobba said:
What exactly with the above do you not agree with?
I think it's clear now.
 
  • #62
mr. vodka said:
One minor one: you seem to imply Valentini favours interpreting the quantum potential as a causal agent (instead of just a handy mathematical similarity with the Hamilton-Jacobi formalism of classical mechanics). However, I remember reading some passages of his work where he definitely implies the reverse, more in line with what you seem to call the minimalist Bohmian interpretation.
As I understand him, Valentini tries to tread a path somewhere between the minimalist (DGZ) approach and the Bohm/Hiley one. He is critical of both views. For instance, he quotes his own paper where he criticizes the DGZ approach of treating ψ as nomological:
Valentini considered the possibility that ψ might merely provide a convenient mathematical summary of the motion q(t); to this end, he drew an analogy between ψ and physical laws such as Maxwell's equations, which also provide a convenient mathematical summary of the behaviour of physical systems. On this view, `the world consists purely of the evolving variables X(t), whose time evolution may be summarised mathematically by ψ' (ibid., p. 13). But Valentini argued further that such a view did not do justice to the physical information stored in ψ, and he concluded instead that ψ was a new kind of causal agent acting in configuration space (a view that the author still takes today). The former view, that ψ is law-like, was adopted by Durr et al. (1997).

De Broglie-Bohm Pilot-Wave Theory: Many Worlds in Denial?
http://www.tcm.phy.cam.ac.uk/~mdt26/local_papers/valentini_2008_denial.pdf

He repeats this theme in this video, where he suggests that configuration space is "real" (like Albert, it seems) and argues that the quantum wave is a new type of "causal" agent that may take some time for us to understand, in the same way scientists had difficulties accepting the concept of "fields" when it was first introduced. So he sees an evolution (see slides) from forces to fields to this non-local quantum wave (which does not vary with distance and appears to be completely unaffected by matter in between). So in his scheme, the configuration space is always there, and in it the pilot wave (a radically new kind of causal agent, more abstract than conventional forces or fields in 3-D space) propagates.

The nature of the wave function in de Broglie's pilot-wave theory
http://streamer.perimeterinstitute.ca/Flash/3f521d41-f0a9-4e47-a8c7-e1fd3a4c63c8/viewer.html

At the same time Valentini argues against the quantum potential:
...Bohm's systematic development of the pilot-wave theory in 1952 was presented in the unfortunate guise of a quasimechanical theory with a 'quantum potential'. We propose an abandonment of all such mechanical ideas, and suggest instead that the notion of guiding field be taken as fundamental and irreducible: The rate of change of all variables is given by the gradient or functional derivative of S, with no need for further explanation...[We] suggest that attempts at an explanation in terms of mechanical concepts are more naturally seen as entirely derivative, arising phenomenologically from statistical equilibrium and in particular from the classical limit of equilibrium.
I'm not sure that Bohm's quantum potential is "mechanical", especially when you read his and Hiley's papers. Belousek's paper also questions Valentini's criticism of the quantum potential, but for slightly different reasons (see p. 144). For an interesting read on this Bohmian debate from the Bohm/Hiley perspective:

From the Heisenberg Picture to Bohm: a New Perspective on Active Information and its relation to Shannon Information.
http://www.bbk.ac.uk/tpru/BasilHiley/Vexjo2001W.pdf
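For readers following this exchange, the equations at issue are standard and interpretation-neutral (a summary, not specific to Valentini's or Bohm's reading of them): writing the wave function in polar form $\psi = R\,e^{iS/\hbar}$, the Schrödinger equation splits into a continuity equation for $R^2$ and a modified Hamilton-Jacobi equation,

$$\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0, \qquad Q = -\frac{\hbar^2}{2m}\frac{\nabla^2 R}{R},$$

together with the guidance equation $\frac{d\mathbf{q}}{dt} = \frac{\nabla S}{m}$. The "quantum potential" reading treats $Q$ as a causal term in a quasi-Newtonian law; the minimalist reading takes the guidance equation alone as fundamental and regards $Q$ as derivative.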
 
  • #63


Demystifier said:
You also said that your research area is quantum information. In that case I would recommend that you read the textbook
B. Schumacher, M. Westmoreland: Quantum Processes, Systems and Information,
which is an exceptionally good general introduction to QM (including the meaning of mixed states) with emphasis on applications to quantum information.

No I didn't - I think you have me confused with someone else - probably DrewD. In fact, if I were to list my interests, quantum information theory would not even rate a mention.

Demystifier said:
Other highly recommended books on QM, with a CORRECT explanation of mixed states, are:
- L. Ballentine: Quantum Mechanics - A Modern Development
- B. d'Espagnat: Conceptual Foundations of Quantum Mechanics

I have Ballentine's book - in fact it's my favorite and go-to book. Mixed states are defined on page 54 but discussed mathematically for a few pages before that. It is true that mixed states do not uniquely determine the pure states they can be decomposed into, but my understanding is that the decomposition is determined by the pointer basis used, which, in the case of the measurement problem, is uniquely determined by the possible outcomes of the measurement. The examples given a bit later in section 2.4.4 to illustrate the point are mutually exclusive states, in that an observational apparatus cannot be designed to register both at the same time.

But maybe I am misinterpreting what you are trying to say - have I missed something?

I am about to head off to bed now, but before doing that I want to mention that Schlosshauer states, correctly, that decoherence does not solve the measurement problem. All it does is give the appearance of doing so - as I like to say, it solves it for all practical purposes - but in reality it doesn't. Still, I think the main issues are resolved.

Thanks
Bill
 
  • #64


Demystifier said:
No, that's not the definition of a mixed state, as explained even in the Schlosshauer book. [...] In other words, the mixed-state density matrix alone does not suffice to uniquely reconstruct a classical probability distribution of pure states. [...] I think it's clear now.

very precise mr. demystifier.
 
  • #65


audioloop said:
very precise mr. demystifier.

Yea - but non-uniqueness of a mixed state is not an issue because it is uniquely decomposable into the pointer basis defined by the observational device. As Schlosshauer states on pages 112 and 113, the measurement problem has 3 parts:

1. The preferred basis problem (what determines the preferred physical quantities of our experience)
2. The problem of nonobservability of interference (why is it so hard to observe interference effects)
3. The problem of outcomes (why do measurements seem to have outcomes at all and what selects a particular observed outcome)

As we have indicated in this chapter and will discuss further in other places in this book it is reasonable to conclude decoherence is capable of solving the first two problems, whereas the third problem is intrinsically linked to matters of interpretation.

Demystifier seems to doubt point 1 - but I do not agree, nor it seems does Schlosshauer. The issue is point 3 and I agree entirely - it does not explain why a particular outcome is observed. Being decohered into a mixed state, you cannot predict which pure state will be selected, but you know one will be; you can assume it was part of an ensemble, and in that state prior to observation.

Added later:

I had in the back of my mind that Schlosshauer had written explicitly that decoherence implies the system can be viewed as being in the observed state, and I managed to dig up the paper:
http://arxiv.org/pdf/quant-ph/0312059v4.pdf
'The reduced density matrix looks like a mixed-state density matrix because, if one actually measured an observable of the system, one would expect to get a definite outcome with a certain probability; in terms of measurement statistics, this is equivalent to the situation in which the system is in one of the states from the set of possible outcomes from the beginning, that is, before the measurement. As Pessoa (1998, p. 432) puts it, “taking a partial trace amounts to the statistical version of the projection postulate.”'
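As a toy illustration of why the reduced density matrix "looks like" a mixed-state density matrix (my own sketch, not from the paper): couple a qubit in a superposition to an environment whose two record states |E0>, |E1> have a tunable overlap, then trace the environment out. The interference terms are multiplied by that overlap, so once the environment holds orthogonal records the reduced matrix is exactly diagonal - statistically indistinguishable from a proper mixture.

```python
import numpy as np

a, b = 1/np.sqrt(2), 1/np.sqrt(2)   # system starts in a|0> + b|1>

def reduced_density_matrix(overlap):
    """System state after the coupling a|0>|E0> + b|1>|E1>, where the
    environment (modelled as a qubit) has <E0|E1> = overlap."""
    E0 = np.array([1, 0], dtype=complex)
    E1 = np.array([overlap, np.sqrt(1 - abs(overlap)**2)], dtype=complex)
    psi = a * np.kron([1, 0], E0) + b * np.kron([0, 1], E1)
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)  # indices [s, e, s', e']
    return np.trace(rho, axis1=1, axis2=3)               # partial trace over e

print(np.round(reduced_density_matrix(1.0), 3))  # no record: off-diagonals survive
print(np.round(reduced_density_matrix(0.0), 3))  # orthogonal records: diagonal
```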

Thanks
Bill
 
  • #66


bhobba said:
Yea - but non-uniqueness of a mixed state is not an issue because it is uniquely decomposable into the pointer basis defined by the observational device. As Schlosshauer states on pages 112 and 113, the measurement problem has 3 parts:

1. The preferred basis problem (what determines the preferred physical quantities of our experience)
2. The problem of nonobservability of interference (why is it so hard to observe interference effects)
3. The problem of outcomes (why do measurements seem to have outcomes at all and what selects a particular observed outcome)

As we have indicated in this chapter and will discuss further in other places in this book it is reasonable to conclude decoherence is capable of solving the first two problems, whereas the third problem is intrinsically linked to matters of interpretation.

Demystifier seems to doubt point 1 - but I do not agree, nor it seems does Schlosshauer. The issue is point 3 and I agree entirely - it does not explain why a particular outcome is observed. Being decohered into a mixed state, you cannot predict which pure state will be selected, but you know one will be; you can assume it was part of an ensemble, and in that state prior to observation.

Of course I could have missed something or made a mistake, and am only too happy for others' comments.
I agree that decoherence solves problems 1 and 2.

As you said the real issue is problem 3. But the problem is not only that we don't know which pure state will be selected. The problem is that we don't even know why some pure state will be selected at all.

In my opinion, problem 3 can be solved only by extending minimal QM with some non-minimal interpretation. For that purpose, the many-worlds interpretation and the Bohmian interpretation seem most appropriate. Of course, as you said, these interpretations may "suck" for other reasons, but for solving problem 3 they are excellent.
 
  • #67


DrewD said:
You are right, I will not start using Bohmian mechanics, because my research is in quantum information and nobody uses this interpretation since it provides nothing new (that's actually not true, but there has been no experimental confirmation of the one new prediction that I know of). It might be a nice way to think about things, and I often do imagine quantum particles behaving somewhat as in the Bohm interpretation, but until there is a reason to follow a less common interpretation that would make reading QM papers more difficult, you are correct.
bhobba said:
No I didn't - I think you have me confused with someone else - probably DrewD. In fact, if I were to list my interests, quantum information theory would not even rate a mention.
Bhobba you are right, sorry! Anyway, I would recommend the mentioned book even to those who are not particularly interested in applications to quantum information.
 
Last edited:
  • #68


Demystifier said:
I agree that decoherence solves problems 1 and 2. As you said, the real issue is problem 3. But the problem is not only that we don't know which pure state will be selected. The problem is that we don't even know why some pure state will be selected at all. In my opinion, problem 3 can be solved only by extending minimal QM with some non-minimal interpretation. For that purpose, the many-worlds interpretation and the Bohmian interpretation seem most appropriate. Of course, as you said, these interpretations may "suck" for other reasons, but for solving problem 3 they are excellent.

I fully concur. I personally don't think it's an issue to worry about, but if you do, then the interpretations you suggest would indeed seem the most appropriate.

Thanks
Bill
 
  • #69


Demystifier said:
Bhobba you are right, sorry! Anyway, I would recommend the mentioned book even to those who are not particularly interested in applications to quantum information.

No problem. And thanks for the reference - but I have so much reading to do it's embarrassing - I have been meaning to go a bit deeper into String Theory for a while now.

Thanks
Bill
 
  • #70


bhobba said:
No problem. And thanks for the reference - but I have so much reading to do it's embarrassing - I have been meaning to go a bit deeper into String Theory for a while now.
We have quite similar interests. My favored books on string theory are (in the order suitable for pedagogic reading):
- B. Zwiebach, A First Course in String Theory
- R.J. Szabo, http://xxx.lanl.gov/abs/hep-th/0207142
- M. Kaku, Introduction to Superstrings and M-theory
 
  • #71


Demystifier said:
We have quite similar interests. My favored books on string theory are (in the order suitable for pedagogic reading):
- B. Zwiebach, A First Course in String Theory
- R.J. Szabo, http://xxx.lanl.gov/abs/hep-th/0207142
- M. Kaku, Introduction to Superstrings and M-theory

Got those and a few others as well. Went through Zwiebach a while ago, but that was only a first reading - really need to do a much deeper perusal.

Thanks
Bill
 
  • #72


bhobba said:
Being decohered into a mixed state, you cannot predict which pure state will be selected, but you know one will be; you can assume it was part of an ensemble, and in that state prior to observation.

This is the central point of your argument and it contains a misconception. There are two distinct uses for the density operator:

First, it is used to capture all information available in a tensor factor space about a quantum state. The result is a description of a single quantum state with a certain information constraint in place. If you assume the measurement postulate for quantum states you can derive a version for density operators from it and it turns out to work fine despite the removed information.

The second use is for describing ensembles of quantum states. However, this is much less obvious than usually suggested. The naïve way to describe an ensemble of states would be a list of states with their associated probabilities. Reducing this to the convex sum of state projectors throws away a lot of information about the actual ensemble and requires a very good reason to do so. And this reason is that applying the measurement law constructed for the tensor-factor-space density operator from above to the density operator built from an ensemble gives the right answer if you consider classical probabilities for the states in the ensemble. This means that the use of a density operator to describe classical ensembles already assumes the measurement postulate and the Born rule.
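To see the two constructions side by side, here is a minimal numpy sketch (the standard Bell-pair example, my own illustration): the "improper" mixture obtained by tracing out one half of an entangled pair and the "proper" mixture built from a classical 50/50 ensemble come out as the same operator - which is exactly why identifying them is tempting, and exactly where the Born rule sneaks in.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

def proj(psi):
    """Projector |psi><psi|."""
    return np.outer(psi, psi.conj())

# Use 1: improper mixture - reduced state of one half of a Bell pair
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho4 = proj(bell).reshape(2, 2, 2, 2)            # indices [s, e, s', e']
rho_improper = np.trace(rho4, axis1=1, axis2=3)  # trace out the second qubit

# Use 2: proper mixture - classical 50/50 ensemble of |0> and |1>
rho_proper = 0.5 * proj(ket0) + 0.5 * proj(ket1)

print(np.allclose(rho_improper, rho_proper))  # True: same operator, different origin
```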

Now if you identify these two very different objects you cannot possibly derive any properties of quantum measurement, because the Born rule is implicitly used. That leaves decoherence as a theory that explains the lack of macroscopic interference, but does not solve the measurement problem.

In my opinion the fundamental idea of decoherence is still valid though. If we want to solve the measurement problem we should look at how the environment interacts with a quantum system and how our subjective information changes with that interaction. For an approach that goes beyond decoherence please see http://arxiv.org/abs/1205.0293
 
  • #73


It's time to get back to the subject: why is pilot-wave theory considered controversial?

In my experience, the most frequent reason is that people who criticize pilot-wave theory do not understand interpretation-independent aspects of quantum measurements, such as the role of decoherence. For that purpose, I would suggest reading
Sec. 2: "Essential and inessential aspects of Bohmian interpretation"
of my paper http://xxx.lanl.gov/abs/1112.2034
 
  • #74


Jazzdude said:
This means that the use of a density operator to describe classical ensembles already assumes the measurement postulate and the Born rule.

Can't say I understand your entire post, but I do agree with the above and it is a problem. But I believe Gleason's theorem has something to say about it.

This is getting way off topic so I don't want to really go more into it in this thread. But if you want to start a new thread I will be happy to contribute to it.

Thanks
Bill
 
  • #75


bhobba said:
I fully concur. I personally don't think its an issue to worry about but if you do then the interpretations you suggest would indeed seem the most appropriate.

Thanks
Bill

But how can you not worry about it? As I see it, that's all the measurement problem is, that's the core (sure, basis selection is also important, but that's A) solved by decoherence and B) frankly IMO less fundamental than the remaining problem).

Let me put it this way: I suppose you can acknowledge that there is a jump from a superposition to a specific eigenstate, and that decoherence cannot explain actually getting only one eigenstate, so how can you not find this an important issue in a fundamental theory? You can't just acknowledge a problem and deny it needs a solution.

Indeed, the debate went a bit off-topic I see, and my question is perhaps still off-topic, but it's also on-topic in the following sense: I regard any interpretation that does not solve this problem as deeply controversial.
 
  • #76


I'm not sure if anyone has had a read through Chris Fuchs's interview, but as this is related to the topic of the thread: he writes pretty clearly why he dismisses any model that is non-local (i.e., why pilot-wave theory is considered controversial):
But if there is indeed a choice, why does QBism hold so desperately to locality while eschewing the idea of predetermined measurement values? The biggest reward of course is that it gives the option to explore “it’s a wonderful life,” but one can give more strictly academic arguments. Einstein, for one, did it very well:
Then he goes on to quote one of Einstein's famous quotes on this topic:
Further, it appears to be essential for this arrangement of the things introduced in physics that, at a specific time, these things claim an existence independent of one another, insofar as these things “lie in different parts of space.” Without such an assumption of the mutually independent existence (the “being-thus”) of spatially distant things, an assumption which originates in everyday thought, physical thought in the sense familiar to us would not be possible. Nor does one see how physical laws could be formulated and tested without such a clean separation...The complete suspension of this basic principle would make impossible the idea of (quasi-)closed systems and, thereby, the establishment of empirically testable laws in the sense familiar to us.
Interview with a Quantum Bayesian
http://lanl.arxiv.org/pdf/1207.2141.pdf

Also, it seems his article "QT needs no interpretation" was titled provocatively to get attention; his views on the topic seem to be quite the opposite (see p. 5).
 
  • #77


Can't the same objection be made about determinism?
 
  • #78


I would have guessed that non-locality is the biggest objection, but maybe I'm mistaken? I'm not sure, though, because I don't buy Fuchs's argument that the issue of determinism/indeterminism in physics has any bearing on the issue of free will. Here is an interesting paper by Gisin that seems to take your position, I think:
What is surprising is that so many good physicists interpret the violation of Bell’s inequality as an argument against realism. Apparently their hope is to thus save locality, though I have no idea what locality of a non-real world could mean? It might be interesting to remember that no physicist before the advent of relativity interpreted the instantaneous action at a distance of Newton’s gravity as a sign of non-realism (although Newton’s nonlocality is even more radical than quantum nonlocality, as it allowed instantaneous signaling). Hence, it seems that the issue is not nonlocality, but non-determinism. In this note I argued that non-determinism and true randomness are not in contradiction with realism: propensities reflect pre-existing properties, but the reflection is not deterministic. There is thus no conflict between realism and an open future: the future might not (yet) be real, but the process by which the future becomes actual is undoubtedly real.
Is realism compatible with true randomness?
http://lanl.arxiv.org/pdf/1012.2536.pdf
 
  • #79


I agree with what Gisin says up until the bold text; from then on I'm not sure what he's talking about. (You sure do whip out those articles quickly - I wonder how you do that.)

Maybe I should clarify my previous post: the argument you quoted stated that non-locality makes the principle of science, experimentation, impossible. But determinism does this too: the idea of an experiment relies on the principle of choice for the experimenter, that he can choose what he will measure, which is in contradiction with determinism. (This is actually the problem often stated in relation to superdeterminism, but personally I don't understand the difference between superdeterminism and determinism.)
 
  • #80


mr. vodka said:
But how can you not worry about it? As I see it, that's all the measurement problem is, that's the core (sure, basis selection is also important, but that's A) solved by decoherence and B) frankly IMO less fundamental than the remaining problem).

Let me put it this way: I suppose you can acknowledge that there is a jump from a superposition to a specific eigenstate, and that decoherence cannot explain actually getting only one eigenstate, so how can you not find this an important issue in a fundamental theory? You can't just acknowledge a problem and deny it needs a solution.

Indeed, the debate went a bit off-topic I see, and my question is perhaps still off-topic, but it's also on-topic in the following sense: I regard any interpretation that does not solve this problem as deeply controversial.

Mate, it's simply how nature works. Your problem is you want it to conform to how you want it to work - sorry, science is not like that. You can worry about issues like that if you like - but the issue lies within you, not nature. The same with me - the issues I worry about, which are along the lines of invariance and symmetry, lie within me - nature doesn't care. This leads me to reject interpretations like BM that imply a preferred frame and so break symmetry - that's one way BM sucks to me and why I think it's a crock. All interpretations suck - it's part of your choice of interpretation whether it answers the questions you worry about. I personally have zero problem with a scheme that explains collapse but can only predict probabilities of what it collapses into. I don't care if nature is fundamentally probabilistic - it doesn't worry me in the least.

Demystifier understands this and correctly points out why he likes the interpretations he does, but also recognizes it may not gel with others' view of the situation.

Thanks
Bill
 
  • #81


No bhobba, you misunderstand me. I'm simply pointing out that there seems to be a gap in the way you see things: one moment you have a superposition, the next moment a collapsed state. I'm not talking about understanding this intuitively or something; I'm just saying: something in your theory must account for this. Copenhagen, I suppose, accounts for this by taking collapse to be a separate fundamental axiom, i.e. the wavefunction does not always obey the Schrödinger equation; sometimes it obeys the collapse law. Is that the view you're taking? For now, I haven't seen you take a stance on the matter, not even "taking collapse as a fundamental separate act". That's all I'm talking about.
 
  • #82


mr. vodka said:
No bhobba, you misunderstand me. I'm simply pointing out that there seems to be a gap in the way you see things: one moment you have a superposition, the next moment a collapsed state. I'm not talking about understanding this intuitively or something; I'm just saying: something in your theory must account for this. Copenhagen, I suppose, accounts for this by taking collapse to be a separate fundamental axiom, i.e. the wavefunction does not always obey the Schrödinger equation; sometimes it obeys the collapse law. Is that the view you're taking? For now, I haven't seen you take a stance on the matter, not even "taking collapse as a fundamental separate act". That's all I'm talking about.

Let's be clear - with decoherence you do not one moment have a superposition and the next another state - it decoheres quickly, but not instantaneously. There are issues with it, but that is not one of them.

Theories do not have to account for issues like the wavefunction collapse - they can simply accept it as an axiom. If you choose to worry about it the issue lies within you - not the theory.

The view I have is that Gleason's theorem puts severe limits on the type of theories you can have with the superposition principle and non-contextuality. It spells out that determinism is not compatible with it and all you can predict is probabilities, and those probabilities use the standard density matrix formalism. It does not explain how observations change states - merely that all you can predict is probabilities. Using this as a start you can derive decoherence, which explains collapse IMHO. You don't think it explains it because it doesn't tell you why a particular outcome is selected - simply the probability. But Gleason's theorem says you can't do that - it is probabilistic at its core - the only out is non-contextuality, which I find very hard to swallow - just like you seem to find not being able to predict which state it collapses into hard to swallow.

Thanks
Bill
 
  • #83


You seem to be still misunderstanding me on multiple levels, I'm not sure why since I feel I've literally stated my stance on the matter, but here's one more go:

Theories do not have to account for issues like the wavefunction collapse - they can simply accept it as an axiom. If you choose to worry about it the issue lies within you - not the theory.
Accepting it as an axiom is accounting for it. It wasn't clear to me that that was your solution. That's why I asked in my previous post "Copenhagen [...] collapse to be a separate fundamental axiom [...] Is that the view you're taking?". In my view there are still issues with that (mostly: isn't the collapse arbitrary then?), but okay, that would perhaps get off-topic, and at least your stance is clear now.

just like you seem to find not being able to predict which state it collapses into hard to swallow.
I'm not having trouble with not being able to predict which state it collapses into - as Demystifier also stated, it's more the idea that there is fundamental collapse at all, as in there being a mechanism for it. Of course, taking it as a separate axiom eliminates any need for a mechanism, although in my view it does create a problem of "when do I invoke this separate axiom?". But again, that would be off-topic I suppose.

Let's be clear - with decoherence you do not one moment have a superposition and the next another state - it decoheres quickly, but not instantaneously.

As you said yourself earlier, decoherence does not explain the collapse. It explains the basis selection and the non-interference, but not going from a superposition to one of those eigenstates. I thought this was settled after the discussion you and Demystifier had, i.e. that "problem 3" remained unsolved by decoherence.
 
  • #84


bhobba said:
This leads me to reject interpretations like BM that imply a preferred frame and so break symmetry - that's one way BM sucks to me and why I think it's a crock.
I also don't like a preferred frame, and that's why I developed a variant of BM without preferred frame:
http://xxx.lanl.gov/abs/1002.3226 [Int. J. Quantum Inf. 9 (2011) 367-377]

I would like to see your opinion on that. In particular, why does THAT interpretation "suck"?
 
  • #85


bhobba said:
But Gleason's theorem says you can't do that - it is probabilistic at its core - the only out is non-contextuality, which I find very hard to swallow
I hope you only misexpressed yourself and did not misunderstand it, because the only way out of Gleason's theorem is contextuality, not non-contextuality. Indeed, Bohmian mechanics (BM) is contextual and, in the absence of measurements, assigns probabilities that differ from |psi|^2 (except for probabilities of particle positions, which are always given by |psi|^2 in BM). And yet, when a measurement is performed, BM explains clearly (by using the theory of decoherence) how probabilities evolve into |psi|^2.
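To make "probabilities of particle positions are always given by |psi|^2" concrete, here is a toy sketch (my own, not from the cited papers; units ħ = m = 1, free Gaussian packet): sample initial positions from |ψ(x,0)|² and carry them along the guidance equation v = (ħ/m) Im(∂ₓψ/ψ); the ensemble then approximately tracks |ψ(x,t)|² at later times (equivariance).

```python
import numpy as np

hbar = m = 1.0
sig0 = 1.0  # initial packet width

def psi(x, t):
    """Freely spreading Gaussian packet (standard analytic solution)."""
    c = 1 + 1j * hbar * t / (2 * m * sig0**2)
    return (2 * np.pi * sig0**2)**-0.25 / np.sqrt(c) * np.exp(-x**2 / (4 * sig0**2 * c))

def velocity(x, t, dx=1e-5):
    """Bohmian guidance: v = (hbar/m) Im(psi'/psi), derivative by central difference."""
    dpsi = (psi(x + dx, t) - psi(x - dx, t)) / (2 * dx)
    return (hbar / m) * np.imag(dpsi / psi(x, t))

rng = np.random.default_rng(0)
x = rng.normal(0.0, sig0, size=20_000)   # initial sample from |psi(x,0)|^2
t, dt, T = 0.0, 0.01, 4.0
while t < T:
    x += velocity(x, t) * dt             # Euler step along the flow
    t += dt

# Equivariance check: trajectory spread vs. analytic width of |psi(x,T)|^2
print(x.std(), sig0 * np.sqrt(1 + (hbar * T / (2 * m * sig0**2))**2))
```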

Also, I never understood why so many people find contextuality so difficult to swallow. Contextuality merely says that properties change by the act of measurement. Since any measurement involves interaction with the measured system, I see nothing surprising in the fact that measurement changes the system. Indeed, even without adopting any particular interpretation, decoherence explains very clearly how the wave function of the system changes by measurement. Decoherence itself is a manifestation of contextuality. Even more, decoherence is the core of contextuality. To me, it makes no sense to accept decoherence without accepting contextuality.
 
  • #86


Demystifier said:
I also don't like a preferred frame, and that's why I developed a variant of BM without preferred frame:
http://xxx.lanl.gov/abs/1002.3226 [Int. J. Quantum Inf. 9 (2011) 367-377]

I would like to see your opinion on that. In particular, why THAT interpretation "sucks"?

Had a quick squiz and personally am not concerned about the FTL stuff - relativity does not really care about FTL - what it cares about is sending information FTL.

But my initial reaction is: the quantity that determines the particle's acceleration - how do you detect it? It looks exactly the same as BM - inherently unobservable. I accept it may not exist in a real sense but may be the codification of some sub-quantum process that does determine it - but the issue remains - how do you detect it? It still looks like an inherently unobservable aether to me. It doesn't seem to break symmetry like the aether does, but I don't really like its unobservability. You also have the usual issues such as its contextuality, which I dislike (I am a symmetry/invariance guy), how it accounts for creation and annihilation of particles, what happens to the 'pilot wave' when a particle is created and destroyed, its relation to the spin-statistics theorem, etc.

Thanks
Bill
 
  • #87


bhobba said:
But my initial reaction is: the quantity that determines the particle's acceleration - how do you detect it? It looks exactly the same as BM - inherently unobservable. I accept it may not exist in a real sense but may be the codification of some sub-quantum process that does determine it - but the issue remains - how do you detect it? It still looks like an inherently unobservable aether to me. It doesn't seem to break symmetry like the aether does, but I don't really like its unobservability.
You are right, the unobservability of quantum potential or wave function is not removed. I do not see that as a problem, but still ...

bhobba said:
You also have the usual issues such as its contextuality, which I dislike (I am a symmetry/invariance guy),
First, I would appreciate it if you could explain what symmetry (or the absence of it) has to do with contextuality. Second, in the post above I have explained how decoherence implies contextuality. Can you explain how you could possibly have both decoherence and non-contextuality?

bhobba said:
how it accounts for creation and annihilation of particles, what happens to the 'pilot wave' when a particle is created and destroyed, its relation to the spin-statistics theorem, etc.
These problems are solved:
http://xxx.lanl.gov/abs/0904.2287 [Int. J. Mod. Phys. A25:1477-1505, 2010]
 
  • #88


Demystifier said:
First, I would appreciate it if you could explain what symmetry (or the absence of it) has to do with contextuality. Second, in the post above I have explained how decoherence implies contextuality. Can you explain how you could possibly have both decoherence and non-contextuality?

I am an invariance/symmetry guy - my concern is with invariance, not symmetry per se. The issue is that non-contextuality implies what you are measuring is not influenced by what other stuff you are measuring with it at the same time. Specifically, given a resolution of the identity, the probability you assign to a projection operator is INVARIANT to the other operators in the resolution. If you assume that, then Gleason's theorem applies and you get the usual density matrix trace formula for probabilities. Decoherence is perfectly in accord with this since it uses the trace formula to derive it. As an aside, it may look like Gleason's theorem is magical - it's marvelous all right, and it really appeals to the mathematician in me - but magical it ain't - it merely shows what a strong assumption the innocent-looking non-contextuality assumption is. It also implies HV theories like BM must violate it.

This is not to say QM does not have aspects of contextuality, since what you measure determines what eigenstate the outcome will be in - and that, I am pretty sure, is the sense you mean it. My sense is at a more fundamental level.
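To spell out the non-contextuality assumption behind Gleason's theorem numerically (a minimal sketch with my own toy numbers): the trace-formula probability Tr(ρP) of a fixed projector P is computed without any reference to which resolution of the identity P is embedded in, and each resolution's probabilities still sum to 1.

```python
import numpy as np

def proj(psi):
    """Projector onto the (normalized) vector psi."""
    psi = psi / np.linalg.norm(psi)
    return np.outer(psi, psi.conj())

rho = np.diag([0.5, 0.3, 0.2]).astype(complex)   # some qutrit mixed state

# One fixed projector P, embedded in two different resolutions of the identity
P  = proj(np.array([1, 1, 0], dtype=complex))
Q1 = proj(np.array([1, -1, 0], dtype=complex))
Q2 = proj(np.array([0, 0, 1], dtype=complex))    # resolution A: {P, Q1, Q2}
R  = np.eye(3) - P                               # resolution B: {P, R}, R rank-2

def prob(op):
    """Born/trace-formula probability."""
    return np.trace(rho @ op).real

print(prob(P))                        # the same number in both resolutions
print(prob(P) + prob(Q1) + prob(Q2))  # 1.0 in resolution A
print(prob(P) + prob(R))              # 1.0 in resolution B
```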

Thanks
Bill
 
  • #89


mr. vodka said:
As I see it, that's all the measurement problem is, that's the core. [...] I regard any interpretation that does not solve this problem as deeply controversial.

i agree.

Elegance and Enigma: The Quantum Interviews.
Maximilian Schlosshauer.

Jefrey Bub We don’t really understand the notion of a quantum state, in
particular an entangled quantum state, and the peculiar role of measurement in taking
the description of events from the quantum level, where you have interference
and entanglement, to an effectively classical level where you don’t. In a 1935 article
responding to the EPR argument, Schrödinger characterized entanglement as “the
characteristic trait of quantum mechanics, the one that enforces its entire departure
from classical lines of thought.” I would say that understanding the nonlocality associated
with entangled quantum states, and understanding measurement, in a deep
sense, are still the most pressing problems in the foundations of quantum mechanics
today.

Sheldon Goldstein: I think it would be better, however, to respond to the following question: what have been the most pressing problems in the foundations of quantum mechanics?
And to this I suppose the standard answer is the measurement problem, or, more or
less equivalently, Schrödinger’s cat paradox.
If one accepts, however, that the usual quantum-mechanical description of the
state of a quantum system is indeed the complete description of that system, it seems
hard to avoid the conclusion that quantum measurements typically fail to have results.

Daniel Greenberger: For reasons I'll explain in my answer to the question
(see page 152), I don't think the measurement problem will be solvable soon, or possibly
ever. We will probably have to know more about nature for that.

Lucien Hardy: The most well-known problem in quantum foundations is the
measurement problem—our basic conception of reality depends on how we resolve
this. The measurement problem is tremendously important.

Anthony Leggett: To my mind, within the boundaries of "foundations of
quantum mechanics" strictly defined, there is really only one overarching problem: is
quantum mechanics the whole truth about the physical world? That is, will the textbook
application of the formalism—including the use of the measurement axiom…

Tim Maudlin: The most pressing problem today is the same as ever it was: to
clearly articulate the exact physical content of all proposed "interpretations" of the
quantum formalism. This is commonly called the measurement problem.

Lee Smolin: The measurement problem—that is to say, the fact that there are
two evolution processes, and which one applies depends on whether a measurement
is being made. Related to this is the fact that quantum mechanics does not give us a
description of what happens in an individual experiment.

Antony Valentini: The interpretation of quantum mechanics is a wide open
question… It would also be good to see further experiments
searching for wave-function collapse…

David Wallace: I think anyone's answer to this is going to depend above all on
what they think of the quantum measurement problem. After all, the measurement
problem threatens to make quantum mechanics incoherent as a scientific theory—to
reduce it, at best, to a collection of algorithms to predict measurement results. So the
only reason anyone could have not to put the measurement problem right at the top
of the list would be if they think it’s solvable within ordinary quantum mechanics.
(Someone who thinks it's solvable in some modified version of quantum mechanics—
in a dynamical-collapse or hidden-variables theory, say—ought to think that
the most pressing problem is generalizing that modified version to account for all of
quantum phenomena, including the phenomena of relativistic field theory.)
 
  • #90


mr. vodka said:
Maybe I should clarify my previous post: the argument you quoted stated that non-locality makes the principle of science, experimentation, impossible. But determinism does this too: the idea of an experiment relies on the principle of choice for the experimenter, that he can choose what he will measure, which is in contradiction with determinism. (This is actually the problem often stated in relation to superdeterminism, but personally I don't understand the difference between superdeterminism and determinism.)
Now that I look over that Einstein quote, I'm not sure the quote Fuchs uses to question non-locality actually makes his case. Consider again:
Further, it appears to be essential for this arrangement of the things introduced in physics that, at a specific time, these things claim an existence independent of one another, insofar as these things “lie in different parts of space.” Without such an assumption of the mutually independent existence (the “being-thus”) of spatially distant things, an assumption which originates in everyday thought, physical thought in the sense familiar to us would not be possible. Nor does one see how physical laws could be formulated and tested without such a clean separation...
Is Einstein arguing that non-locality makes experimentation impossible, or is non-separability his real concern? Not being able to individuate systems spatio-temporally, as per Einstein's quote, appears to have more to do with non-separability than with non-locality, I think?
 
