Why is the pilot-wave theory controversial? Is it?

  • Thread starter: nonequilibrium
  • Tags: Theory
  • #51


bhobba said:
I agree that there is disagreement on how real the potential that determines the position of the particle is in BM. But I do not agree this issue is not crucial - I believe you really must consider it real or how else does it determine the position of the particle.
First, the quantum potential determines acceleration, not position. But that's not important here.

What is important is to understand some analogies with classical mechanics. Is Hamiltonian H(p,x) real, for if not then how else it determines the motion of the particle? Is Newton potential V(x) real, for if not then how else it determines the motion of the particle? Is the solution S(x,t) of the Hamilton-Jacobi equation real, for if not then how else it determines the motion of the particle?

Well, if all these things in classical mechanics are real, then so is psi(x,t) in BM. In fact, psi(x,t) in BM is the most similar to S(x,t) in classical mechanics.
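
For readers following the analogy, the standard single-particle equations of the de Broglie-Bohm theory (written here in the usual textbook form, so treat the exact conventions as a sketch) make both points explicit. Writing \psi = R e^{iS/\hbar}:

\dot{x} = \frac{\nabla S}{m}   (guidance equation: psi determines the velocity)

\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0, \qquad Q = -\frac{\hbar^2}{2m}\frac{\nabla^2 R}{R}   (quantum Hamilton-Jacobi equation)

m\ddot{x} = -\nabla(V + Q)   (the quantum potential Q determines the acceleration)

When Q is negligible the first two equations reduce to the classical Hamilton-Jacobi theory, which is the sense in which psi(x,t) plays the role that S(x,t) plays classically.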
 
  • #52


bhobba said:
I would like to remind you that a mixed state is in a particular pure state prior to observation - we simply do not know which one. Schrodinger's cat is definitely alive or dead prior to observation with decoherence.
I think you misunderstood something about decoherence. (Which is surprising for someone who is reading Schlosshauer's excellent book.)
 
  • #53


Demystifier said:
What is important is to understand some analogies with classical mechanics. Is Hamiltonian H(p,x) real, for if not then how else it determines the motion of the particle? Is Newton potential V(x) real, for if not then how else it determines the motion of the particle? Is the solution S(x,t) of the Hamilton-Jacobi equation real, for if not then how else it determines the motion of the particle?

The Hamiltonian is not real - however from it you can derive a force that is presumably caused by some real agency - after all that is the meaning behind Newton's first law - as it stands it follows from the definition of force, which being a definition says nothing. The physics is the assumption the force is caused by something real.

I was a bit confused about this myself at one time - I thought Newton's laws were basically tautological rubbish - but I had a long discussion with John Baez about it and he explained what was really going on: it is basically a prescription that says get to the forces, where the forces are actually caused by something REAL. The same with the potential of BM - it codifies something REAL.

Thanks
Bill
 
  • #54


Demystifier said:
I think you misunderstood something about decoherence. (Which is surprising for someone who is reading Schlosshauer's excellent book.)

I don't think I do - but hey anything is possible - I am not perfect - far from it.

Decoherence results from considering the system, environment and measurement apparatus as a whole and is in a mixed state. Due to decoherence the off diagonal elements of the state and apparatus quickly go to zero by the leaking of the phase to the environment which means the pure states are now eigenstates of the measurement apparatus. An eigenstate is a state that gives that particular outcome - no ifs - no buts. A mixed state is in one of its pure states 100% for sure - we simply do not know which pure state it is.

If there is any error with the above feel free to correct it.

Thanks
Bill
 
Last edited:
  • #55


bhobba said:
Yea, but its ontological status is different. In most other interpretations (not all but most) of QM it's purely a device for calculating probabilities; in BM it is supposed to actually exist out there and have physical effects that guide the actual particle, which also exists - it is like the aether in LET - and most physicists reject it for the same reason the aether is rejected. Like I say, all interpretations suck in their own special way, and the existence of the pilot wave that cannot be detected is one way BM sucks.

Thanks
Bill

What does it mean to actually exist out there ... in phase space?
 
  • #56


bhobba said:
Decoherence results from considering the system, environment and measurement apparatus as a whole and is in a mixed state.
By "system", I guess you mean the measured subsystem, am I right? Then these three together as whole are in the pure state, not mixed state.

bhobba said:
Due to decoherence the off diagonal elements of the state and apparatus
You probably mean: system and apparatus?

bhobba said:
quickly go to zero by the leaking of the phase to the environment which means the pure states
Pure states of what? Of system? Of apparatus? Of environment? Of everything together?

bhobba said:
are now eigenstates of the measurement apparatus.
I don't even understand what it means. I know what is eigenstate of an operator, but I never heard about an "eigenstate of measurement apparatus".

bhobba said:
An eigenstate is a state that gives that particular outcome - no ifs - no buts.
Perhaps by "eigenstate" you mean one of macroscopically distinct states of the apparatus?

bhobba said:
A mixed state is in one of its pure states
This is an oxymoron. A mixed state cannot be in a pure state.

bhobba said:
If there is any error with the above feel free to correct it.
Is the above enough?
 
  • #57


martinbn said:
What does it mean to actually exists out there ... in phase space?

First, I am not an expert on BM - I only know the basics. But the particle is guided by a quantum potential which determines its acceleration. Its reality (or its status as a codification of something real) follows from Newton's first law, that some agency must be responsible for a force acting on a particle. I suppose you can claim it's not real and nothing causes the particle's acceleration - it simply happens because that's how nature works - but to me that is simply unacceptably weird - but hey, QM is pretty weird to begin with - but I don't think it's that weird.

Thanks
Bill
 
  • #58


Demystifier said:
By "system", I guess you mean the measured subsystem, am I right? Then these three together as whole are in the pure state, not mixed state.

By system I mean the system being measured.

Demystifier said:
Pure states of what? Of system? Of apparatus? Of environment? Of everything together?

The outcome of the observation is the quantum state of the tensor product of the system and apparatus. If you are, for example, measuring position it is in an eigenstate of position.

Demystifier said:
I don't even understand what it means. I know what is eigenstate of an operator, but I never heard about an "eigenstate of measurement apparatus".

Whoa - let's stop here. In every discussion of quantum decoherence I have read (as far as measurements are concerned) the measuring apparatus is itself considered a quantum system and what it indicates is considered a state of the apparatus. For example in Schrodinger's cat the cat (once observed) is considered to be in a quantum state that is either alive or dead. These are the measurement outcomes and are referred to as eigenstates, being the eigenstates of the operator corresponding to the measurement - also called the pointer basis. The tensor product of the system being measured and the apparatus is considered as a state represented by a density operator. The density operator is represented by a matrix using the possible states of the measurement apparatus (after observation ie the pointer basis) as the basis of that representation. Now what decoherence says is the off diagonal elements very quickly go to zero, so that the state of the system being measured is a mixed state of states that definitely give a specific measurement outcome. The interpretation of a mixed state is that it is in a specific state - in this case a state that gives a definite measurement outcome - but we only know the probabilities of what state that is.
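
To make the vanishing of the off diagonal elements concrete, here is a minimal numpy sketch (a toy two-level model of my own, not anything from Schlosshauer): one system branch gets correlated with one environment state, and the system's reduced density matrix loses its off diagonal element exactly as the environment records become orthogonal.

import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

def joint_state(overlap):
    # System starts in (|0> + |1>)/sqrt(2); each branch is correlated with an
    # environment state, and 'overlap' = <E0|E1> measures how distinguishable
    # the two environment records are.
    e0 = up
    e1 = overlap * up + np.sqrt(1.0 - overlap**2) * down
    return (np.kron(up, e0) + np.kron(down, e1)) / np.sqrt(2.0)

def reduced_density_matrix(psi):
    # Reshape into (system, environment) indices and trace out the environment.
    m = psi.reshape(2, 2)
    return m @ m.conj().T

for overlap in (1.0, 0.5, 0.0):
    rho = reduced_density_matrix(joint_state(overlap))
    print(overlap, rho[0, 1])   # off-diagonal element: about 0.5, 0.25, 0.0

Orthogonal environment records (overlap 0) leave a diagonal, mixed-looking density matrix in the pointer basis; what that mixture means is of course the point under dispute.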

Do you agree or disagree?

Thanks
Bill
 
Last edited:
  • #59


bhobba said:
By system I mean the system being measured.
Then, as I said, the system, apparatus, and environment together are in a PURE state, not mixed.

bhobba said:
The outcome of the observation is the quantum state of the tensor product of the system and apparatus.
OK, now it's clear.

bhobba said:
Whoa - let's stop here. In every discussion of quantum decoherence I have read (as far as measurements are concerned) the measuring apparatus is itself considered a quantum system and what it indicates is considered a state of the apparatus.
I agree, a state of the apparatus is a well-defined concept. But the EIGEN-state of the apparatus is not.

bhobba said:
For example in Schrodinger's cat the cat (once observed) is considered to be in a quantum state that is either alive or dead. These are the measurement outcomes and are referred to as eigenstates, being the eigenstates of the operator corresponding to the measurement.
That's not wrong, but it is not precise enough. In particular, if you determine whether the cat is dead or alive, it is not obvious what the corresponding OPERATOR being measured is. And without an operator, the word "eigenstate" does not have a meaning.

bhobba said:
The tensor product of the system being measured and the apparatus is considered as a state represented by a density operator. The density operator is represented by a matrix using the possible states of the measurement apparatus (after observation) as the basis of that representation. Now what decoherence says is the off diagonal elements very quickly go to zero, so that the state of the system being measured is a mixed state of states that definitely give a specific measurement outcome.
That's OK, but ...

bhobba said:
The interpretation of a mixed state is that it is in a specific state - in this case a state that gives a definite measurement outcome - but we only know the probabilities of what state that is.
... is not OK. Even Schlosshauer explains that such an interpretation is not appropriate. One reason is because the TOTAL system (measured system, apparatus AND environment together) is not in a mixed state. There are other reasons as well.

bhobba said:
Do you agree or disagree?
I agree that measured system and apparatus are in a mixed state, but as I said, I disagree that this fact alone is sufficient to conclude that "they are in a definite pure state but we only don't know which one".
 
  • #60


Demystifier said:
... is not OK. Even Schlosshauer explains that such an interpretation is not appropriate. One reason is because the TOTAL system (measured system, apparatus AND environment together) is not in a mixed state. There are other reasons as well.

But the measured system and apparatus together are in a mixed state, with diagonal elements in the pointer basis and off diagonal elements zero. The total system including the environment continues to evolve unitarily - the measured system and apparatus do not - they have become entangled with the environment.

Demystifier said:
I agree that measured system and apparatus are in a mixed state, but as I said, I disagree that this fact alone is sufficient to conclude that "they are in a definite pure state but we only don't know which one".

I am scratching my head about that one. By definition a mixed state is an ensemble of states. A mixed state is exactly the same as randomly selecting one of the pure states from this ensemble of states. This is exactly the same as the system being in one of the pure states and only knowing a probability of which one it is - observationally it is indistinguishable. Every single book I have ever read on QM, and believe me I have read a few, has defined mixed states that way.

Just to ensure I am not going crazy I did an internet search and easily found a previous thread:
https://www.physicsforums.com/showthread.php?t=260622
'Seriously, a mixed state is an ensemble description. In fact, one of the peculiar things about the interplay between mixed state statistics and quantum statistics is that considering particles in a "mixed state" is indistinguishable from considering them in a randomly drawn pure state if that random drawing gives a statistically equivalent description as the mixed state.'

What exactly with the above do you not agree with?

Thanks
Bill
 
  • #61


bhobba said:
By definition a mixed state is an ensemble of states.
No, that's not the definition of a mixed state, as explained even in the Schlosshauer book.

bhobba said:
A mixed state is exactly the same as randomly selecting one of the pure states from this ensemble of states.
No it isn't.

bhobba said:
Every single book I have ever read on QM, and believe me I have read a few, has defined mixed states that way.
You said you are reading the Schlosshauer book. Then please read Sec. 2.4.4 of it. Here is a quote from that section (bolding is mine):
"In Sect. 2.4.2 above we discussed how the notion of a mixed state is based on
a classical probability concept. Accordingly, one also says that a mixed-state
density matrix (2.20) represents an ignorance-interpretable (proper) mixture
of pure states [47–49],12 in order to express the fact that a mixed-state density
matrix of the form (2.20) can, to some extent, be interpreted as a classical
probability distribution of pure quantum states. However, this is only
true if we actually know that the system has indeed been prepared in one
of the states, but we simply do not possesses more specific information
about which of these states has been prepared. On the other hand, if we are
simply confronted with the density matrix (2.20) but are given no further
information (e.g., about the preparation procedure), we cannot infer that the
system actually is in one of the states.
This is so because any nonpure
density matrix can be written in many different ways, which shows that
any partition into a particular ensemble of quantum states is arbitrary. In
other words, the mixed-state density matrix alone does not suffice to uniquely
reconstruct a classical probability distribution of pure states.
"

You also said that your research area is quantum information. In that case I would recommend you to read the textbook
B. Schumacher, B. Westmoreland: Quantum Processes, Systems and Information
which is an exceptionally good general introduction to QM (including the meaning of mixed states) with emphasis on applications to quantum information.

Other highly recommended books on QM, with CORRECT explanation of mixed states, are:
- L. Ballentine: Quantum Mechanics - A Modern Development
- B. d'Espagnat: Conceptual Foundations of Quantum Mechanics

bhobba said:
What exactly with the above do you not agree with?
I think it's clear now.
 
Last edited:
  • #62
mr. vodka said:
One minor one: you seem to imply Valentini favours interpreting the quantum potential as a causal agent (instead of just a handy mathematical similarity with the Hamilton-Jacobi formalism of classical mechanics). However, I remember reading some passages of his work where he definitely implies the reverse, more in the line of what you seem to call the minimalist Bohmian interpretation.
As I understand him, Valentini tries to tread somewhere in between the minimalist (DGZ) approach and the Bohm/Hiley one. He is critical of both views. For instance he quotes his own paper where he criticizes the DGZ approach of treating ψ as nomological:
Valentini considered the possibility that ψ might merely provide a convenient mathematical summary of the motion q(t); to this end, he drew an analogy between ψ and physical laws such as Maxwell's equations, which also provide a convenient mathematical summary of the behaviour of physical systems. On this view, `the world consists purely of the evolving variables X(t), whose time evolution may be summarised mathematically by ψ' (ibid., p. 13). But Valentini argued further that such a view did not do justice to the physical information stored in ψ, and he concluded instead that ψ was a new kind of causal agent acting in configuration space (a view that the author still takes today). The former view, that ψ is law-like, was adopted by Durr et al. (1997).

De Broglie-Bohm Pilot-Wave Theory: Many Worlds in Denial?
http://www.tcm.phy.cam.ac.uk/~mdt26/local_papers/valentini_2008_denial.pdf

He repeats this theme in this video, where he suggests that configuration space is "real" (like Albert, it seems) and argues that the quantum wave is a new type of "causal" agent that may take some time for us to understand, in the same way scientists had difficulties accepting the concept of "fields" when it was first introduced. So he sees an evolution (see slides) from forces to fields to this non-local quantum wave (which does not vary with distance and appears to be completely unaffected by matter in between). So in his scheme, the configuration space is always there, and in it the pilot wave (a radically new kind of causal agent that is more abstract than conventional forces or fields in 3-D space) propagates.

The nature of the wave function in de Broglie's pilot-wave theory
http://streamer.perimeterinstitute.ca/Flash/3f521d41-f0a9-4e47-a8c7-e1fd3a4c63c8/viewer.html

At the same time Valentini argues against the quantum potential:
...Bohm's systematic development of the pilot-wave theory in 1952 was presented in the unfortunate guise of a quasimechanical theory with a 'quantum potential'. We propose an abandonment of all such mechanical ideas, and suggest instead that the notion of guiding field be taken as fundamental and irreducible: The rate of change of all variables is given by the gradient or functional derivative of S, with no need for further explanation...[We] suggest that attempts at an explanation in terms of mechanical concepts are more naturally seen as entirely derivative, arising phenomenologically from statistical equilibrium and in particular from the classical limit of equilibrium.
I'm not sure that Bohm's quantum potential is "mechanical", especially when you read his and Hiley's papers. Belousek's paper also questions Valentini's criticism of the quantum potential, but for slightly different reasons (see p. 144). For an interesting read on this Bohmian debate from the Bohm/Hiley perspective:

From the Heisenberg Picture to Bohm: a New Perspective on Active Information and its relation to Shannon Information.
http://www.bbk.ac.uk/tpru/BasilHiley/Vexjo2001W.pdf
 
Last edited by a moderator:
  • #63


Demystifier said:
You also said that your research area is quantum information. In that case I would recommend you to read the textbook
B. Schumacher, B. Westmoreland: Quantum Processes, Systems and Information
which is an exceptionally good general introduction to QM (including the meaning of mixed states) with emphasis on applications to quantum information.

No I didn't - I think you have me confused with someone else - probably DrewD. In fact if I was to list my interests quantum information theory would not even rate a mention.

Demystifier said:
Other highly recommended books on QM, with CORRECT explanation of mixed states, are:
- L. Ballentine: Quantum Mechanics - A Modern Development
- B. d'Espagnat: Conceptual Foundations of Quantum Mechanics

I have Ballentine's book - in fact it's my favorite and go-to book. The mixed state is defined on page 54 but discussed mathematically for a few pages before that. It is true that mixed states do not uniquely determine the pure states they can be decomposed into, but my understanding is that the decomposition is determined by the pointer basis used, which, in the case of the measurement problem, is uniquely determined by the possible outcomes of the measurement. The examples given a bit later in section 2.4.4 to illustrate the point are mutually exclusive states, in that an observational apparatus cannot be designed to register both at the same time.

But maybe I am misinterpreting what you are trying to say - have I missed something?

I am about to head off to bed now, but before doing that I want to mention that Schlosshauer states, correctly, that decoherence does not solve the measurement problem. All it does is give the appearance of doing so - as I like to say, it solves it for all practical purposes - but in reality it doesn't. Still, I think the main issues are resolved.

Thanks
Bill
 
Last edited:
  • #64


Demystifier said:
No, that's not the definition of a mixed state, as explained even in the Schlosshauer book.
[...]
I think it's clear now.

very precise mr. demystifier.
 
  • #65


audioloop said:
very precise mr. demystifier.

Yea - but non-uniqueness of a mixed state is not an issue, because it is uniquely decomposable into the pointer basis defined by the observational device. As Schlosshauer states on pages 112 and 113, the measurement problem has 3 parts:

1. The preferred basis problem (what determines the preferred physical quantities of our experience)
2. The problem of nonobservability of interference (why is it so hard to observe interference effects)
3. The problem of outcomes (why do measurements seem to have outcomes at all and what selects a particular observed outcome)

As we have indicated in this chapter and will discuss further in other places in this book it is reasonable to conclude decoherence is capable of solving the first two problems, whereas the third problem is intrinsically linked to matters of interpretation.

Demystifier seems to doubt point 1 - but I do not agree, nor it would seem does Schlosshauer. The issue is point 3 and I agree entirely - decoherence does not explain why a particular outcome is observed. Being decohered into a mixed state you cannot predict which pure state will be selected, but you know one will be; you can assume it was part of an ensemble, and in that state prior to observation.

Added later:

I had in the back of my mind where Schlosshauer had written explicitly decoherence implied the system can be viewed as being in the observed state and I managed to dig up the paper:
http://arxiv.org/pdf/quant-ph/0312059v4.pdf
'The reduced density matrix looks like a mixed-state density matrix because, if one actually measured an observable of the system, one would expect to get a definite outcome with a certain probability; in terms of measurement statistics, this is equivalent to the situation in which the system is in one of the states from the set of possible outcomes from the beginning, that is, before the measurement. As Pessoa (1998, p. 432) puts it, “taking a partial trace amounts to the statistical version of the projection postulate.”'

Thanks
Bill
 
Last edited:
  • #66


bhobba said:
Yea - but non-uniqueness of a mixed state is not an issue, because it is uniquely decomposable into the pointer basis defined by the observational device. As Schlosshauer states on pages 112 and 113, the measurement problem has 3 parts:

1. The preferred basis problem (what determines the preferred physical quantities of our experience)
2. The problem of nonobservability of interference (why is it so hard to observe interference effects)
3. The problem of outcomes (why do measurements seem to have outcomes at all and what selects a particular observed outcome)

As we have indicated in this chapter and will discuss further in other places in this book it is reasonable to conclude decoherence is capable of solving the first two problems, whereas the third problem is intrinsically linked to matters of interpretation.

Demystifier seems to doubt point 1 - but I do not agree, nor it would seem does Schlosshauer. The issue is point 3 and I agree entirely - decoherence does not explain why a particular outcome is observed. Being decohered into a mixed state you cannot predict which pure state will be selected, but you know one will be; you can assume it was part of an ensemble, and in that state prior to observation.

Of course I could have missed something or made a mistake and am only too happy for others' comments.
I agree that decoherence solves problems 1 and 2.

As you said the real issue is problem 3. But the problem is not only that we don't know which pure state will be selected. The problem is that we don't even know why some pure state will be selected at all.

In my opinion, problem 3 can be solved only by extending minimal QM with some non-minimal interpretation. For that purpose, the many-worlds interpretation and the Bohmian interpretation seem most appropriate. Of course, as you said, these interpretations may "suck" for other reasons, but for solving problem 3 they are excellent.
 
Last edited:
  • #67


DrewD said:
You are right, I will not start using Bohmian mechanics because my research is in quantum information and nobody uses this interpretation because it provides nothing new (that's actually not true, but there has been no experimental confirmation of the one new prediction that I know of). It might be a nice way to think about things, and I often do imagine quantum particles to be somewhat like the Bohm interpretation, but until there is a reason to follow a less common interpretation that would make reading QM papers more difficult, you are correct.
bhobba said:
No I didn't - I think you have me confused with someone else - probably DrewD. In fact if I was to list my interests quantum information theory would not even rate a mention.
Bhobba you are right, sorry! Anyway, I would recommend the mentioned book even to those who are not particularly interested in applications to quantum information.
 
Last edited:
  • #68


Demystifier said:
I agree that decoherence solves problems 1 and 2. As you said the real issue is problem 3. But the problem is not only that we don't know which pure state will be selected. The problem is that we don't even know why some pure state will be selected at all. In my opinion, problem 3 can be solved only by extending minimal QM with some non-minimal interpretation. For that purpose, the many-worlds interpretation and the Bohmian interpretation seem most appropriate. Of course, as you said, these interpretations may "suck" for other reasons, but for solving problem 3 they are excellent.

I fully concur. I personally don't think it's an issue to worry about, but if you do then the interpretations you suggest would indeed seem the most appropriate.

Thanks
Bill
 
  • #69


Demystifier said:
Bhobba you are right, sorry! Anyway, I would recommend the mentioned book even to those who are not particularly interested in applications to quantum information.

No problem. And thanks for the reference - but I have so much reading to do it's embarrassing - I have been meaning to go a bit deeper into String Theory for a while now.

Thanks
Bill
 
  • #70


bhobba said:
No problem. And thanks for the reference - but I have so much reading to do it's embarrassing - I have been meaning to go a bit deeper into String Theory for a while now.
We have quite similar interests. My favored books on string theory are (in the order suitable for pedagogic reading):
- B. Zwiebach, A First Course in String Theory
- R.J. Szabo, http://xxx.lanl.gov/abs/hep-th/0207142
- M. Kaku, Introduction to Superstrings and M-theory
 
  • #71


Demystifier said:
We have quite similar interests. My favored books on string theory are (in the order suitable for pedagogic reading):
- B. Zwiebach, A First Course in String Theory
- R.J. Szabo, http://xxx.lanl.gov/abs/hep-th/0207142
- M. Kaku, Introduction to Superstrings and M-theory

Got those and a few others as well. Went through Zwiebach a while ago, but that was only a first reading - really need to do a much deeper perusal.

Thanks
Bill
 
  • #72


bhobba said:
Being decohered into a mixed state you cannot predict which pure state will be selected, but you know one will be; you can assume it was part of an ensemble, and in that state prior to observation.

This is the central point of your argument and it contains a misconception. There are two distinct uses for the density operator:

First, it is used to capture all information available in a tensor factor space about a quantum state. The result is a description of a single quantum state with a certain information constraint in place. If you assume the measurement postulate for quantum states you can derive a version for density operators from it and it turns out to work fine despite the removed information.

The second use is for describing ensembles of quantum states. However, this is much less obvious than usually suggested. The naïve way to describe an ensemble of states would be a list of states with their associated probabilities. Reducing this to the convex sum of state projectors throws away a lot of information about the actual ensemble and requires a very good reason to do so. And this reason is that applying the measurement law constructed for the tensor-factor-space density operator from above to the density operator built from an ensemble gives the right answer, if you consider classical probabilities for the states in the ensemble. This means that the use of a density operator to describe classical ensembles already assumes the measurement postulate and the Born rule.
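
To put the two uses side by side, here is a toy numpy comparison (my own sketch, under the simplest possible assumptions): the reduced density operator of one half of an entangled pure state is numerically identical to the density operator built from a classical 50/50 ensemble, even though the physical situations being described are different.

import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# First use: trace one qubit out of the Bell state (|00> + |11>)/sqrt(2).
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2.0)
m = bell.reshape(2, 2)
rho_improper = m @ m.conj().T

# Second use: a coin flip prepares |0> or |1> with probability 1/2 each.
rho_proper = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

print(np.allclose(rho_improper, rho_proper))   # True: identical operators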

Now if you identify these two very different objects you cannot possibly derive any properties of quantum measurement, because the Born rule is implicitly used. That leaves decoherence as a theory that explains the lack of macroscopic interference, but does not solve the measurement problem.

In my opinion the fundamental idea of decoherence is still valid though. If we want to solve the measurement problem we should look at how the environment interacts with a quantum system and how our subjective information changes with that interaction. For an approach that goes beyond decoherence please see http://arxiv.org/abs/1205.0293
 
  • #73


It's time to get back to the subject: why is pilot-wave theory considered controversial?

In my experience, the most frequent reason is that people who criticize pilot-wave theory do not understand interpretation-independent aspects of quantum measurements, such as the role of decoherence. For that purpose, I would suggest reading
Sec. 2: "Essential and inessential aspects of Bohmian interpretation"
of my paper http://xxx.lanl.gov/abs/1112.2034
 
  • #74


Jazzdude said:
This means that the use of a density operator to describe classical ensembles already assumes the measurement postulate and the Born rule.

Can't say I understand your entire post, but I do agree with the above and it is a problem. But I believe Gleason's Theorem has something to say about it.

This is getting way off topic so I don't want to really go more into it in this thread. But if you want to start a new thread I will be happy to contribute to it.

Thanks
Bill
 
  • #75


bhobba said:
I fully concur. I personally don't think it's an issue to worry about, but if you do then the interpretations you suggest would indeed seem the most appropriate.

Thanks
Bill

But how can you not worry about it? As I see it, that's all the measurement problem is, that's the core (sure basis selection is also important but that's A) solved by decoherence and B) frankly IMO less fundamental than the remaining problem).

Let me put it this way: I suppose you can acknowledge that there is a jump from a superposition to a specific eigenstate, and that decoherence can not explain actually getting only one eigenstate, so how can you not find this an important issue in a fundamental theory? You can't just acknowledge a problem and deny it needs a solution.

Indeed, the debate went a bit off-topic I see, and my question is perhaps still off-topic, but it's also on-topic in the following sense: I regard any interpretation that does not solve this problem deeply controversial.
 
  • #76


I'm not sure if anyone has had a read through Chris Fuchs's interview, but as this is related to the topic of the thread, he writes pretty clearly why he dismisses any model that is non-local (i.e. why pilot-wave theory is controversial):
But if there is indeed a choice, why does QBism hold so desperately to locality while eschewing the idea of predetermined measurement values? The biggest reward of course is that it gives the option to explore “it’s a wonderful life,” but one can give more strictly academic arguments. Einstein, for one, did it very well:
Then he goes on to quote one of Einstein's famous quotes on this topic:
Further, it appears to be essential for this arrangement of the things introduced in physics that, at a specific time, these things claim an existence independent of one another, insofar as these things “lie in different parts of space.” Without such an assumption of the mutually independent existence (the “being-thus”) of spatially distant things, an assumption which originates in everyday thought, physical thought in the sense familiar to us would not be possible. Nor does one see how physical laws could be formulated and tested without such a clean separation...The complete suspension of this basic principle would make impossible the idea of (quasi-)closed systems and, thereby, the establishment of empirically testable laws in the sense familiar to us.
Interview with a Quantum Bayesian
http://lanl.arxiv.org/pdf/1207.2141.pdf

Also, it seems his article "QT needs no interpretation" was titled that way partly to get attention. It seems his views on the topic are quite the opposite (see p. 5).
 
Last edited:
  • #77


Can't the same objection be made about determinism?
 
  • #78


I would have guessed that non-locality is the biggest objection, but maybe I'm mistaken? But I'm not sure, because I don't buy Fuchs's argument that the issue of determinism/indeterminism in physics has any bearing on the issue of free will. Here is an interesting paper by Gisin that seems to take your position, I think:
What is surprising is that so many good physicists interpret the violation of Bell’s inequality as an argument against realism. Apparently their hope is to thus save locality, though I have no idea what locality of a non-real world could mean? It might be interesting to remember that no physicist before the advent of relativity interpreted the instantaneous action at a distance of Newton’s gravity as a sign of non-realism (although Newton’s nonlocality is even more radical than quantum nonlocality, as it allowed instantaneous signaling). Hence, it seems that the issue is not nonlocality, but non-determinism. In this note I argued that non-determinism and true randomness are not in contradiction with realism: propensities reflect pre-existing properties, but the reflection is not deterministic. There is thus no conflict between realism and an open future: the future might not (yet) be real, but the process by which the future becomes actual is undoubtedly real.
Is realism compatible with true randomness?
http://lanl.arxiv.org/pdf/1012.2536.pdf
 
  • #79


I agree with what Gisin says until the bold text; from then on I'm not sure what he's talking about. (You sure do whip out those articles quickly, I wonder how you do that.)

Maybe I should clarify my previous post: the argument quoted by you stated that non-locality makes the principle of science, experimentation, impossible. But determinism does this too: the idea of an experiment relies on the principle of choice for the experimenter, that he can choose what he will measure, which is in contradiction with determinism. (This is actually the problem often stated in relation to superdeterminism, but personally I don't understand the difference between superdeterminism and determinism.)
 
  • #80


mr. vodka said:
But how can you not worry about it? As I see it, that's all the measurement problem is, that's the core (sure basis selection is also important but that's A) solved by decoherence and B) frankly IMO less fundamental than the remaining problem).

Let me put it this way: I suppose you can acknowledge that there is a jump from a superposition to a specific eigenstate, and that decoherence can not explain actually getting only one eigenstate, so how can you not find this an important issue in a fundamental theory? You can't just acknowledge a problem and deny it needs a solution.

Indeed, the debate went a bit off-topic I see, and my question is perhaps still off-topic, but it's also on-topic in the following sense: I regard any interpretation that does not solve this problem deeply controversial.

Mate, it's simply how nature works. Your problem is you want it to conform to how you want it to work - sorry, science is not like that. You can worry about issues like that if you like - but the issue lies within you - not nature. The same with me - the issues I worry about, which are along the lines of invariance and symmetry, lie within me - nature doesn't care. This leads me to reject interpretations like BM that imply a preferred frame and so break symmetry - that's one way to me BM sucks and why I think it's a crock. All interpretations suck - it's part of your choice of interpretation whether it answers the questions you worry about. I personally have zero problem with a scheme that explains collapse but can only predict probabilities of what it collapses into. I don't care if nature is fundamentally probabilistic - it doesn't worry me in the least.

Demystifier understands this and correctly points out why he likes the interpretations he does, but also recognizes it may not gel with others' view of the situation.

Thanks
Bill
 
  • #81


No bhobba, you misunderstand me. I'm simply pointing out that there seems to be a gap in the way you see things: one moment you have a superposition, the next moment a collapsed state. I'm not talking about understanding this intuitively or something, I'm just saying: something in your theory must account for this. Copenhagen, I suppose, accounts for this by taking collapse to be a separate fundamental axiom, i.e. the wavefunction does not always obey the Schrödinger equation, sometimes it obeys the collapse law. Is that the view you're taking? For now, I haven't seen you taking a stance on the matter, not even "taking collapse as a fundamental separate act". That's all I'm talking about.
 
  • #82


mr. vodka said:
No bhobba, you misunderstand me. I'm simply pointing out that there seems to be a gap in the way you see things: one moment you have a superposition, the next moment a collapsed state. I'm not talking about understanding this intuitively or something, I'm just saying: something in your theory must account for this. Copenhagen, I suppose, accounts for this by taking collapse to be a separate fundamental axiom, i.e. the wavefunction does not always obey the Schrödinger equation, sometimes it obeys the collapse law. Is that the view you're taking? For now, I haven't seen you taking a stance on the matter, not even "taking collapse as a fundamental separate act". That's all I'm talking about.

Let's be clear - with decoherence you do not one moment have a superposition and the next another state - it decoheres quickly - but not instantaneously. There are issues with it, but that is not one of them.

Theories do not have to account for issues like the wavefunction collapse - they can simply accept it as an axiom. If you choose to worry about it the issue lies within you - not the theory.

The view I have is that Gleason's theorem puts severe limits on the type of theories you can have with the superposition principle and non-contextuality. It spells out that determinism is not compatible with it and all you can predict is probabilities, and those probabilities use the standard density matrix formalism. It does not explain how observations change states - merely that all you can predict is probabilities. Using this as a start you can derive decoherence, which explains collapse IMHO. You don't think it explains it because it doesn't tell you why a particular outcome is selected - simply the probability. But Gleason's theorem says you can't do that - it is probabilistic at its core - the only out is non-contextuality, which I find very hard to swallow - just like you seem to find not being able to predict which state it collapses into hard to swallow.

Thanks
Bill
 
  • #83


You seem to be still misunderstanding me on multiple levels, I'm not sure why since I feel I've literally stated my stance on the matter, but here's one more go:

Theories do not have to account for issues like the wavefunction collapse - they can simply accept it as an axiom. If you choose to worry about it the issue lies within you - not the theory.
Accepting it as an axiom is accounting for it. It wasn't clear to me that was your solution. That's why I asked in my previous post "Copenhagen [...] collapse to be a separate fundamental axiom [...] Is that the view you're taking?". In my view there are still issues with that (mostly: isn't the collapse arbitrary then?) but okay that'd perhaps get off-topic and at least your stance is clear now.

just like you seem to find not being able to predict which state it collapses into hard to swallow.
I'm not having trouble not being able to predict which state it collapses into, as Demystifier also stated, more with the idea that there is fundamental collapse at all, as in there being a mechanism for it. Of course, taking it as a separate axiom eliminates any need for a mechanism, although in my view it does create a problem of "when do I invoke this separate axiom?". But again, would be off-topic I suppose.

Let's be clear - with decoherence you do not one moment have a superposition and the next another state - it decoheres quickly - but not instantaneously.

As you said yourself, earlier, decoherence does not explain the collapse. It explains the basis selection and the non-interference, but not going from a superposition to one of those eigenstates. I thought this was settled after the discussion you and Demystifier had, i.e. that "problem 3" remained unsolved by decoherence.
 
  • #84


bhobba said:
This leads me to reject interpretations like BM that imply a preferred frame and so break symmetry - that's one way to me BM sucks and why I think it's a crock.
I also don't like a preferred frame, and that's why I developed a variant of BM without preferred frame:
http://xxx.lanl.gov/abs/1002.3226 [Int. J. Quantum Inf. 9 (2011) 367-377]

I would like to see your opinion on that. In particular, why THAT interpretation "sucks"?
 
  • #85


bhobba said:
But Gleason's theorem says you can't do that - it is probabilistic at its core - the only out is non-contextuality, which I find very hard to swallow
I hope you only misexpressed yourself and did not misunderstand it, because the only way out of Gleason's theorem is contextuality, not non-contextuality. Indeed, Bohmian mechanics (BM) is contextual and, in the absence of measurements, assigns probabilities that differ from
|psi|^2 (except for probabilities of particle positions, which are always given by |psi|^2 in BM). And yet, when a measurement is performed, BM explains clearly (by using the theory of decoherence) how probabilities evolve into |psi|^2.

Also, I never understood why so many people find contextuality so difficult to swallow. Contextuality merely says that properties change by the act of measurement. Since any measurement involves interaction with the measured system, I see nothing surprising about the fact that measurement changes the system. Indeed, even without adopting any particular interpretation, decoherence explains very clearly how the wave function of the system changes by measurement. Decoherence itself is a manifestation of contextuality. Even more, decoherence is the core of contextuality. To me, it makes no sense to accept decoherence without accepting contextuality.
 
  • #86


Demystifier said:
I also don't like a preferred frame, and that's why I developed a variant of BM without preferred frame:
http://xxx.lanl.gov/abs/1002.3226 [Int. J. Quantum Inf. 9 (2011) 367-377]

I would like to see your opinion on that. In particular, why THAT interpretation "sucks"?

Had a quick squiz and personally am not concerned about the FTL stuff - relativity does not really care about FTL - what it cares about is sending information FTL.

But my initial reaction is the quantity that determines the particle's acceleration - how do you detect it? It looks exactly the same as BM - inherently unobservable. I accept it may not exist in a real sense but may be the codification of some sub-quantum process that does determine it - but the issue remains - how do you detect it? It still looks like an inherently unobservable aether to me. It doesn't seem to break symmetry like the aether does, but its unobservability I don't really like. You also have the usual issues such as its contextuality which I dislike (I am a symmetry invariance guy), how it accounts for creation and annihilation of particles, what happens to the 'pilot wave' when a particle is created and destroyed, its relation to the spin statistics theorem etc.

Thanks
Bill
 
  • #87


bhobba said:
But my initial reaction is the quantity that determines the particle's acceleration - how do you detect it? It looks exactly the same as BM - inherently unobservable. I accept it may not exist in a real sense but may be the codification of some sub-quantum process that does determine it - but the issue remains - how do you detect it? It still looks like an inherently unobservable aether to me. It doesn't seem to break symmetry like the aether does, but its unobservability I don't really like.
You are right, the unobservability of quantum potential or wave function is not removed. I do not see that as a problem, but still ...

bhobba said:
You also have the usual issues such as its contextuality which I dislike (I am a symmetry invariance guy),
First, I would appreciate if you could explain what symmetry (or absence of it) has to do with contextuality. Second, in the post above I have explained how decoherence implies contextuality. Can you explain how could you possibly have both decoherence and non-contextuality?

bhobba said:
how it accounts for creation and annihilation of particles, what happens to the 'pilot wave' when a particle is created and destroyed, its relation to the spin statistics theorem etc.
These problems are solved:
http://xxx.lanl.gov/abs/0904.2287 [Int. J. Mod. Phys. A25:1477-1505, 2010]
 
  • #88


Demystifier said:
First, I would appreciate if you could explain what symmetry (or absence of it) has to do with contextuality. Second, in the post above I have explained how decoherence implies contextuality. Can you explain how could you possibly have both decoherence and non-contextuality?

I am an invariance symmetry guy - it's concerned with invariance - not symmetry. The issue is non-contextuality implies what you are measuring is not influenced by what other stuff you are measuring with it at the same time. Specifically, given a resolution of the identity, the probability you assign to a projection operator is INVARIANT to the other operators in the resolution. If you assume that then Gleason's Theorem applies and you get the usual density matrix trace formula for probabilities. Decoherence is perfectly in accord with this since it uses the trace formula to derive it. As an aside, it may look like Gleason's theorem is magical - it's marvelous all right and it really appeals to the mathematician in me - but magical it ain't - it merely shows what a strong assumption the innocent-looking non-contextuality assumption is. It also implies HV theories like BM must violate it.
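
For reference, the statement being invoked is roughly the following (the standard textbook formulation, quoted from memory, so take the exact technical conditions as a sketch): for a Hilbert space of dimension at least 3, any probability assignment \mu on projection operators that is additive over every resolution of the identity - and that depends only on the projector itself, not on which resolution it belongs to, which is where non-contextuality enters - must take the trace form:

\mu(P) \ge 0, \quad \mu(I) = 1, \quad \mu\Big(\sum_i P_i\Big) = \sum_i \mu(P_i) \ \text{for mutually orthogonal } P_i
\;\Longrightarrow\; \mu(P) = \mathrm{Tr}(\rho P) \ \text{for some density operator } \rho.

So the Born-rule trace formula is forced once that innocent-looking assumption is granted.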

This is not to say QM does not have aspects of contextuality since what you measure determines what eigenstate the outcome will be in and that I am pretty sure is the sense you mean it - my sense is at a more fundamental level.

Thanks
Bill
 
Last edited:
  • #89


mr. vodka said:
As I see it, that's all the measurement problem is, that's the core ... I regard any interpretation that does not solve this problem deeply controversial.

i agree.

Elegance and Enigma: The Quantum Interviews.
Maximilian Schlosshauer.

Jefrey Bub We don’t really understand the notion of a quantum state, in
particular an entangled quantum state, and the peculiar role of measurement in taking
the description of events from the quantum level, where you have interference
and entanglement, to an effectively classical level where you don’t. In a 1935 article
responding to the EPR argument, Schrödinger characterized entanglement as “the
characteristic trait of quantum mechanics, the one that enforces its entire departure
from classical lines of thought.” I would say that understanding the nonlocality associated
with entangled quantum states, and understanding measurement, in a deep
sense, are still the most pressing problems in the foundations of quantum mechanics
today.

Sheldon Goldstein I think it would be better, however, to respond to the following question: what have been the most pressing problems in the foundations of quantum mechanics?
And to this I suppose the standard answer is the measurement problem, or, more or
less equivalently, Schrödinger’s cat paradox.
If one accepts, however, that the usual quantum-mechanical description of the
state of a quantum system is indeed the complete description of that system, it seems
hard to avoid the conclusion that quantum measurements typically fail to have results.

Daniel Greenberger For reasons I’ll explain in my answer to the Question
(see page 152), I don’t think the measurement problem will be solvable soon, or possibly
Ever. We will probably have to know more about nature for that.

Lucien Hardy the most well-known problem in quantum foundations is the
measurement problem—our basic conception of reality depends on how we resolve
this. the measurement problem is tremendously important.

Anthony Legget To my mind, within the boundaries of “foundations of
quantum mechanics” strictly defined, there is really only one overarching problem: is
quantum mechanics the whole truth about the physical world? that is, will the textbook
application of the formalism—including the use of the measurement axiom.

Tim Maudlin the most pressing problem today is the same as ever it was: to
clearly articulate the exact physical content of all proposed “interpretations” of the
quantum formalism. this is commonly called the measurement problem.

Lee Smolin the measurement problem—that is to say, the fact that there are
two evolution processes, and which one applies depends on whether a measurement
is being made. Related to this is the fact that quantum mechanics does not give us a
description of what happens in an individual experiment.

Antony Valentini the interpretation of quantum mechanics is a wide open
Question… ..It would also be good to see further experiments
searching for wave-function collapse…

David Wallace I think anyone’s answer to this is going to depend above all on
what they think of the quantum measurement problem. After all, the measurement
problem threatens to make quantum mechanics incoherent as a scientific theory—to
reduce it, at best, to a collection of algorithms to predict measurement results. So the
only reason anyone could have not to put the measurement problem right at the top
of the list would be if they think it’s solvable within ordinary quantum mechanics.
(Someone who thinks it’s solvable in some modifed version of quantum mechanics—
in a dynamical-collapse or hidden-variables theory, say—ought to think that
the most pressing problem is generalizing that modified version to account for all of
quantum phenomena, including the phenomena of relativistic feld theory.)
 
Last edited:
  • #90


mr. vodka said:
Maybe I should clarify my previous post: the argument quoted by you stated that non-locality makes the principle of science, experimentation, impossible. But determinism does this too: the idea of an experiment relies on the principle of choice for the experimenter, that he can choose what he will measure, which is in contradiction with determinism. (This is actually the problem often stated in relation to superdeterminism, but personally I don't understand the difference between superdeterminism and determinism.)
Now that I look over it again, I'm not sure that the Einstein quote used by Fuchs to question non-locality is apt for that purpose. Consider again:
Further, it appears to be essential for this arrangement of the things introduced in physics that, at a specific time, these things claim an existence independent of one another, insofar as these things “lie in different parts of space.” Without such an assumption of the mutually independent existence (the “being-thus”) of spatially distant things, an assumption which originates in everyday thought, physical thought in the sense familiar to us would not be possible. Nor does one see how physical laws could be formulated and tested without such a clean separation...
Is Einstein arguing that non-locality makes experimentation impossible or is it non-separability that is his concern? Not being able to individuate systems spatio-temporally as per Einstein's quote appears to have more to do with non-separability rather than non-locality, I think?
 
  • #91


bohm2 said:
Is realism compatible with true randomness?
http://lanl.arxiv.org/pdf/1012.2536.pdf

i concur.
so to have properties, you need objects, cos if you talk about outcomes you need them, there are no properties without objects.
and by the way probabilities are just epistemic.
 
Last edited:
  • #92


bohm2 said:
Now that I look over it again, I'm not sure that the Einstein quote used by Fuchs to question non-locality is apt for that purpose. Consider again:

Is Einstein arguing that non-locality makes experimentation impossible or is it non-separability that is his concern? Not being able to individuate systems spatio-temporally as per Einstein's quote appears to have more to do with non-separability rather than non-locality, I think?

I appreciate the distinction you're trying to draw, but I'm having trouble sorting out what the essential difference would be between (non-)separability and (non-)locality. Let me give it a try: in the case of nonlocality (but separability), things can *be* separate entities, but not *act* like separate entities. But what about the old adage: if it acts, tastes and sounds like X, it is X. Maybe it's more fertile to figure out what the key argument is that Einstein is making, as opposed to what case/object/concept he is applying that argument to. Intuitively, I had understood it as the idea that if an experimenter builds a machine and sets up an experiment, he must be able to do so independently of the state the system is in. (And if that is indeed the core idea, then I suppose it's both applicable to non-separability and non-locality, whatever the difference may be.) What do you think?

EDIT: and if that is the core issue, it would also be in conflict with determinism (?), hence my comment.
 
  • #93


bohm2 said:
Now that I look over it again, I'm not sure that the Einstein quote used by Fuchs to question non-locality is apt for that purpose. Consider again:

Is Einstein arguing that non-locality makes experimentation impossible or is it non-separability that is his concern? Not being able to individuate systems spatio-temporally as per Einstein's quote appears to have more to do with non-separability rather than non-locality, I think?

http://arxiv.org/pdf/quant-ph/0205039v1.pdf

"What relation is there between the “state” (“quantum state”) described by a function ψ and a real deterministic situation (that we call the “real state”)? Does the quantum state characterize completely (1) or only incompletely (2) a real state?
One cannot respond unambiguously to this question, because each measurement represents a real uncontrollable intervention in the system (Heisenberg). The real state is not therefore something that is immediately accessible to experience, and its appreciation always rests hypothetical. (Comparable to the notion of force in classical mechanics, if one doesn’t fix a priori the law of motion.) Therefore suppositions (1) and (2) are, in principle, both possible. A decision in favor of one of them can be taken only after an examination and confrontation of the admissibility of their consequences.
I reject (1) because it obliges us to admit that there is a rigid connection between parts of the system separated from each other in space in an arbitrary way (instantaneous action at a distance, which doesn’t diminish when the distance increases). Here is the demonstration: A system S12, with a function ψ12, which is known, is composed of two systems S1, and S2, which are very far from each other at the instant t. If one makes a “complete” measurement on S1, which can be done in different ways (according to whether one measures, for example, the momenta or the coordinates), depending on the result of the measurement and the function ψ12, one can determine by current quantum-theoretical methods, the function ψ2 of the second system. This function can assume different forms, according to the procedure of measurement applied to S1.
But this is in contradiction with (1) if one excludes action at a distance. Therefore the measurement on S1 has no effect on the real state S2, and therefore assuming (1) no effect on the quantum state of S2 described by ψ2. I am thus forced to pass to the supposition (2) according to which the real state of a system is only described incompletely by the function ψ12.
If one considers the method of the present quantum theory as being in principle definitive, that amounts to renouncing a complete description of real states. One could justify this renunciation if one assumes that there is no law for real states—i.e., that their description would be useless. Otherwise said, that would mean: laws don’t apply to things, but only to what observation teaches us about them. (The laws that relate to the temporal succession of this partial knowledge are however entirely deterministic.)
Now, I can’t accept that. I think that the statistical character of the present theory is simply conditioned by the choice of an incomplete description."
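
To make the "steering" step in this passage concrete: for an entangled two-particle state, the conditional state assigned to S2 depends on which measurement is chosen for S1, while the statistics of S2 considered on its own do not. A minimal numerical sketch (my own illustration, not from the quoted letter; all variable names are made up):

Code:
import numpy as np

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (up + down) / np.sqrt(2)

# Singlet-like state psi12 = (|0>|1> - |1>|0>)/sqrt(2), stored as the 2x2
# coefficient matrix c[i, j] of |i>_1 |j>_2.
psi12 = (np.outer(up, down) - np.outer(down, up)) / np.sqrt(2)

def state_of_S2_given(result_on_S1):
    """State left for S2 when S1 is found in the given basis vector."""
    s2 = result_on_S1.conj() @ psi12      # <b|_1 acting on psi12
    return s2 / np.linalg.norm(s2)

# Measuring S1 in the z basis vs the x basis "steers" S2 into different states:
print(state_of_S2_given(up))      # -> |1>
print(state_of_S2_given(plus))    # -> state proportional to |0> - |1>

# ...yet the unconditional statistics of S2 alone (its reduced density matrix)
# are the same whatever is done to S1:
rho2 = psi12.conj().T @ psi12     # partial trace over S1 for a pure bipartite state
print(rho2)                       # -> identity/2, independent of the choice above

Einstein's point is then that if these different psi2's all correspond to one and the same unchanged real state of S2, the wave function cannot be the complete description of that real state.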
 
  • #94


mr. vodka said:
(And if that is indeed the core idea, then I suppose it's applicable both to non-separability and to non-locality, whatever the difference may be.) What do you think?
I think you're right. It seems that both non-separability and non-locality may have bothered Einstein. I have come across material suggesting that Einstein's argument for the incompleteness of QM was based on both separability and locality principles (although in that quote it's not very clear). The definitions I've come across seem to vary depending on the author. Here's one:
Separability Principle: Spatiotemporally separated systems possess their own separate, individual real physical states, of such a kind that the composite state of a joint system is wholly determined by the separate states of the component systems.

Locality Principle: The real physical state of a system in one region of spacetime cannot be influenced by events in a region of spacetime separated from the first by a spacelike interval. (No action at a distance.)
I'm guessing non-locality would be far more controversial because QM is considered non-separable. I've seen papers that even sub-categorize different degrees of non-locality (weak versus strong). I have no idea what they mean. In one paper the author writes:
Second, concerning the metaphysical implications of quantum non-locality it has been argued that while parameter dependence requires a causal relation (action at-a-distance), outcome dependence is best understood as a non-causal connection (non-separability / holism). Since one cannot take refuge in outcome dependence any more: does that mean that we necessarily have to accept action at-a-distance? If yes, between which variables? Or can the idea of a nonseparability be made intelligible even for parameter dependent theories?
A stronger Bell argument for quantum non-locality
http://philsci-archive.pitt.edu/906...er_Bell_argument_for_quantum_non-locality.pdf
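
For reference, the terminology in that passage goes back to the standard decomposition of Bell's locality condition (due to Jarrett and Shimony). Written out, with A, B the outcomes, a, b the settings on the two wings, and λ the complete state:

$$P(A,B\mid a,b,\lambda)=P(A\mid a,\lambda)\,P(B\mid b,\lambda)\qquad\text{(factorizability)}$$

which splits into two independent conditions:

$$P(A\mid a,b,\lambda)=P(A\mid a,\lambda)\qquad\text{(parameter independence)}$$

$$P(A\mid a,b,B,\lambda)=P(A\mid a,b,\lambda)\qquad\text{(outcome independence)}$$

On the usual reading, deterministic hidden-variable theories such as BM violate parameter independence at the level of λ, while ordinary quantum mechanics violates only outcome independence; that is the distinction the quoted paper is probing.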
 
  • #95


audioloop said:
yet

I agree.

Elegance and Enigma: The Quantum Interviews.
Maximilian Schlosshauer.

Jeffrey Bub: We don’t really understand the notion of a quantum state, in particular an entangled quantum state, and the peculiar role of measurement in taking the description of events from the quantum level, where you have interference and entanglement, to an effectively classical level where you don’t. In a 1935 article responding to the EPR argument, Schrödinger characterized entanglement as “the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought.” I would say that understanding the nonlocality associated with entangled quantum states, and understanding measurement, in a deep sense, are still the most pressing problems in the foundations of quantum mechanics today.

Sheldon Goldstein: I think it would be better, however, to respond to the following question: what have been the most pressing problems in the foundations of quantum mechanics? And to this I suppose the standard answer is the measurement problem, or, more or less equivalently, Schrödinger’s cat paradox.
If one accepts, however, that the usual quantum-mechanical description of the state of a quantum system is indeed the complete description of that system, it seems hard to avoid the conclusion that quantum measurements typically fail to have results.

Daniel Greenberger: For reasons I’ll explain in my answer to the Question (see page 152), I don’t think the measurement problem will be solvable soon, or possibly ever. We will probably have to know more about nature for that.

Lucien Hardy: The most well-known problem in quantum foundations is the measurement problem—our basic conception of reality depends on how we resolve this. The measurement problem is tremendously important.

Anthony Leggett: To my mind, within the boundaries of “foundations of quantum mechanics” strictly defined, there is really only one overarching problem: is quantum mechanics the whole truth about the physical world? That is, will the textbook application of the formalism—including the use of the measurement axiom…

Tim Maudlin: The most pressing problem today is the same as ever it was: to clearly articulate the exact physical content of all proposed “interpretations” of the quantum formalism. This is commonly called the measurement problem.

Lee Smolin: The measurement problem—that is to say, the fact that there are two evolution processes, and which one applies depends on whether a measurement is being made. Related to this is the fact that quantum mechanics does not give us a description of what happens in an individual experiment.

Antony Valentini: The interpretation of quantum mechanics is a wide open question… It would also be good to see further experiments searching for wave-function collapse…

David Wallace: I think anyone’s answer to this is going to depend above all on what they think of the quantum measurement problem. After all, the measurement problem threatens to make quantum mechanics incoherent as a scientific theory—to reduce it, at best, to a collection of algorithms to predict measurement results. So the only reason anyone could have not to put the measurement problem right at the top of the list would be if they think it’s solvable within ordinary quantum mechanics. (Someone who thinks it’s solvable in some modified version of quantum mechanics—in a dynamical-collapse or hidden-variables theory, say—ought to think that the most pressing problem is generalizing that modified version to account for all of quantum phenomena, including the phenomena of relativistic field theory.)
They all more or less agree on what the main unsolved problem of QM is.
But note that those respectable physicists are not chosen randomly. They all have something in common - they are all doing research in quantum foundations.

On the other hand, physicists doing research in quantum applications (rather than foundations), which is actually what most quantum physicists do, typically do not see the measurement problem as a serious problem.

And that gives one of the most frequent answers to the "why BM is considered controversial" question. Most physicists do not see the use of BM, and for them it's often a sufficient reason to consider it "controversial". That's why it is important to stress the existence of the book
https://www.amazon.com/dp/9814316393/?tag=pfamazon01-20
 
Last edited:
  • #96


Demystifier said:
They all more or less agree on what the main unsolved problem of QM is. But note that those respectable physicists are not chosen randomly. They all have something in common - they are all doing research in quantum foundations.

On the other hand, physicists doing research in quantum applications (rather than foundations), which is actually what most quantum physicists do, typically do not see the measurement problem as a serious problem.

And that gives one of the most frequent answers to the "why BM is considered controversial" question. Most physicists do not see the use of BM, and for them it's often a sufficient reason to consider it "controversial". That's why it is important to stress the existence of the book
https://www.amazon.com/dp/9814316393/?tag=pfamazon01-20

Abso-friggen-lutely.

I have often seen it mentioned that BM has much more traction, as the preferred or one of the preferred interpretations, among philosophers and those working on foundations.

Thanks
Bill
 
  • #97


bhobba said:
Abso-friggen-lutely.

I have often seen it mentioned that BM has much more traction, as the preferred or one of the preferred interpretations, among philosophers and those working on foundations.
But have you noticed that the book I mentioned shows that BM is actually USEFUL for practical (not merely philosophical or foundational) physical problems?
 
  • #98


Demystifier said:
But have you noticed that the book I mentioned shows that BM is actually USEFUL for practical (not merely philosophical or foundational) physical problems?

Well not having read it I don't have any first hand experience but I take your word for it. And yes it is interesting.

Thanks
Bill
 
  • #99


For any further discussions on whether BM is controversial, I think it is important to distinguish four different types of controversy existing in the physics community:

1. Is BM self-consistent?

2. Is BM consistent with observations?

3. Is BM useful?

4. Is BM simple/beautiful/natural enough?

Practical physicists usually do not have complaints about 1, 2, or 4, but they often argue that BM is not useful. Since most physicists are practical, the controversy of type 3 can be considered the most prevalent. Yet physicists who find BM not useful are usually silent about it and simply ignore BM without spelling their reasons out. As a consequence, the type 3 controversy often looks much less prevalent than it really is.

Ironically, despite being the most prevalent, the type 3 controversy is the one most clearly unjustified. Namely, the book I mentioned definitely demonstrates that, at least in some cases, BM is useful. It is certainly less useful than some more standard techniques for solving QM problems, but the controversy concerns whether it is useful at all. And it definitely is.

The type 2 controversy seems most prevalent in public discussions. But whenever someone argues that BM is not consistent with the predictions of standard QM (and thus with observations), it always turns out that they do not understand the general proof that the measurable predictions of BM always agree with those of standard QM. It is like searching for a perpetuum mobile without understanding the general theorem of energy conservation.
Thus, the type 2 controversy is unjustified as well.
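
To illustrate what that general agreement looks like in the simplest case, here is a minimal numerical sketch (my own, not from the book; the parameter values are arbitrary): for a free Gaussian packet, particles guided by the Bohmian velocity field v = Im(∂ₓψ/ψ) (with ħ = m = 1) and started with the |ψ|² distribution remain |ψ|²-distributed as the packet spreads (equivariance).

Code:
import numpy as np

# Equivariance check for a free Gaussian packet (hbar = m = 1).
sigma0 = 1.0               # initial packet width (illustrative value)
T, dt = 4.0, 0.002         # total time and Euler step
N = 5000                   # number of Bohmian trajectories

def psi(x, t):
    """Analytic free-particle Gaussian wave packet."""
    a = 1.0 + 1j * t / (2.0 * sigma0**2)
    return (2.0 * np.pi * sigma0**2) ** -0.25 / np.sqrt(a) \
           * np.exp(-x**2 / (4.0 * sigma0**2 * a))

def velocity(x, t, eps=1e-5):
    """Guidance equation v = Im(d_x psi / psi), derivative by central difference."""
    dpsi = (psi(x + eps, t) - psi(x - eps, t)) / (2.0 * eps)
    return np.imag(dpsi / psi(x, t))

rng = np.random.default_rng(0)
x = rng.normal(0.0, sigma0, N)      # initial positions drawn from |psi(x,0)|^2

t = 0.0
while t < T:                        # integrate dx/dt = v(x, t)
    x = x + velocity(x, t) * dt
    t += dt

sigma_T = sigma0 * np.sqrt(1.0 + (T / (2.0 * sigma0**2))**2)   # width of |psi(x,T)|^2
print("spread of the trajectories :", x.std())
print("spread predicted by |psi|^2:", sigma_T)

The two printed numbers agree up to sampling and step-size error; the general theorem covers arbitrary wave functions and measurement situations, which is why hunting for a statistical discrepancy is like hunting for a perpetuum mobile.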

Objections of type 1 do not seem to exist in the physics community. It seems that more or less all physicists agree that BM at least does not have internal inconsistencies.

What remains is the type 4 controversy. While BM certainly has some advantages over other interpretations concerning simplicity, beauty and naturalness, it also has some disadvantages of that kind. What people disagree on is whether the advantages outweigh the disadvantages. And frankly, there is no simple and objective way to decide who is right. Therefore, the type 4 controversy is the only type of controversy which is really justified.
 
  • #100


Demystifier said:
For any further discussions on whether BM is controversial, I think it is important to distinguish four different types of controversy existing in the physics community:

1. Is BM self-consistent?

2. Is BM consistent with observations?

3. Is BM useful?

4. Is BM simple/beautiful/natural enough?

Practical physicists usually do not have complaints about 1, 2, or 4, but they often argue that BM is not useful. Since most physicists are practical, the controversy of type 3 can be considered the most prevalent. Yet physicists who find BM not useful are usually silent about it and simply ignore BM without spelling their reasons out. As a consequence, the type 3 controversy often looks much less prevalent than it really is.

Ironically, despite being the most prevalent, the type 3 controversy is the one most clearly unjustified. Namely, the book I mentioned definitely demonstrates that, at least in some cases, BM is useful. It is certainly less useful than some more standard techniques for solving QM problems, but the controversy concerns whether it is useful at all. And it definitely is.

The type 2 controversy seems most prevalent in public discussions. But whenever someone argues that BM is not consistent with the predictions of standard QM (and thus with observations), it always turns out that they do not understand the general proof that the measurable predictions of BM always agree with those of standard QM. It is like searching for a perpetuum mobile without understanding the general theorem of energy conservation.
Thus, the type 2 controversy is unjustified as well.

Objections of type 1 do not seem to exist in the physics community. It seems that more or less all physicists agree that BM at least does not have internal inconsistencies.

What remains is the type 4 controversy. While BM certainly has some advantages over other interpretations concerning simplicity, beauty and naturalness, it also has some disadvantages of that kind. What people disagree on is whether the advantages outweigh the disadvantages. And frankly, there is no simple and objective way to decide who is right. Therefore, the type 4 controversy is the only type of controversy which is really justified.

What about the Seevinck criterion and Bohmian mechanics?
 