# A Smolin: Realistic and anti-realistic interpretations of QM

#### PeterDonis

Mentor
> you or vanhees can easily change my mind by presenting a concrete counter-example - just summarily saying this quite standard classification scheme is incomplete is unfair.
You already gave a counterexample: dynamical modification. In other words, not believing that QM as it currently exists is a final theory. (The specific example you give is just one particular case of this.) The classification scheme you describe assumes that it is. Calling the contrary belief a "pseudo-exception" seems just as unfair to me as summarily saying that the classification scheme you describe is incomplete seems to you.

#### charters

> You already gave a counterexample: dynamical modification. In other words, not believing that QM as it currently exists is a final theory. (The specific example you give is just one particular case of this.) The classification scheme you describe assumes that it is. Calling the contrary belief a "pseudo-exception" seems just as unfair to me as summarily saying that the classification scheme you describe is incomplete seems to you.
Dynamical modification exists in order to solve the measurement problem in textbook quantum theory. It is not an option for someone whose stance is to deny the significance of the measurement problem in textbook QT, which is where we began back in #20 and #22. And I've seen no indication in this thread that anyone here actually wants to advocate something like this, which requires replacing the Schrödinger equation with something non-unitary.

So, I think this is a misrepresentation of vanhees position, and these approaches are irrelevant here. But if it somehow is what he meant, then it is a concession that my only point (the measurement problem is real) has been correct all along.

#### PeterDonis

Mentor
> Dynamical modification exists in order to solve the measurement problem in textbook quantum theory.
Only in the trivial sense that, if you have a different theory, it obviously doesn't have to share whatever problem you think you see in textbook QM.

> It is not an option for someone whose stance is to deny the significance of the measurement problem in textbook QT
Why not? Doesn't saying "QT is not a fundamental theory, so there's no point in even worrying about any measurement problem it might have" (which is basically what I read @vanhees71 as saying) count as denying the significance of the measurement problem in textbook QT?

#### charters

> Why not? Doesn't saying "QT is not a fundamental theory, so there's no point in even worrying about any measurement problem it might have" (which is basically what I read @vanhees71 as saying) count as denying the significance of the measurement problem in textbook QT?
Sure, you can say you don't *care* about the issue because you think QT will be replaced (though in reality there is no good reason to believe this as a serious possibility given the nature of quantum gravity research, where no popular and closely studied approach tries to replace QT as the overarching framework).

But that's not what vanhees is saying. In #22: "There's no contradiction in the sense of logic nor in the empirical evidence for this probabilistic interpretation of the formalism." They clearly think QT *as is* has no logical/conceptual inconsistencies, in particular not the inconsistency as set up in the Wallace quote I began with.

The argument has been QT is fine, not that we shouldn't care whether or not it is fine.

#### PeterDonis

Mentor
> that's not what vanhees is saying. In #22: "There's no contradiction in the sense of logic nor in the empirical evidence for this probabilistic interpretation of the formalism." They clearly think QT *as is* has no logical/conceptual inconsistencies, in particular not the inconsistency as set up in the Wallace quote I began with.
The Wallace quote you began with talks about a problem with QT if it is a fundamental theory. It says so right there in the quote. So his argument obviously doesn't apply to any interpretation of QT that does not treat it as a fundamental theory. I read @vanhees71 as saying that if textbook QT is treated as an effective theory that makes correct predictions in its domain but nothing more, then there are no logical/conceptual inconsistencies.

#### charters

> The Wallace quote you began with talks about a problem with QT if it is a fundamental theory. It says so right there in the quote. So his argument obviously doesn't apply to any interpretation of QT that does not treat it as a fundamental theory. I read @vanhees71 as saying that if textbook QT is treated as an effective theory that makes correct predictions in its domain but nothing more, then there are no logical/conceptual inconsistencies.
Well, I don't read the thread that way, and you're also misreading Wallace, who certainly does not think the measurement problem is only relevant to strictly fundamental theories. You can look into his many papers and find this out for yourself.

But at this point I'm tired of this. So all I will say is, if your presentation of vanhees's argument is correct, it's quite a bit of wishful thinking to assume or expect quantum theory writ large will be replaced by anything, let alone by something that happens to magically make the measurement problem moot.

#### PeterDonis

Mentor
> you're also misreading Wallace, who certainly does not think the measurement problem is only relevant to strictly fundamental theories
If I am, it's certainly not evident from what you quoted. When I have time I'll take a look at his papers to get a more complete view of what he is saying.

#### PeterDonis

Mentor
> it's quite a bit of wishful thinking to assume or expect quantum theory writ large will be replaced by anything, let alone by something that happens to magically make the measurement problem moot.
It seems to me to be wishful thinking to assume that a framework for thinking about quantum theory, the one that points up the measurement problem as being the fundamental issue, will suddenly turn out to solve the problem after making no progress on it for many decades.

To put it another way, I would describe the fundamental problem not as "the measurement problem" but as "the quantum foundations problem"--is QM a fundamental theory or not? If it is, then nobody knows how to make it a consistent fundamental theory. If it isn't, then nobody knows what could possibly replace it. Talking about "the measurement problem" basically means you've chosen the first path--QM is a fundamental theory, the problem is how to make it a consistent one. But that problem doesn't even show up on the radar if you choose the second path--QM is not a fundamental theory, the problem is what to replace it with.

#### charters

> To put it another way, I would describe the fundamental problem not as "the measurement problem" but as "the quantum foundations problem"--is QM a fundamental theory or not? If it is, then nobody knows how to make it a consistent fundamental theory
This is a topic for another thread, but I think at least one philosophically consistent/acceptable version of QT exists for fundamental, non-fundamental, and non-physics (i.e., quantum information) applications of QT, so the situation is less bleak than you suggest. However, this comes through reckoning with the measurement problem and biting some bullets, not wishing the problem away and hoping the future will offer some return to classicality. Even if I'm wrong, and the situation is totally bleak, it's still unfair to equate kicking the can with actively trying to do our best with the theory we have right now. It's more honest and responsible to keep trying to make sense of QT, given the highly likely case that it doesn't get supplanted. So I don't see these two approaches you outline as equally meritorious.

#### PeterDonis

Mentor
> This is a topic for another thread, but I think at least one philosophically consistent/acceptable version of QT exists for fundamental, non-fundamental, and non-physics (i.e., quantum information) applications of QT
Yes, this would be a topic for another thread.

#### vanhees71

Gold Member
> This post demonstrates a clear misunderstanding of what it means to have an ontology: having an ontology means having an actual existence and being ontic simply means actually existing.
>
> Example: Unicorns (one-horned horses) don't have an ontology (or aren't ontic) in the science of biology.
>
> Similarly, any state that actually exists in any literal sense is by definition an ontic state.
>
> If you don't accept this explanation, you are implicitly committing to option 2 from post #63.
Quantum theory provides clear ontics. The actual existence of, e.g., elementary particles is not called into question by the probabilistic description of QT. An electron actually exists in the description of relativistic QFT and is described by a quantum field. It doesn't exist as a "point particle" as visualized by classical physics, of course, but that's because science has progressed far beyond a naive picture based on our experience with macroscopic objects, which behave, under the circumstances of everyday life, to an excellent approximation as described by classical physics.

#### vanhees71

Gold Member
> Then he should have simply continued to agree to this instrumentalist position we were focusing on in the #40s, instead of rejecting it in the #50s (this is the more complete context). The overall categories I gave are exhaustive, and I accept what you suggest here is a logically possible option - in these later posts I am merely assuming some winnowing has taken place. The problem we've had is that as soon as I try to get a firm, particular commitment to move us along, they backtrack on the commitment in order to reject the implications of the commitment that would lead to the measurement problem.
>
> I think this is just not an effective medium/format for the Socratic approach I was trying, with the strictly linear comment thread.
Well, you must allow me to have my point of view. It doesn't need to fit into one of your isms. Philosophy is utterly inappropriate for shedding light on the modern findings of the natural (and also structural) sciences. I'll take @samalkhaiat's advice not to participate in such fruitless discussions anymore. It's useless.

#### vanhees71

Gold Member
> Exactly. And the uncertainty principle is the reason why probability enters QM, full stop. Why do we need to participate in these kinds of discussions? Let me just tell you that if this Smolin guy gave that talk in London or Oxford/Cambridge, he would be, after 15 minutes, talking to an empty lecture theatre in Oxford/Cambridge or get booed in London.
Well, I think you are right. I shouldn't waste my time anymore to discuss philosophical issues in this forum. It's kind of fighting against religious beliefs rather than having a constructive scientific discussion.

#### Auto-Didact

> I'll take @samalkhaiat's advice not to participate in such fruitless discussions anymore. It's useless.

You might enjoy pretending that the practice of physics, and science at large, is free from the kind of foundational disagreement we have in the foundations of QT; nothing could be further from the truth.

The best-known case in the history of physics where problems and paradoxes in a theory led to as much confusion as they do in QM foundations today was d’Alembert’s paradox, in 18th- and 19th-century fluid mechanics; in fact, this problem can be restated as a problem of the interpretation of the ontological versus epistemological status of a central object in the theory, namely boundary layers - exactly like the problem with $\psi$ in QM foundations.

It is therefore nothing short of a tragedy that this tale isn't universally known among physicists; a brief retelling is quoted here from (Bush, 2015):

John Bush said:
> And lest the longevity of the quantum paradoxes be mistaken for their insurmountability, fluid mechanics has a cautionary tale to tell. In 1749, d’Alembert’s paradox indicated that an object moving through an inviscid fluid experiences no drag, a prediction that was clearly at odds with experiments on high–Reynolds number gas flows. The result was a longstanding rift between experimentalists and theorists: For much of the nineteenth century, the former worked on phenomena that could not be explained, and the latter on those that could not be observed (Lighthill 1956). D’Alembert’s paradox stood for over 150 years, until Prandtl’s developments (Anderson 2005) allowed for the resolution of the dynamics on the hitherto hidden scale of the viscous boundary layer.

#### Auto-Didact

> Quantum theory provides clear ontics.
You are outright disagreeing with all the experts in the world on this matter, but keep telling yourself that if it makes you sleep better at night.
> The actual existence of, e.g., elementary particles is not called into question by the probabilistic description of QT. An electron actually exists in the description of relativistic QFT and is described by a quantum field. It doesn't exist as a "point particle" as visualized by classical physics, of course, but that's because science has progressed far beyond a naive picture based on our experience with macroscopic objects, which behave, under the circumstances of everyday life, to an excellent approximation as described by classical physics.
This isn't in question at all, and I don't see the need to keep bringing it up. Maybe it would be instructive to note that 'practical physics', purely focused on applications (i.e. physics as an extension of engineering), is pretty much the opposite of 'foundational physics', wherein everything that has been swept under the rug is exposed so that it can be fixed. See post #89 above.
> Why do we need to participate in these kinds of discussions? Let me just tell you that if this Smolin guy gave that talk in London or Oxford/Cambridge, he would be, after 15 minutes, talking to an empty lecture theatre in Oxford/Cambridge or get booed in London.
Again, just because you don't find fundamental physics important doesn't mean it isn't important.

Smolin actually devotes an entire chapter to discussing what the experts over at Oxford (Deutsch, Greaves, Myrvold, Saunders, Wallace et al.) think about the matter; he refers to their collective stance as critical realism, or more specifically the Oxford interpretation. Oxfordians - like Copenhagenists before them - share the same core belief but disagree, to differing degrees, on specific points.

Simply stated, Oxfordians believe that decoherence, an irreversible statistical process, is completely sufficient to solve the measurement problem. Smolin - like Bell, Shimony, Penrose and @A. Neumaier before him - keenly demonstrates that this argument is actually insufficient because it introduces observers into the foundations of the theory.

The problem with decoherence as a solution to the measurement problem is that if unitary evolution is fundamental to QT, then complete decoherence is impossible, because decohered states will recohere if we wait long enough, by the Poincaré recurrence theorem; this is the same recurrence argument which implies that entropy must eventually return close to its initial value.

Now if we are only interested in times shorter than it takes to recohere - that is if we are only interested in an approximate description of measurements for all practical purposes (FAPP) - then decoherence works, but as a matter of principle - i.e. as a question of foundational and mathematical physics - decoherence outright fails as a complete explanation.
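The decohere-then-recohere behavior can be checked numerically in a toy dephasing model (a sketch under simplifying assumptions, not anyone's specific proposal: one system qubit coupled to a handful of environment qubits through commuting σz⊗σz interactions, with integer couplings chosen so the recurrence time is exactly t = π; the sizes and couplings here are purely illustrative):

```python
import numpy as np

N = 4                       # environment size (illustrative toy value)
g = np.arange(1, N + 1)     # integer couplings -> exact recurrence at t = pi

dim = 2 ** (N + 1)          # one system qubit + N environment qubits
psi0 = np.ones(dim) / np.sqrt(dim)   # every qubit starts in (|0> + |1>)/sqrt(2)

idx = np.arange(dim)

def sz(bit):
    """sigma_z eigenvalue (+1 or -1) of a given qubit for each basis index."""
    return 1 - 2 * ((idx >> bit) & 1)

# H = sum_k g_k (sigma_z of system)(sigma_z of k-th env qubit) is diagonal
# in the computational basis, so evolution is just a phase per basis state.
energies = sum(g[k] * sz(N) * sz(k) for k in range(N))

def coherence(t):
    """|<0|rho_S|1>|: off-diagonal of the system's reduced density matrix at time t."""
    psi = np.exp(-1j * energies * t) * psi0
    m = psi.reshape(2, 2 ** N)       # rows: system qubit, columns: environment
    rho_S = m @ m.conj().T           # partial trace over the environment
    return abs(rho_S[0, 1])

print(coherence(0.0))        # ≈ 0.5: fully coherent
print(coherence(np.pi / 4))  # ≈ 0: "decohered"
print(coherence(np.pi))      # ≈ 0.5: full recoherence (a Poincaré-style revival)
```

With commensurate couplings the coherence vanishes at intermediate times and fully revives at t = π; for a realistic environment with many incommensurate couplings the revival time grows astronomically, which is precisely the FAPP point at issue.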

#### stevendaryl

Staff Emeritus
> To put it another way, I would describe the fundamental problem not as "the measurement problem" but as "the quantum foundations problem"--is QM a fundamental theory or not? If it is, then nobody knows how to make it a consistent fundamental theory. If it isn't, then nobody knows what could possibly replace it.
That's a very good way to summarize the situation.

#### stevendaryl

Staff Emeritus
> The problem with decoherence as a solution to the measurement problem is that if unitary evolution is fundamental to QT, then complete decoherence is impossible, because decohered states will recohere if we wait long enough, by the Poincaré recurrence theorem; this is the same recurrence argument which implies that entropy must eventually return close to its initial value.
>
> Now if we are only interested in times shorter than it takes to recohere - that is if we are only interested in an approximate description of measurements for all practical purposes (FAPP) - then decoherence works, but as a matter of principle - i.e. as a question of foundational and mathematical physics - decoherence outright fails as a complete explanation.
To me, there is possibly another problem with decoherence, and that is that, as I understand it, decoherence involves splitting the universe into three parts:
1. The system of interest, which might be a single electron
2. The measuring device
3. Everything else (the "environment")
After making such a split, you can trace out the environmental degrees of freedom, and what you find for the reduced density matrix is that it rapidly evolves into a mixed state. That mixed state can be interpreted as the situation: The measuring device nondeterministically goes into a definite "pointer" state, with probabilities given by the Born rule. So decoherence seems to give the same result as a "measurement collapses the wave function" interpretation without introducing a separate collapse event.

However, it seems subjective to me to split the world into the three parts that way. And it seems inconsistent to interpret a state that you know is an improper mixed state (due to tracing out environmental degrees of freedom) as if it were a proper mixed state (due to ignorance of the actual state).
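The improper-mixture point can be made concrete in a minimal two-qubit sketch (an illustration only: qubit A plays the "system", qubit B stands in for everything traced out):

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): qubit A = "system", qubit B = "environment"
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

rho_full = np.outer(psi, psi.conj())      # global density matrix (a pure state)

m = psi.reshape(2, 2)                     # rows: qubit A, columns: qubit B
rho_A = m @ m.conj().T                    # partial trace over qubit B

purity_full = np.trace(rho_full @ rho_full).real   # 1.0 -> globally pure
purity_A = np.trace(rho_A @ rho_A).real            # 0.5 -> locally maximally mixed

print(rho_A)         # diag(0.5, 0.5): locally indistinguishable from a 50/50
print(purity_full)   # ignorance mixture, yet the global state is still pure
```

The reduced state diag(0.5, 0.5) is exactly what a proper ignorance mixture would look like to any local measurement, yet the global state remains pure; that gap is what makes the reduced state an *improper* mixture.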

#### Auto-Didact

> To me, there is possibly another problem with decoherence, and that is that, as I understand it, decoherence involves splitting the universe into three parts:
> 1. The system of interest, which might be a single electron
> 2. The measuring device
> 3. Everything else (the "environment")
>
> After making such a split, you can trace out the environmental degrees of freedom, and what you find for the reduced density matrix is that it rapidly evolves into a mixed state. That mixed state can be interpreted as the situation: The measuring device nondeterministically goes into a definite "pointer" state, with probabilities given by the Born rule. So decoherence seems to give the same result as a "measurement collapses the wave function" interpretation without introducing a separate collapse event.
>
> However, it seems subjective to me to split the world into the three parts that way. And it seems inconsistent to interpret a state that you know is an improper mixed state (due to tracing out environmental degrees of freedom) as if it were a proper mixed state (due to ignorance of the actual state).
Agreed. Incidentally, that is more or less the same argument Penrose made 20 years ago against decoherence in The Road To Reality.

> The problem with decoherence as a solution to the measurement problem is that if unitary evolution is fundamental to QT, then complete decoherence is impossible, because decohered states will recohere if we wait long enough, by the Poincaré recurrence theorem; this is the same recurrence argument which implies that entropy must eventually return close to its initial value.
>
> Now if we are only interested in times shorter than it takes to recohere - that is if we are only interested in an approximate description of measurements for all practical purposes (FAPP) - then decoherence works, but as a matter of principle - i.e. as a question of foundational and mathematical physics - decoherence outright fails as a complete explanation.
Why, what's wrong with recoherence as part of the complete story? Presumably for a system like our universe the recoherence time could be far enough in the future that all structure would have long ago been lost due to heat death.

#### Auto-Didact

> Why, what's wrong with recoherence as part of the complete story?
If decoherence solves the measurement problem per the Born rule, then it should be effectively completely irreversible; the fact is that decoherence is always incomplete and therefore, per Poincaré recurrence, reversible. Ergo, it cannot solve the measurement problem.

In orthodox QM, the act of measurement is de facto irreversible. Upon measurement, unitary evolution restarts with effectively 'new initial conditions'; this process is not reversible even if unitary evolution itself up to that point is reversible. In other words, decoherence is patently insufficient to solve the measurement problem.

#### A. Neumaier

> Well, you must allow me to have my point of view. It doesn't need to fit into one of your isms. Philosophy is utterly inappropriate for shedding light on the modern findings of the natural (and also structural) sciences. I'll take @samalkhaiat's advice not to participate in such fruitless discussions anymore. It's useless.
It's fruitless because you insist on your own rules. You preserve a sort of effective consistency by the same sort of vagueness of language that you criticize in Bohr's writing. You adhere to a conceptual uncertainty principle, with no attempt to match your concepts to those used by the community that discusses foundational questions in a more precise way. This makes foundational discussions with you frustrating for all participants.

#### vanhees71

Gold Member
It's also very frustrating for me, because I never get comprehensible explanations from the philosophical side. And it's not "my rules", but the way modern Q(F)T has been successfully applied for nearly 100 years now, including the most modern applications in quantum optics and quantum information physics, which are as close to these fundamental topics as it gets, as far as the physics is concerned.

I think it's just impossible to discuss the issue strictly staying in the realm of physics without distorting the subject by philosophical arguments which are completely irrelevant for the scientific side of the matter, which could be interesting, but I give up these discussions from now on.

#### A. Neumaier

> it's not "my rules"
Everyone but you (even Ballentine) allows for collapse, and everyone but you (even Peres) acknowledges that there is a problem applying the statistical interpretation of QM to large systems such as the solar system, whose preparation cannot be replicated multiple times.
> I give up these discussions from now on.
Yes, it is fruitless.

> If decoherence solves the measurement problem per the Born rule, then it should be effectively completely irreversible; the fact is that decoherence is always incomplete and therefore, per Poincaré recurrence, reversible. Ergo, it cannot solve the measurement problem.
Why do you define the "measurement problem" this way? I think most people consider it to be something different; it just needs to describe our universe. What unitary QM predicts, at least granting the other assumptions of Poincaré recurrence (such as finite dimensionality), is that recoherence will eventually happen. You seem to be ruling that out a priori.

#### A. Neumaier

> What unitary QM predicts, at least granting the other assumptions of Poincaré recurrence (such as finite dimensionality), is that recoherence will eventually happen.
Can you point to a theorem proving a recurrence theorem in the quantum case?

Poincaré recurrence is for finite-dimensional bounded dynamical systems only. Already a single hydrogen atom violates both assumptions, let alone the universe.
