# A Smolin: Realistic and anti-realistic interpretations of QM

#### Auto-Didact

Summary
Lee Smolin has a new book out called "Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum". It is a book on the foundations of QM. For a brief review of the first half see: https://www.physicsforums.com/threads/what-are-you-reading-now-stem-only.912884/post-6176357

SPOILER WARNING

What I believe we can directly take away from this book with regard to discussions on QM foundations is his classification of interpretations of QM.
SPOILER WARNING: If you don't want to be spoiled about the book, stop reading now!

I just finished the new book. First I would recommend the book to anyone who reads or takes part in discussions on QM foundations. Briefly put, Smolin offers a simple classification of almost all interpretations of QM and their pros and cons with respect to a more fundamental theory than QM or QFT. The classification he gives for the interpretation of QM is realist vs anti-realist, where realism is the view of reality which all scientific theories of physics (except for QM) adhere to, and where anti-realism is essentially the instrumentalist view of QM and science at large as propagated by Bohr and Heisenberg. Suffice to say, the standard textbook operationalist view of QM is also anti-realist.

Realism, on the other hand, branches off into a few more specific views. The most important of these are what Smolin calls naive realism, magical realism and critical realism. Each of these branches consists of a group of theories which are fundamentally conceptually similar to each other, i.e. they have the same strengths and weaknesses.

Naive realism gives several options about what is real: either both particles and waves are real (pilot wave theory), only waves are real (collapse models) or only particles are real (Nelson's stochastic mechanics).

Magical realism has a few exemplifying interpretations, most importantly Everett's Many Worlds Interpretation. This interpretation is less predictive than QM because it literally predicts that everything happens, based on deterministic unitary evolution alone, and therefore has a problem introducing probabilities.

Critical realism again contains a few exemplifying options, most notably the Oxford interpretation, which can best be summarized as 'decoherence solves the issues with Everett's MWI.' In general, the attempted solutions to the problems of the MWI do not seem to work, for several reasons.

Smolin in the book reviews all of the above interpretations, impartially gives both their merits and drawbacks, and more importantly explains what each of them teaches us about physics and what a successful realist completion of QM would need to be capable of achieving in order to become the theory to dethrone QM, i.e. simultaneously reproducing each of the successes of all of the interpretations while avoiding all of their problems.

Smolin then gives an outline of how to achieve such a project, based on a first-principles approach similar to Lucien Hardy's approach to foundational physics. He names and describes a set of principles which a realist completion of QM, and therefore a theory beyond QM, needs to adhere to. From memory, I think this approach is based largely on Einstein's philosophy of physics, which clearly distinguishes between constructive theories and principle theories.

The rest of the book illustrates a few specific implementations of research done so far which actually complete QM and go beyond it, i.e. which have actually reproduced all the successes of many of the interpretations so far. These projects are each very impressive in their own right, but - as Smolin describes - while they may reproduce some or all of the successes, they do not necessarily avoid all the problems.

Naturally, anyone who wants clarification should read the book; I'd recommend it to anyone interested in QM, regardless of their level of expertise. Again, any further discussion in this thread will necessarily go into more depth than this post and spoil the book even more.


#### Aidyan

I'm wondering whether, among these realist and anti-realist options, Berkeley's idealism (or something like it) has ever been considered?

#### vanhees71

Gold Member
Well, I'd consider myself a realist but for me what's usually called "realist" in the context of quantum interpretations is the opposite of what I consider realistic.

Realistic for me is what can be objectively observed and even (more or less precisely) quantified. As the discovery of QT has taught us, that's very different from what's usually considered a "realistic interpretation". What these people call a "realistic interpretation" (or more precisely an "ontic interpretation") is the opposite of what's really observed, namely that everything works as in the "non-realistic" (minimal) standard interpretation: with a complete determination of the quantum state, i.e., the preparation of a system in a pure quantum state, represented by a statistical operator of the form $\hat{\rho}(o_1,\ldots,o_n)=|o_1,\ldots,o_n \rangle \langle o_1,\ldots,o_n|$, only some observables take determined values, particularly the chosen set of observables $O_1,\ldots,O_n$ that form a complete set of compatible observables, i.e., that uniquely define the one-dimensional common eigenspace onto which $\hat{\rho}$ projects. Here $|o_1,\ldots,o_n \rangle$ is a normalized common eigenvector of the self-adjoint operators $\hat{O}_1,\ldots,\hat{O}_n$ representing the observables.

Then of course any observable $O'$ that is not compatible with the set $O_1,\ldots,O_n$ of observables (almost always) doesn't take a determined value, but the state preparation implies (and only implies!) the probability to find a certain value when measuring $O'$.
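As a concrete illustration of this point (my own sketch, not from the thread or the book): for a spin-1/2 system prepared in the $\hat{\sigma}_z$ eigenstate, the compatible observable $\hat{\sigma}_z$ takes a determined value, while the incompatible $\hat{\sigma}_x$ is only assigned Born-rule probabilities. A minimal NumPy check, assuming the standard Pauli matrices:

```python
import numpy as np

# Two incompatible observables: the Pauli matrices sz and sx do not commute.
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

# Prepare a pure state: the sz-eigenvector with eigenvalue +1.
# rho = |z+><z+| is the statistical operator of this preparation.
zp = np.array([1, 0], dtype=complex)
rho = np.outer(zp, zp.conj())

def born_probabilities(rho, observable):
    """Born rule: p(a) = <a| rho |a> for each eigenvector |a> of the observable."""
    eigvals, eigvecs = np.linalg.eigh(observable)
    probs = [float(np.real(v.conj() @ rho @ v)) for v in eigvecs.T]
    return eigvals, probs

# Compatible observable sz: its value is determined (probability 1).
vals_z, probs_z = born_probabilities(rho, sz)
# Incompatible observable sx: only probabilities (1/2 each) are implied.
vals_x, probs_x = born_probabilities(rho, sx)
```

Here `probs_z` comes out as `[0, 1]` (the +1 outcome is certain), while `probs_x` is `[0.5, 0.5]`: the preparation implies, and only implies, probabilities for the incompatible observable.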

This for me is what's realistic, because that's what's really observed in the lab. The very confusing language of philosophers, and also of some physicists, however labels this empirically very well established observation "un-realistic".

For me, there are no problems with this minimal interpretation of quantum theory and no necessity for any adjustment to it (let alone QT in its minimal form itself), as long as there's no clear empirical contradiction with its predictions.

Concerning the so-called "wave-particle dualism", that seems to me a horse as dead as it can be. It was resolved as non-existent already in 1926, when Born came up with the probabilistic interpretation of the quantum formalism. There's simply no wave-particle dualism, only the (in my opinion also very well established) fact that there exist neither classical particles nor classical waves, but just a probabilistic description of nature in terms of quantum (field) theory.

The phenomenon that we observe many things around us as behaving like classical particles (in fact macroscopic extended bodies consisting of very many particles) and some like classical fields (mostly electromagnetic waves in form of light) is well understood from many-body quantum (field) theory as well.

#### Auto-Didact

Well, I'd consider myself a realist but for me what's usually called "realist" in the context of quantum interpretations is the opposite of what I consider realistic.
Personal interpretations are a large part of the problem in discussing foundational issues: definitions simply aren't up for grabs. There is a vast literature on the subject.
Realistic for me is what can be objectively observed and even (more or less precisely) quantified
That is a personal philosophy of science which only works in a pragmatic sense; it also has a name, and Smolin describes it carefully: equating being observable with being real is a philosophy called 'instrumentalism' within the literature. In the most general context, i.e. logically, what can be observed need not be what is real; in the literature, that which can be real is called a 'beable'.

Instrumentalism is a strategy which has short-term benefits, e.g. making experimental analysis, engineering and technological advancement possible, while long-term it tends to be purely counterproductive for the science in question itself, in this case physics.
For me, there are no problems with this minimal interpretation of quantum theory and no necessity for any adjustment to it (let alone QT in its minimal form itself), as long as there's no clear empirical contradiction with its predictions.
The problem is again not a personal one nor an applied one, but an academic one; there is a well known problem called the measurement problem of QM. This is arguably the biggest and most longstanding problem in modern physics.

More specifically this problem is a fundamental problem in mathematical theoretical physics, because it introduces purely logical consistency problems in giving a completely mathematically self-consistent definition of what a physical theory is itself.

The measurement problem of QM resides not within the predictions of the theory of QM/QFT but within the change of the mathematical character of physical theory itself, i.e. essentially if QT is the final fundamental theory then the inherent properties of the laws of physics do not adhere to the mathematical structure of differential equations and inherently related concepts anymore.

In other words, the very existence of the Born Rule isn't necessarily problematic. Instead, what is problematic is that in addition to differential equations there are other mathematical objects, objects which are purely mathematically fundamentally inconsistent with differential equations, which lay claim to being essential to the properties of what a law of physics is. This inconsistency marks the embodiment of what the measurement problem entails.
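To make concrete what is meant by 'other mathematical objects' alongside the differential equation (my own illustrative sketch, assuming NumPy and a toy two-level Hamiltonian): unitary Schrödinger evolution is a linear, deterministic map, while the textbook measurement update is a stochastic, discontinuous one that is not the solution of any differential equation.

```python
import numpy as np

# Two mathematically different rules acting on the same state vector.

# 1) Unitary (Schrodinger) evolution: linear, deterministic, reversible.
#    psi(t) = U psi(0), with U = exp(-i H t) (hbar = 1).
H = np.array([[0.0, 1.0], [1.0, 0.0]])   # toy two-level Hamiltonian
t = 0.3
eigvals, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * eigvals * t)) @ V.conj().T

psi0 = np.array([1.0, 0.0], dtype=complex)
psi_t = U @ psi0                          # deterministic, continuous in t

# 2) Measurement update (projection postulate): nonlinear, stochastic.
#    Outcome k occurs with Born probability |psi_k|^2, after which the
#    state jumps to the basis vector |k> -- a discontinuous map that is
#    not generated by any Schrodinger-type differential equation.
rng = np.random.default_rng(0)
probs = np.abs(psi_t) ** 2                # measure in the computational basis
outcome = rng.choice(len(probs), p=probs)
psi_after = np.zeros_like(psi_t)
psi_after[outcome] = 1.0                  # "collapse" to the observed outcome
```

The coexistence of rule (1) and rule (2) as equally fundamental, mutually irreducible parts of the theory is one standard way of stating the measurement problem.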

Accepting this inconsistency as 'non-problematic' - as most physicists have done since Bohr and Heisenberg - is a cavalier decision which marks the very degradation of the historical endeavour of physics from a modern science into a distinctly postmodern science; most physicists which make this decision don't realize that this is what they are doing.

Let me spell this acceptance out explicitly: the central objects of fundamental physical theory - namely the laws of physics - are not merely allowed (out of temporary inconvenience) to be a combination of mathematically inconsistent objects for the time being which may be unified given more knowledge, but are instead accepted and maintained to necessarily be a combination of mathematically inconsistent objects in exactly the manner they are known for all time forward regardless of new discoveries, purely for the short-term advantages that this viewpoint seems to bring with it as long as all else is carefully ignored.
Concerning the so-called "wave-particle dualism", that seems to me a horse as dead as it can be. It was resolved as non-existent already in 1926, when Born came up with the probabilistic interpretation of the quantum formalism.
This misses the mark because Smolin says nothing whatsoever about wave-particle duality, which I think we both agree is a completely misguided effort. Instead he (and I) use wave as shorthand for wavefunction.

#### vanhees71

Gold Member
I don't understand why you think it's a problem if physical laws are not written in terms of differential equations. Then classical kinetics is already problematic, because it's an integro-differential equation, as is the Langevin equation, which is a stochastic (integro-)differential equation.

I don't see the so-called measurement problem as a physical theorem, but rather the lack of a consistent QT of gravity. At this point it may well be that we need a new theory, and I don't claim that we have a "final theory" before this new theory has been found.

#### charters

This for me is what's realistic, because that's what's really observed in the lab. The very confusing language of philosophers, and also of some physicists, however labels this empirically very well established observation "un-realistic".
This is anti-realist in the sense above, because your interpretation does not treat the observer as a physical system subject to the same mathematical treatment/physical theory as the world he observes; e.g. your observer is not a factor of a tensor-product Hilbert space.

You can say it's unfair to call this "realist" if you want. Indeed, in other non-ontology conversations, for example in political theory, "realist" means "very practical and unambitious", which would suggest a parallel in sticking to only strictly empirical claims in science. But this is just the normal lack of clarity of English, and it is a losing battle to try to redefine terms of art well established in a specific debate.

Even worse, "realism" means yet something else in the old EPR-Bell argument. There it means something closer to counterfactual definiteness, and in this sense MWI is not realist (though it is still realist per Smolin's also-common usage).

#### Klystron

Gold Member
Thanks for the recommendation. I reserved a copy of Smolin's book for pickup this week. The description and subsequent discussion in no way spoils my anticipation, rather the opposite.

#### Auto-Didact

I don't understand why you think it's a problem if physical laws are not written in terms of differential equations. Then classical kinetics is already problematic, because it's an integro-differential equation, as is the Langevin equation, which is a stochastic (integro-)differential equation.
I am simplifying by saying differential equations; it would be more accurate to say:
differential equations and inherently related concepts
The mathematical objects which capture the properties of a law of physics most accurately are differential equations, belonging to the still-incomplete theory of differential equations in pure mathematics.

Direct specifications and generalizations of differential equations - such as ODEs, PDEs, SDEs and integro-DEs - as well as indirect specifications, generalizations and abstractions - such as derivatives, difference equations, linear operators and so on - all share common features with differential equations, in that they can be viewed as inherently representative of the overall conceptual notion which we associate with analysis.

Essentially all these specifications and generalizations of differential equations are 'shadows' of a deeper undiscovered purely mathematical theory which naturally unifies algebraic geometry, complex analysis, algebraic topology, Riemannian geometry, the theory of Riemann surfaces, the theory of dynamical systems, renormalization group theory, the theory of modular forms and so on - in a manner similar to the Langlands program.

This novel unified mathematical theory is thought to be the true underlying mathematical theory behind physics, and would also explain why physical theories have the structure that they do, why approximative techniques in physics work at all, why physical theories generalize in the unique way that they do - essentially explain 'the unreasonable effectiveness of mathematics in the natural sciences' as Wigner put it. QT, with QFT seemingly being its ultimate mathematical form, for all it does, clearly does nothing of the sort.

By the way, an explicit implementation of this still unknown unified theory of mathematics is conjectured to already exist and be known to almost everyone: it is called string theory, and it represents an explicit unification of all those fields in mathematics above and more, based mostly on K theory and the theory of Riemann surfaces. Unless a physical theory in its mature stage is capable of doing something like this for pure mathematics, there is no reason to place any hope in a theory as being a fundamental theory of physics.

String theory today doesn't seem to be what physicists and mathematicians once thought it was (and what some still think it is); at worst this would mean it is merely equivalent to QFT, per e.g. AdS/CFT.
I don't see the so-called measurement problem as a physical theorem, but rather the lack of a consistent QT of gravity. At this point it may well be that we need a new theory, and I don't claim that we have a "final theory" before this new theory has been found.
I don't see it as a physical theorem either, and I see the current lack of a consistent theory of QG as a symptom that QT as it stands must be incomplete, i.e. that it is not the final theory; so we agree here then.

#### A. Neumaier

Essentially all these specifications and generalizations of differential equations are 'shadows' of a deeper undiscovered purely mathematical theory
Is this your conjecture or Smolin's? Surely it is not a fact.

#### A. Neumaier

I don't see the socalled measurement problem as a physical theorem but the lack of a consistent QT of gravity.
The measurement problem appears already when treating the solar system as a quantum system with a classical external gravitational potential, since all measurements we know of are done from within the solar system.

#### Auto-Didact

Is this your conjecture or Smolin's? Surely it is not a fact.
Neither mine nor Smolin's. It is a recurrent theme in the writings of different physicists and mathematicians, scattered across the centuries. Most explicitly it has been mentioned by physicists and mathematicians such as Wigner, Poincaré, Weyl, Penrose, Atiyah, Frenkel, Witten, Strogatz and 't Hooft across their many works. The Langlands program is aligned with this old historical goal of mathematical theoretical physics.

The general theme that physical theory is our best bet for discovering the most beautiful and far-reaching theories of pure mathematics has recurred in the practice of theoretical physics from Newton onwards. Physics as a discipline, from this point of view, is a natural amalgamation of natural philosophy and mathematics, in which the canonical concepts of physics and classical applied mathematics represent the cleanest models of empirical concepts.

It is only recently that physicists and pure mathematicians stopped believing in this, i.e. for mathematicians after the rise of formalism under Hilbert and for physicists after the subsequent championing of QT by Bohr et al as a distinctly non-realistic science. The untimely death of Poincaré, i.e. the only philosopher of physics of sufficient calibre with enough breadth and depth of knowledge of these issues at the time, capable of adequately explaining these issues, practically guaranteed the divorce between mathematics and physics. This marriage has since only been rekindled partially through the discovery of the theory of dynamical systems and the string theory revolutions.

#### cube137

Summary: Lee Smolin has a new book out called "Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum". It is a book on the foundations of QM. For a brief review of the first half see: https://www.physicsforums.com/threads/what-are-you-reading-now-stem-only.912884/post-6176357
Smolin is a gravity guy. Do you know of similar-themed books that explore, say, realist and anti-realist interpretations of general relativity?


#### stevendaryl

Staff Emeritus
Realistic for me is what can be objectively observed and even (more or less precisely) quantified
That is the opposite of realism.

#### martinbn

I am sorry, but I do take issue with bringing the Langlands program into this. What does it have to do with any of this? (By the way, don't give me references to the geometric Langlands program; that is not the Langlands program.)
Essentially all these specifications and generalizations of differential equations are 'shadows' of a deeper undiscovered purely mathematical theory which naturally unifies algebraic geometry, complex analysis, algebraic topology, Riemannian geometry, the theory of Riemann surfaces, the theory of dynamical systems, renormalization group theory, the theory of modular forms and so on - in a manner similar to the Langlands program.
Can you be more specific and give quotes or references where anyone has alluded to this deeper mathematical theory. To me it sounds like your own personal wishful thinking.

#### vanhees71

Gold Member
Just to make my point of view clear. I don't deny that there may be some philosophical problems with irreducible probabilistic elements in the foundations, but I think they are philosophical problems and not physical ones, and I don't think that their resolution contributes anything to physics.

Rather, a real big problem is the incompatibility of general relativity (or any other relativistic theory of gravitational interactions) with QT. I think if somebody came up with a solution of this problem there could be real progress of physics.

However you call my standpoint (maybe it's instrumentalist, I don't care), that's how QT is really used in physics labs, and so far, if you stay within the realm of physics, there's no problem with it. On the contrary: although the high-energy particle-physics community has a strong interest in finding "physics beyond the standard model", so far the standard model withstands all tests, and that despite some intrinsic mathematical issues with QFT as a mathematical theory. Maybe these mathematical problems are also related to the question whether there is a working quantum theory of gravitation, and maybe the solution of one problem contributes to the other.

The only thing I doubt is that work on philosophical issues with the foundations of QT leads to profound new physics.

#### Demystifier

2018 Award
The only thing I doubt is that work on philosophical issues with the foundations of QT leads to profound new physics.
My approach to Bohmian mechanics (the link in my signature) suggests also how to search for new physics. More specifically, it suggests that more fundamental theory is not relativistic QFT.

#### Auto-Didact

Smolin is a gravity guy. Do you know of similar theme books that explores say the realist and anti-realist interpretations of general relativity?
General relativity, in contrast to QT, doesn't require a discussion of its interpretation since GR's foundations aren't a huge mess, quite the opposite; as a consequence I don't necessarily have a similar book to offer you for gravity. Perhaps Carlo Rovelli's Quantum Gravity or Smolin's prior book Time Reborn?
I am sorry but I do take issue with bringing the Langlands program in this. What does it have to do with any of this!
The Langlands Program can be seen as a 'grand unification of mathematics'; the unified mathematical theory I am referring to has the same property, but perhaps more modest in how much of mathematics it is aspiring to unify.

Such unification occurs naturally by implying the existence of a single consistent underlying mathematical theory behind advanced mathematical physics models, in particular advanced models which are capable of subsuming all of known physics.

For illustrative purposes, GR was once such an advanced mathematical physics model which had an underlying unified mathematical theory, Riemannian geometry, which of course unifies geometry, analysis and topology in unforeseen ways. This of course isn't the historical route of how Riemannian geometry was discovered but it would be foolhardy to claim such a route would've been absolutely impossible.
Can you be more specific and give quotes or references where anyone has alluded to this deeper mathematical theory. To me it sounds like your own personal wishful thinking.
Just a handful of examples:
Andre Weil: "The analogies that Dedekind demonstrated were easy to understand. For integers one substituted polynomials in x, to the divisibility of integers corresponded the divisibility of polynomials (it is well known, and it is taught even in high schools, that there are other such analogies, such as for the derivation of the greatest common divisor), to the rationals correspond the rational fractions {[?of polynomials, or the rational functions]}, and to algebraic numbers correspond the algebraic functions. At first glance, the analogy seems superficial; to the most profound problems of the theory of numbers (such as the decomposition into prime ideals) there would seem to be nothing corresponding in algebraic functions, and inversely. Hilbert went further in figuring out these matters; he saw that, for example, the Riemann-Roch theorem corresponds to Dedekind’s work in arithmetic on the ideal called “the different”; Hilbert’s insight was only published by him in an obscure review (Ostrowski pointed me to it), but it was already transmitted orally, much as other of his ideas on this subject. The unwritten laws of modern mathematics forbid writing down such views if they cannot be stated precisely nor, all the more, proven. To tell the truth, if this were not the case, one would be overwhelmed by work that is even more stupid and if not more useless compared to work that is now published in the journals. But one would love it if Hilbert had written down all that he had in mind. Let us examine this analogy more closely. Once it is possible to translate any particular proof from one theory to another, then the analogy has ceased to be productive for this purpose; it would cease to be at all productive if at one point we had a meaningful and natural way of deriving both theories from a single one. 
In this sense, around 1820, mathematicians (Gauss, Abel, Galois, Jacobi) permitted themselves, with anguish and delight, to be guided by the analogy between the division of the circle (Gauss’s problem) and the division of elliptic functions. Today, we can easily show that both problems have a place in the theory of abelian equations; we have the theory (I am speaking of a purely algebraic theory, so it is not a matter of number theory in this case) of abelian extensions. Gone is the analogy: gone are the two theories, their conflicts and their delicious reciprocal reflections, their furtive caresses, their inexplicable quarrels; alas, all is just one theory, whose majestic beauty can no longer excite us. Nothing is more fecund than these slightly adulterous relationships; nothing gives greater pleasure to the connoisseur, whether he participates in it, or even if he is an historian contemplating it retrospectively, accompanied, nevertheless, by a touch of melancholy. The pleasure comes from the illusion and the far from clear meaning; once the illusion is dissipated, and knowledge obtained, one becomes indifferent at the same time; at least in the Gitâ there is a slew of prayers (slokas) on the subject, each one more final than the previous ones."

Roger Penrose: the unified mathematics behind twistor theory, described in, among others, The Road to Reality.
Michael Atiyah: Arithmetic Physics, described in a few of his papers and lectures.
Edward Frenkel: the Langlands Program, described in talks, papers and a popular book.
John Baez: the unifying role of category theory, described across dozens of posts, papers and blogs.

Also, I am not saying that all these unification programmes are the same thing, merely that they are striving towards similar goals.
My approach to Bohmian mechanics (the link in my signature) suggests also how to search for new physics. More specifically, it suggests that more fundamental theory is not relativistic QFT.
Making such a suggestion is de facto a philosophical step; this insight is what vanhees71 is missing.
Is this your conjecture or Smolin's? Surely it is not a fact.
Perhaps I am giving too much credit to DEs; as I have mentioned before, what I'm trying to describe is my intuition, not some formal definition of DEs, dynamical systems or whatnot. Perhaps simply saying functions or maps would be sufficient? I doubt it, because those definitions don't necessarily capture the structure of the deeper theory I am describing; such descriptions seem far too vague, arbitrary, general and/or abstract.

Whatever the deeper mathematical theory underlying the structure of the as of yet unknown more fundamental physical laws turns out to be, it is certainly natural - even rational - to assume that they will be capable of being stated in a conceptually coherent manner with respect to the manner in which canonical physical theory has progressed so far, namely deterministically, with DEs or some generalization thereof as the core concept.


#### zonde

Gold Member
Rather, a real big problem is the incompatibility of general relativity (or any other relativistic theory of gravitational interactions) with QT. I think if somebody came up with a solution of this problem there could be real progress of physics.
So you are saying that there is nothing to fix about general relativity and nothing to fix about QT; the only problem is that they do not stick together, and somebody just needs to fix this not-sticking-together problem. Is that right?
The only thing I doubt is that work on philosophical issues with the foundations of QT leads to profound new physics.
Of course philosophy cannot give you new physics, so I see your doubts are completely justified.
But you are missing what philosophy can do: it can help get rid of the junk, clearing space for new physics.

#### vanhees71

Gold Member
My approach to Bohmian mechanics (the link in my signature) also suggests how to search for new physics. More specifically, it suggests that the more fundamental theory is not relativistic QFT.
Yes, sure, but that's not entirely just philosophy. I also don't think that relativistic QFT is "the final theory". It's not even mathematically completely consistent. I think Weinberg is right in writing that QFT is an effective theory in any case. Whether or not there is a more comprehensive theory, from which QFT can be deduced as an effective theory, I can't say either.

Looking at the history of physics as we understand it (i.e., physics beginning with, say, Kepler, Galilei, and Newton), I don't think that the fundamental issues can be solved by theory alone; we need more empirical input from regimes where Q(F)T unambiguously fails. E.g., to get an idea of how a consistent quantum theory of gravitation might look, we'd need some gravity-related phenomenon clearly contradicting classical-field-theory-like behavior. That is of course very difficult to achieve, because gravity is so weak compared to the other interactions that it only leads to clearly observable effects in macroscopic systems, which are almost always not sensitive to specific quantum effects.
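The weakness of gravity mentioned above can be made quantitative with a standard back-of-the-envelope comparison. The sketch below (not from the post; constants rounded to four figures) computes the ratio of the Coulomb force to the gravitational force between two protons; since both forces fall off as $1/r^2$, the ratio is independent of the separation:

```python
# Compare Coulomb and Newtonian gravitational attraction between two protons.
# Both forces scale as 1/r^2, so their ratio is distance-independent.
k_e = 8.988e9    # Coulomb constant, N m^2 / C^2
G   = 6.674e-11  # gravitational constant, N m^2 / kg^2
e   = 1.602e-19  # elementary charge, C
m_p = 1.673e-27  # proton mass, kg

ratio = (k_e * e**2) / (G * m_p**2)
print(f"F_Coulomb / F_gravity ~ {ratio:.2e}")  # roughly 1.2e36
```

A ratio of some 36 orders of magnitude is why quantum-gravitational effects are expected to be utterly negligible in any laboratory-scale two-particle experiment.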

#### vanhees71

Gold Member
So you are saying that there is nothing to fix about general relativity and nothing to fix about QT; the only problem is that they do not stick together, and somebody just needs to fix this not-sticking-together problem. Is that right?

Of course philosophy cannot give you new physics, so I see your doubts are completely justified.
But you are missing what philosophy can do: it can help get rid of the junk, clearing space for new physics.
I don't believe it's that simple; otherwise somebody would have done it. So far, all observations concerning gravity are in accord with GR, but that is likely because all we can observe concerning gravity is its action on macroscopic systems, where the classical theory is very accurate (in close analogy to our everyday experience that classical electrodynamics/optics is very accurate, although there we know QED as the underlying quantum theory). So it's very hard to find specific phenomena showing where and how GR (or maybe some other classical field theory describing gravitation better, although I don't know of any clear empirical hint that this might be the case) has to be joined with QT.

I don't think that there's anything wrong with QT in the realm where we really need it to describe the phenomena. It does not consistently describe the gravitational interactions, and that's imho the only clear physical hint of a failure of QT, or rather of its incompleteness. What I don't consider real physics problems are these debates about its probabilistic nature and apparently "weird" quantum phenomena like entanglement. We are simply unused to them in our everyday experience, but as weird as they may seem to our experience, which is quite untrained in quantum matters, QT has always been found to be correct:

E.g., where QT says we don't know more than the probabilistic content it provides, we have failed to find a deterministic theory describing the phenomenon. For instance, so far we don't have any idea how to precisely predict when a specific radioactive nucleus decays. We know what causes the decay (e.g., if it's a $\beta$ emitter, the weak interaction), but the best-working theory (quantum flavor dynamics in this case) cannot precisely predict at which time the decay occurs. It only gives a "mean lifetime" of the nucleus, i.e., a probabilistic notion. So far we don't have any hint of whether there is some deterministic explanation for the precise time the decay occurs. As long as no such explanation is found, I don't see why there's a problem with the probabilistic nature of quantum theory.

The same holds for the so-called "measurement problem". I think there's no measurement problem, because QT accurately describes even the most precise measurements modern technology enables us to achieve, including 100% correlations of completely random properties, as described by entanglement, in the highly significant sense of contemporary Bell experiments of various types, although there seems to be not the slightest hint of a deterministic description. Whether there is none, of course, I cannot say. Maybe one day somebody comes up with a very clever deterministic non-local theory describing all facts as well as QT does today in its probabilistic sense, but I doubt that we will find it by pondering purely philosophical quibbles about the so-called "measurement problem" or other "foundational problems" of a purely philosophical nature. I think, as with the question about a consistent QT of gravity, we'd need some empirical evidence clearly indicating that there's a real problem in describing an unambiguously observed phenomenon which contradicts QT.
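The point about mean lifetimes can be sketched numerically. The toy simulation below (the lifetime value and sample size are arbitrary illustrations, not data about any real nucleus) draws individual decay times from the exponential law; only the distribution, not any single decay time, is predicted:

```python
import random

def sample_decay_times(tau, n, seed=0):
    """Draw n decay times from P(decay before t) = 1 - exp(-t / tau).

    Each draw is irreducibly random; quantum theory predicts only the
    distribution, characterized by the mean lifetime tau."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / tau) for _ in range(n)]

tau = 5.0  # hypothetical mean lifetime, arbitrary units
times = sample_decay_times(tau, 100_000)
mean_time = sum(times) / len(times)
# The sample mean converges to tau; individual entries stay unpredictable.
print(f"mean decay time ~ {mean_time:.2f} (tau = {tau})")
```

The sample mean approaches the mean lifetime as the number of nuclei grows, which is exactly the kind of ensemble statement QT makes, while saying nothing about when any particular nucleus decays.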

#### charters

So far we don't have any hint of whether there is some deterministic explanation for the precise time the decay occurs. As long as this is not the case, I don't see why there's a problem with this probabilistic nature of quantum theory. The same holds for the so-called "measurement problem". I think there's no measurement problem, because QT accurately describes even the most precise measurements
The measurement problem has nothing to do with the precision of measurements or with the need for classically deterministic predictions. It is a question of an apparent logical contradiction in the traditional axioms of quantum theory. In my opinion, the best statement of the problem is courtesy of David Wallace:

"We cannot consistently understand the state space of quantum theory either as a space of physical states, or as a space of probability distributions. Instead, we have to use one interpretation for microscopic physics and another for macroscopic physics. Furthermore, both the point at which we have to transition between the physical and probabilistic interpretation, and the basis with respect to which the probabilistic interpretation is to be specified, are defined only in an approximate, rough-and-ready way, which seems to make essential use of terms like “macroscopic” which have no place in a fundamental physical theory."

The "realist" interpretations - many worlds, hidden variables, and objective (GRW) collapse - try to find a way to consistently say quantum states are always physical.

The "nonrealist" interpretations - QBism and Copenhagen understood correctly - try to find a way to say quantum states are always probability distributions.

#### vanhees71

Gold Member
But the quote is obviously wrong, because we can very well use quantum theory to describe real-world experiments, and there are both the notion of the state, described by the statistical operator, its deterministic (!) time evolution, given the Hamiltonian of the system, and its probabilistic interpretation. The state is determined by a preparation procedure, and it implies that not all observables of the system take determined values; a measurement of these observables gives random results with probabilities given by the state. There is no contradiction, either in the sense of logic or in the empirical evidence, for this probabilistic interpretation of the formalism. That's why there is no physical problem with quantum mechanics in this sense.
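For concreteness, the deterministic time evolution of the statistical operator referred to here is the standard von Neumann equation for a closed system with Hamiltonian $\hat{H}$ (a textbook result, stated only to make the claim explicit):

```latex
i\hbar \,\frac{\partial \hat{\rho}(t)}{\partial t} = \left[\hat{H}, \hat{\rho}(t)\right]
```

Given $\hat{\rho}(0)$ from a preparation procedure, this fixes $\hat{\rho}(t)$ uniquely; the probabilistic element enters only through the Born rule, $p_a = \mathrm{Tr}\!\left(\hat{\rho}\,\hat{P}_a\right)$, for the outcomes of measurements.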

As I stressed, I don't believe that QT is "the final theory", but the problems are not the philosophical quibbles usually discussed in these debates about "interpretation" but real physics problems, such as the lack of understanding of how gravitation is described consistently with QT.

#### charters

The state is determined by a preparation procedure, and it implies that not all observables of the system takes determined values, but a measurement of these observables give random results with probabilities given by the states. There's no contradiction in the sense of logic nor in the empirical evidence for this probabilistic interpretation of the formalism
What is the nature of this indeterminacy you mention? Is it that the system is physically/ontologically smeared across its possible values, so the value exists in an unsharp way? Is it that the value is sharply in an eigenstate, and we just can't say which until measurement? Or is the value truly non-existent until it is measured?

#### Auto-Didact

What I don't consider real physics problems are these debates about its probabilistic nature and apparently "weird" quantum phenomena like entanglement. We are simply unused to them in our everyday experience, but as weird as they may seem to our experience, which is quite untrained in quantum matters, QT has always been found to be correct:
This is actually an old and tired argument that uncritical teachers tell their freshmen students in order not to make them run away screaming. I remember making the exact same claim to someone else that you are making right here during my days as a naive undergraduate, i.e. before I was properly exposed to the history or philosophy of physics.
Maybe one day somebody comes up with a very clever determinstic non-local theory describing all facts as well as QT does today in its probabilistic sense, but I doubt that we will find it by pondering purely philosophical quibbles about the so-called "measurement problem" or other "foundational problems" of a purely philosophical nature.
Conceptual matters are always solved by making a philosophical leap against "what is known to be true", i.e. by taking another conceptual stance against some dominant viewpoint and then making this new stance mathematically intricate. Turning a conceptual stance into a mathematically intricate picture is particular to physics, but changing conceptual stances is a purely philosophical endeavor which occurs across all forms of specialized and unspecialized human reasoning.

For example, Newton directly chose to contradict millennia of scientific teachings about physics, not because of the results of experiments, but because he deeply believed everyone else was wrong and he was right, and because he could back up his own ideas by constructing a new mathematicized explanation of the world superior to the old one; in order to carry this project out he had to invent a new form of mathematics to embody his own philosophical conceptualizations (NB: of course today we know he wasn't right, he was merely 'less wrong').

Similarly, Einstein chose directly to contradict the Newtonian conceptualization of the world with one of his own, basing it on new, unproven mathematics with which he had little experience and which had no known academic precedent or application in science, let alone in fundamental physics. This, of course, while all scientists at the time "knew" that Newton was correct, in an essentially uncritical, almost gullible and uninquisitive way of blindly reasoning about what one already knows while remaining willfully ignorant of all new conceptualizations of the world; this cognitive bias is in no way particular to early 20th century physicists.

Today we are again in the situation that all scientists "know" that QT is correct, and they cook up all kinds of frankly irrelevant justifications (e.g. measurement-precision arguments, or quoting the wide range of experiments performed so far). Yet they seem to fail to realize that the mathematics underlying this theory - this conceptualization - has serious (self-)consistency errors and is therefore problematic from a non-formalist pure mathematics point of view; this has nothing to do with being constructable from axioms, or with being definable using some semi-abstract mathematics.

The inconsistency of the theory instead follows from the conceptualization being poorly understood, which leads to it being equally poorly mathematicized, with limitations built in, i.e. forcefully mathematicized within a very definite but approximative, limited, artificial mathematical framework. It is this framework which prevents QT from being unified with another conceptualization, GR, because GR is, in contrast, conceptually extremely coherent, and its underlying mathematics is therefore exquisitely clean and completely natural.
I think, as with the question about a consistent QT of gravity, we'd need some empirical evidence clearly indicating that there's a real problem in describing an unanimously observed phenomenon which contradicts QT.
That would be nice, but it certainly isn't a necessity. Theory usually comes before the experiment, with a myriad of empirically indistinguishable explanations to choose from until some key experiment clearly favors one particular theory. We aren't yet at the stage where such experiments can be carried out, but most experts agree we might be within a decade or two of being capable of falsifying scientifically legitimate alternatives to QT which are indistinguishable within the current range of experimental validity.

#### A. Neumaier

The state is determined by a preparation procedure
This only covers states of microscopic systems in the lab.

Which preparation procedure determines the quantum state of the solar system? How is the result of a measurement of some observables (say, the masses of the Sun and the major planets) of this quantum state described from first principles (assuming Newtonian gravity, which is fully adequate for this situation)?

The solar system is not coupled to an external measurement device as in the usual analysis of measurements; the measurement is done from within. Without an explanation of how this works, even ordinary quantum mechanics is an incompletely understood (and indeed incomplete) theory.
