A Smolin: Realistic and anti-realistic interpretations of QM

  • Thread starter: Auto-Didact
  • Tags: Interpretations, QM
Auto-Didact
TL;DR Summary
Lee Smolin has a new book out called "Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum". It is a book on the foundations of QM. For a brief review of the first half see: https://www.physicsforums.com/threads/what-are-you-reading-now-stem-only.912884/post-6176357

SPOILER WARNING: If you don't want to be spoiled about the book, stop reading now!

What I believe we can directly take away from this book, with regard to discussions on QM foundations, is his classification of interpretations of QM.



I just finished the new book. First, I would recommend the book to anyone who reads or takes part in discussions on QM foundations. Briefly put, Smolin offers a simple classification of almost all interpretations of QM and their pros and cons with respect to a more fundamental theory than QM or QFT. The classification he gives for interpretations of QM is realist vs. anti-realist, where realism is the view of reality which all scientific theories of physics (except for QM) adhere to, and anti-realism is essentially the instrumentalist view of QM and of science at large as propagated by Bohr and Heisenberg. Suffice it to say, the standard textbook operationalist view of QM is also anti-realist.

Realism on the other hand branches off into a few more specific views. The most important of these are what Smolin calls naive realism, magical realism and critical realism. Each of these branches consists of a group of theories which are fundamentally conceptually similar to each other, i.e. they have the same strengths and weaknesses.

Naive realism gives several options about what is real: either both particles and waves are real (pilot wave theory), only waves are real (collapse models) or only particles are real (Nelson's stochastic mechanics).

Magical realism has a few exemplifying interpretations, most importantly Everett's Many Worlds Interpretation. This interpretation is less predictive than QM because it literally predicts that everything happens, based on deterministic unitary evolution alone, and therefore has a problem introducing probabilities.

Critical realism again contains a few exemplifying options, most notably the Oxford interpretation, which can best be summarized as 'decoherence solves the issues with Everett's MWI.' In general, the attempts to solve the problems of the MWI in this way do not seem to work, for several reasons.

In the book Smolin reviews all of the above interpretations, impartially gives their merits and drawbacks, and, more importantly, explains what each of them teaches us about physics and what a successful realist completion of QM would need to achieve in order to dethrone QM: simultaneously reproducing the successes of all of the interpretations while avoiding all of their problems.

Smolin then gives an outline of how to achieve such a project, based on a first-principles approach similar to Lucien Hardy's approach to doing foundational physics. He names and describes a set of principles which a realist completion of QM, and therefore a theory beyond QM, needs to adhere to. From memory, I think this approach is based largely on Einstein's philosophy of physics, which clearly illustrates the difference between constructive theories and principle theories.

The rest of the book illustrates a few specific research programmes carried out so far which attempt to complete QM and go beyond it, i.e. which have actually reproduced the successes of many of the interpretations so far. These projects are each very impressive in their own right, but - as Smolin describes - while they may reproduce some or all of the successes, they do not necessarily avoid all the problems.

Naturally, anyone who wants clarification should read the book; I'd recommend it to anyone interested in QM, regardless of their level of expertise. Again, any further discussion in this thread will necessarily go into more depth than this post and spoil the book even more.
 
  • Like
Likes nnunn, DennisN, eloheim and 3 others
I'm wondering whether, among these realist and anti-realist options, Berkeley's idealism (or something like it) has ever been considered?
 
Well, I'd consider myself a realist but for me what's usually called "realist" in the context of quantum interpretations is the opposite of what I consider realistic.

Realistic for me is what can be objectively observed and even (more or less precisely) quantified, and as the discovery of QT has taught us, that's very different from what's usually considered a "realistic interpretation". What these people call a "realistic interpretation" (or more precisely an "ontic interpretation") is the opposite of what's really observed, namely that everything works as in the "non-realistic" (minimal) standard interpretation: with a complete determination of the quantum state (i.e. the preparation of the system in a pure quantum state, represented by a statistical operator of the form ##\hat{\rho}(o_1,\ldots,o_n)=|o_1,\ldots,o_n \rangle \langle o_1,\ldots,o_n|##), only some observables take determined values, namely the chosen set of observables ##O_1,\ldots,O_n## that forms a complete set of compatible observables, i.e. uniquely defines the one-dimensional common eigenspace onto which ##\hat{\rho}## projects; here ##|o_1,\ldots,o_n \rangle## is a normalized common eigenvector of the self-adjoint operators ##\hat{O}_1,\ldots,\hat{O}_n## representing the observables.

Then of course any observable ##O'## that is not compatible with the set ##O_1,\ldots,O_n## of observables (almost always) doesn't take a determined value, but the state preparation implies (and only implies!) the probability to find a certain value when measuring ##O'##.
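
For concreteness, a minimal numerical sketch of this (an illustrative example only, assuming a spin-1/2 system prepared in the ##S_z=+\hbar/2## eigenstate): the preparation determines ##S_z##, while the incompatible observable ##S_x## only gets Born-rule probabilities.

Code:
import numpy as np

# Spin-1/2 prepared in the +z eigenstate: a pure state |+z><+z|.
ket_up_z = np.array([1.0, 0.0], dtype=complex)        # |+z>
rho = np.outer(ket_up_z, ket_up_z.conj())             # rho = |+z><+z|

# The incompatible observable: sigma_x (S_x up to a factor hbar/2).
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
evals, evecs = np.linalg.eigh(sigma_x)

# Born rule: P(o) = Tr(rho P_o), with P_o the projector onto the eigenspace of o.
for val, vec in zip(evals, evecs.T):
    proj = np.outer(vec, vec.conj())
    p = np.real(np.trace(rho @ proj))
    print(f"P(S_x = {val:+.0f} hbar/2) = {p:.2f}")    # prints 0.50 for both outcomes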

This for me is what's realistic, because that's what's really observed in the lab. The very confusing language of philosophers and also some physicists, however, calls this empirically very well established observation "un-realistic".

For me, there are no problems with this minimal interpretation of quantum theory and no necessity for any adjustment to it (let alone QT in its minimal form itself), as long as there's no clear empirical contradiction with its predictions.

Concerning the so-called "wave-particle dualism", that seems to me a horse as dead as it can be. It was resolved as non-existent already in 1926, when Born came up with the probabilistic interpretation of the quantum formalism. There's simply no wave-particle dualism, only the (in my opinion also very well established) fact that there exist neither classical particles nor classical waves, but just a probabilistic description of nature in terms of quantum (field) theory.

The phenomenon that we observe many things around us as behaving like classical particles (in fact macroscopic extended bodies consisting of very many particles) and some like classical fields (mostly electromagnetic waves in form of light) is well understood from many-body quantum (field) theory as well.
 
  • Like
Likes bhobba, eys_physics, Dale and 1 other person
vanhees71 said:
Well, I'd consider myself a realist but for me what's usually called "realist" in the context of quantum interpretations is the opposite of what I consider realistic.
Personal interpretations are a large part of the problem in discussing foundational issues: definitions simply aren't up for grabs. There is a vast literature on the subject.
vanhees71 said:
Realistic for me is what can be objectively observed and even (more or less precisely) quantified
That is a personal philosophy of science which only works in a pragmatic sense; it also has a name, and Smolin describes it carefully: equating being observable with being real is a philosophy called 'instrumentalism' in the literature. In the most general context, i.e. logically, what can be observed need not be what is real; in the literature, that which can be real is called a 'beable'.

Instrumentalism is a strategy which has short-term benefits, e.g. making experimental analysis, engineering and technological advancement possible, while long-term it tends to be purely counterproductive for the science in question itself, in this case physics.
vanhees71 said:
For me, there are no problems with this minimal interpretation of quantum theory and no necessity for any adjustment to it (let alone QT in its minimal form itself), as long as there's no clear empirical contradiction with its predictions.
The problem is again not a personal one nor an applied one, but an academic one; there is a well known problem called the measurement problem of QM. This is arguably the biggest and most longstanding problem in modern physics.

More specifically this problem is a fundamental problem in mathematical theoretical physics, because it introduces purely logical consistency problems in giving a completely mathematically self-consistent definition of what a physical theory is itself.

The measurement problem of QM resides not within the predictions of the theory of QM/QFT but within the change of the mathematical character of physical theory itself, i.e. essentially if QT is the final fundamental theory then the inherent properties of the laws of physics do not adhere to the mathematical structure of differential equations and inherently related concepts anymore.

In other words, the very existence of the Born Rule isn't necessarily problematic. Instead, what is problematic is that in addition to differential equations there are other mathematical objects, objects which are purely mathematically fundamentally inconsistent with differential equations, which lay claim to being essential to the properties of what a law of physics is. This inconsistency marks the embodiment of what the measurement problem entails.

Accepting this inconsistency as 'non-problematic' - as most physicists have done since Bohr and Heisenberg - is a cavalier decision which marks the degradation of the historical endeavour of physics from a modern science into a distinctly postmodern science; most physicists who make this decision don't realize that this is what they are doing.

Let me spell this acceptance out explicitly: the central objects of fundamental physical theory - namely the laws of physics - are not merely allowed (out of temporary inconvenience) to be a combination of mutually inconsistent mathematical objects for the time being, to be unified once more knowledge is available. Instead, they are accepted and maintained to necessarily be such a combination, in exactly the form in which they are currently known, for all time forward and regardless of new discoveries, purely for the short-term advantages that this viewpoint seems to bring with it as long as all else is carefully ignored.
vanhees71 said:
Concerning the so-called "wave-particle dualism", that seems to me a horse as dead as it can be. It was resolved as non-existent already in 1926, when Born came up with the probabilistic interpretation of the quantum formalism.
This misses the mark because Smolin says nothing whatsoever about wave-particle duality, which I think we both agree is a completely misguided effort. Instead he (and I) use wave as shorthand for wavefunction.
 
  • Skeptical
  • Like
Likes Dale and nnunn
I don't understand why you think it's a problem if physical laws are not written in terms of differential equations. Then classical kinetics is already problematic too, because it's an integro-differential equation, as is the Langevin equation, which is a stochastic (integro-)differential equation.

I don't see the so-called measurement problem as a physical problem; the real physical problem is the lack of a consistent QT of gravity. At this point it may well be that we need a new theory, and I don't claim that we have a "final theory" before this new theory has been found.
 
  • Like
Likes bhobba and Auto-Didact
vanhees71 said:
This for me is what's realistic, because that's what's really observed in the lab. The very confusing language of philosophers and also some physicists, however, calls this empirically very well established observation "un-realistic".

This is anti-realist in the sense above, because your interpretation does not treat the observer as a physical system subject to the same mathematical treatment/physical theory as the world he observes, e.g. your observer is not a factor of a tensor product Hilbert space.
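
For concreteness, a schematic of what treating the observer/apparatus as a tensor factor would look like (the standard von Neumann pre-measurement scheme, added here only as illustration):
$$\Big(\sum_i c_i\,|s_i\rangle\Big)\otimes|A_0\rangle \;\xrightarrow{\hat{U}}\; \sum_i c_i\,|s_i\rangle\otimes|A_i\rangle,$$
where ##|A_0\rangle## is the apparatus "ready" state and the ##|A_i\rangle## are its pointer states. In the minimal interpretation the observer is kept outside this description altogether.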

You can say it's unfair to call this "realist" if you want. Indeed, in other non-ontology conversations, like in political theory for example, "realist" means "very practical and unambitious", which would suggest a parallel in sticking to only strictly empirical claims in science. But this is just the normal lack of clarity of English, and it is a losing battle to try to redefine terms of art well established in a specific debate.

Even worse, "realism" means yet something else in the old EPR-Bell argument. It means something closer to counterfactual definiteness, and in this sense MWI is not realist (though still realist per Smolin's also common usage)
 
  • Like
Likes Auto-Didact
Thanks for the recommendation. I reserved a copy of Smolin's book for pickup this week. The description and subsequent discussion in no way spoils my anticipation, rather the opposite.
 
  • Like
Likes Auto-Didact
vanhees71 said:
I don't understand why you think it's a problem if physical laws are not written in terms of differential equations. Then classical kinetics is already problematic too, because it's an integro-differential equation, as is the Langevin equation, which is a stochastic (integro-)differential equation.
I am simplifying by saying differential equations; more accurate would be to say:
Auto-Didact said:
differential equations and inherently related concepts
The mathematical objects which capture the properties of a law of physics most accurately are differential equations, from the still incomplete theory of differential equations in pure mathematics.

Direct specifications and generalizations of differential equations - such as ODEs, PDEs, SDEs and integro-DEs - as well as indirect specifications, generalizations and abstractions - such as derivatives, difference equations, linear operators and so on - all share the same common features as differential equations, in that they can be viewed as being inherently representative of this overall conceptual notion which we associate with analysis.

Essentially all these specifications and generalizations of differential equations are 'shadows' of a deeper undiscovered purely mathematical theory which naturally unifies algebraic geometry, complex analysis, algebraic topology, Riemannian geometry, the theory of Riemann surfaces, the theory of dynamical systems, renormalization group theory, the theory of modular forms and so on - in a manner similar to the Langlands program.

This novel unified mathematical theory is thought to be the true underlying mathematical theory behind physics, and would also explain why physical theories have the structure that they do, why approximative techniques in physics work at all, why physical theories generalize in the unique way that they do - essentially explain 'the unreasonable effectiveness of mathematics in the natural sciences' as Wigner put it. QT, with QFT seemingly being its ultimate mathematical form, for all it does, clearly does nothing of the sort.

By the way, an explicit implementation of this still unknown unified theory of mathematics is conjectured to already exist and be known to almost everyone: it is called string theory, and it represents an explicit unification of all those fields in mathematics above and more, based mostly on K theory and the theory of Riemann surfaces. Unless a physical theory in its mature stage is capable of doing something like this for pure mathematics, there is no reason to place any hope in a theory as being a fundamental theory of physics.

Because string theory today doesn't seem to be what physicists and mathematicians once thought it was (and some still think it is), at worst this means it is merely equivalent to QFT, per e.g. AdS/CFT.
vanhees71 said:
I don't see the so-called measurement problem as a physical problem; the real physical problem is the lack of a consistent QT of gravity. At this point it may well be that we need a new theory, and I don't claim that we have a "final theory" before this new theory has been found.
I don't see it as a physical problem either, and I see the current lack of a consistent theory of QG as a symptom that QT as is must be incomplete, i.e. not the final theory, so we agree here then.
 
Auto-Didact said:
Essentially all these specifications and generalizations of differential equations are 'shadows' of a deeper undiscovered purely mathematical theory
Is this your conjecture or Smolin's? Surely it is not a fact.
 
  • Like
Likes vanhees71
  • #10
vanhees71 said:
I don't see the so-called measurement problem as a physical problem; the real physical problem is the lack of a consistent QT of gravity.
The measurement problem appears already when treating the solar system as a quantum system, with a classical external gravitational potential, since all measurements we know of are done from within the solar system.
 
  • #11
A. Neumaier said:
Is this your conjecture or Smolin's? Surely it is not a fact.
Neither mine nor Smolin's. It is a recurrent theme in the writings of different physicists and mathematicians, scattered across the centuries. Most explicitly it has been mentioned by physicists and mathematicians such as Wigner, Poincaré, Weyl, Penrose, Atiyah, Frenkel, Witten, Strogatz and 't Hooft, scattered across their many works. The Langlands program is aligned with this old historical goal of mathematical theoretical physics.

The general theme that physical theory is our best bet for discovering the most beautiful and far-reaching theories of pure mathematics is a recurring one in the practice of theoretical physics from Newton onwards. Physics as a discipline, from this point of view, is a natural amalgamation of natural philosophy and mathematics, in which the canonical concepts of physics and classical applied mathematics represent the cleanest models of empirical concepts.

It is only recently that physicists and pure mathematicians stopped believing in this, i.e. for mathematicians after the rise of formalism under Hilbert and for physicists after the subsequent championing of QT by Bohr et al as a distinctly non-realistic science. The untimely death of Poincaré, i.e. the only philosopher of physics of sufficient calibre with enough breadth and depth of knowledge of these issues at the time, capable of adequately explaining these issues, practically guaranteed the divorce between mathematics and physics. This marriage has since only been rekindled partially through the discovery of the theory of dynamical systems and the string theory revolutions.
 
  • #12
Auto-Didact said:
Summary: Lee Smolin has a new book out called "Einstein's Unfinished Revolution: The Search for What Lies Beyond the Quantum". It is a book on the foundations of QM. For a brief review of the first half see: https://www.physicsforums.com/threads/what-are-you-reading-now-stem-only.912884/post-6176357

Smolin is a gravity guy. Do you know of similar-themed books that explore, say, the realist and anti-realist interpretations of general relativity?
 
  • #13
vanhees71 said:
Realistic for me is what can be objectively observed and even (more or less precisely) quantified

That is the opposite of realism.
 
  • Like
Likes Auto-Didact
  • #14
I am sorry, but I do take issue with bringing the Langlands program into this. What does it have to do with any of this? (By the way, don't give me references to the geometric Langlands program; that is not the Langlands program.)
Auto-Didact said:
Essentially all these specifications and generalizations of differential equations are 'shadows' of a deeper undiscovered purely mathematical theory which naturally unifies algebraic geometry, complex analysis, algebraic topology, Riemannian geometry, the theory of Riemann surfaces, the theory of dynamical systems, renormalization group theory, the theory of modular forms and so on - in a manner similar to the Langlands program.
Can you be more specific and give quotes or references where anyone has alluded to this deeper mathematical theory. To me it sounds like your own personal wishful thinking.
 
  • #15
Just to make my point of view clear. I don't deny that there may be some philosophical problems with irreducible probabilistic elements in the foundations, but I think they are philosophical problems and not physical ones, and I don't think that their resolution contributes anything to physics.

Rather, a real big problem is the incompatibility of general relativity (or any other relativistic theory of gravitational interactions) with QT. I think if somebody came up with a solution of this problem there could be real progress of physics.

However you call my standpoint (maybe it's instrumentalist, I don't care), that's how QT is really used in physics labs, and so far, if you stay really within the realm of physics, there's no problem with it. On the contrary: although there's a high interest in the high-energy particle-physics community to find "physics beyond the standard model", so far the standard model withstands all tests, even though there are indeed some intrinsic mathematical issues with QFT as a mathematical theory. Maybe these mathematical problems are also related to the question whether there is a working quantum theory of gravitation, and maybe the solution of one problem contributes to the other.

The only thing I doubt is that work on philosophical issues with the foundations of QT will lead to profound new physics.
 
  • Like
Likes Demystifier
  • #16
vanhees71 said:
The only thing I doubt is that work on philosophical issues with the foundations of QT will lead to profound new physics.
My approach to Bohmian mechanics (the link in my signature) suggests also how to search for new physics. More specifically, it suggests that more fundamental theory is not relativistic QFT.
 
  • #17
cube137 said:
Smolin is a gravity guy. Do you know of similar-themed books that explore, say, the realist and anti-realist interpretations of general relativity?
General relativity, in contrast to QT, doesn't require a discussion of its interpretation since GR's foundations aren't a huge mess, quite the opposite; as a consequence I don't necessarily have a similar book to offer you for gravity. Perhaps Carlo Rovelli's Quantum Gravity or Smolin's prior book Time Reborn?
martinbn said:
I am sorry, but I do take issue with bringing the Langlands program into this. What does it have to do with any of this?
The Langlands Program can be seen as a 'grand unification of mathematics'; the unified mathematical theory I am referring to has the same property, but is perhaps more modest in how much of mathematics it aspires to unify.

Such unification occurs naturally by implying the existence of a single consistent underlying mathematical theory behind advanced mathematical physics models, in particular advanced models which are capable of subsuming all of known physics.

For illustrative purposes, GR was once such an advanced mathematical physics model which had an underlying unified mathematical theory, Riemannian geometry, which of course unifies geometry, analysis and topology in unforeseen ways. This of course isn't the historical route of how Riemannian geometry was discovered but it would be foolhardy to claim such a route would've been absolutely impossible.
martinbn said:
Can you be more specific and give quotes or references where anyone has alluded to this deeper mathematical theory. To me it sounds like your own personal wishful thinking.
Just a handful of examples:
Andre Weil: "The analogies that Dedekind demonstrated were easy to understand. For integers one substituted polynomials in x, to the divisibility of integers corresponded the divisibility of polynomials (it is well known, and it is taught even in high schools, that there are other such analogies, such as for the derivation of the greatest common divisor), to the rationals correspond the rational fractions {[?of polynomials, or the rational functions]}, and to algebraic numbers correspond the algebraic functions. At first glance, the analogy seems superficial; to the most profound problems of the theory of numbers (such as the decomposition into prime ideals) there would seem to be nothing corresponding in algebraic functions, and inversely. Hilbert went further in figuring out these matters; he saw that, for example, the Riemann-Roch theorem corresponds to Dedekind’s work in arithmetic on the ideal called “the different”; Hilbert’s insight was only published by him in an obscure review (Ostrowski pointed me to it), but it was already transmitted orally, much as other of his ideas on this subject. The unwritten laws of modern mathematics forbid writing down such views if they cannot be stated precisely nor, all the more, proven. To tell the truth, if this were not the case, one would be overwhelmed by work that is even more stupid and if not more useless compared to work that is now published in the journals. But one would love it if Hilbert had written down all that he had in mind. Let us examine this analogy more closely. Once it is possible to translate any particular proof from one theory to another, then the analogy has ceased to be productive for this purpose; it would cease to be at all productive if at one point we had a meaningful and natural way of deriving both theories from a single one. In this sense, around 1820, mathematicians (Gauss, Abel, Galois, Jacobi) permitted themselves, with anguish and delight, to be guided by the analogy between the division of the circle (Gauss’s problem) and the division of elliptic functions. Today, we can easily show that both problems have a place in the theory of abelian equations; we have the theory (I am speaking of a purely algebraic theory, so it is not a matter of number theory in this case) of abelian extensions. Gone is the analogy: gone are the two theories, their conflicts and their delicious reciprocal reflections, their furtive caresses, their inexplicable quarrels; alas, all is just one theory, whose majestic beauty can no longer excite us. Nothing is more fecund than these slightly adulterous relationships; nothing gives greater pleasure to the connoisseur, whether he participates in it, or even if he is an historian contemplating it retrospectively, accompanied, nevertheless, by a touch of melancholy. The pleasure comes from the illusion and the far from clear meaning; once the illusion is dissipated, and knowledge obtained, one becomes indifferent at the same time; at least in the Gitâ there is a slew of prayers (slokas) on the subject, each one more final than the previous ones."

Roger Penrose: the unified mathematics behind twistor theory, described in, among others, The Road to Reality.
Michael Atiyah: Arithmetic Physics, described in a few of his papers and lectures.
Edward Frenkel: the Langlands Program, described in talks, papers and a popular book.
John Baez: the unifying role of Category Theory, described across dozens of posts, papers and blogs.

Also, I am not saying that all these unification programmes are the same thing, merely that they are striving towards similar goals.
Demystifier said:
My approach to Bohmian mechanics (the link in my signature) suggests also how to search for new physics. More specifically, it suggests that more fundamental theory is not relativistic QFT.
Making such a suggestion is de facto a philosophical step; this insight is what vanhees is missing.
A. Neumaier said:
Is this your conjecture or Smolin's? Surely it is not a fact.
Perhaps I am giving too much credit to DEs; as I have mentioned before, what I'm trying to describe is my intuition, not some formal definition of DEs, dynamical systems or whatnot; perhaps simply saying functions or maps is sufficient? I doubt it, because these definitions don't necessarily capture the structure of the deeper theory I am describing, in that such descriptions seem way too vague, arbitrary, general and/or abstract.

Whatever the deeper mathematical theory underlying the structure of the as yet unknown more fundamental physical laws turns out to be, it is certainly natural - even rational - to assume that those laws will be capable of being stated in a manner conceptually coherent with the way canonical physical theory has progressed so far, namely deterministically, with DEs or some generalization thereof as the core concept.
 
Last edited:
  • #18
vanhees71 said:
Rather, a real big problem is the incompatibility of general relativity (or any other relativistic theory of gravitational interactions) with QT. I think if somebody came up with a solution of this problem there could be real progress of physics.
So you say that there is nothing to fix about general relativity and there is nothing to fix about QT, the only problem is that they do not stick together and if somebody could just fix this not-sticking-together problem. Is this right?
vanhees71 said:
The only thing I doubt is that work on philosophical issues with the foundations of QT will lead to profound new physics.
Of course philosophy can not give you new physics. I see your doubts are completely justified.
But you are missing the thing that philosophy can do. It can help to get rid of the junk clearing the space for new physics.
 
  • Like
Likes nnunn and Auto-Didact
  • #19
Demystifier said:
My approach to Bohmian mechanics (the link in my signature) suggests also how to search for new physics. More specifically, it suggests that more fundamental theory is not relativistic QFT.
Yes, sure, but that's not entirely just philosophy. I also don't think that relativistic QFT is "the final theory". It's not even mathematically completely consistent. I think Weinberg is right in writing that QFT is an effective theory in any case. Whether or not there is a more comprehensive theory, from which QFT can be deduced as an effective theory, I can't say either.

Looking at the history of physics as we understand it (i.e. physics beginning with, say, Kepler, Galilei, and Newton), I don't think that the fundamental issues can be solved by theory alone; we need more empirical input where Q(F)T unambiguously fails. E.g., to get an idea of how a consistent quantum theory of gravitation might look, we'd need some gravity-related phenomenon clearly contradicting classical-field-theory-like behavior. That's of course very difficult to achieve, because gravity is so weak compared to the other interactions that it only leads to clearly observable effects on macroscopic systems, which are almost always not sensitive to specific quantum effects.
 
  • Like
Likes Demystifier
  • #20
zonde said:
So you say that there is nothing to fix about general relativity and there is nothing to fix about QT, the only problem is that they do not stick together and if somebody could just fix this not-sticking-together problem. Is this right?

Of course philosophy can not give you new physics. I see your doubts are completely justified.
But you are missing the thing that philosophy can do. It can help to get rid of the junk clearing the space for new physics.
I don't believe it's that simple. Otherwise somebody would have done it. So far, all observations concerning gravity are in accord with GR, but that's likely to be the case, because all we can observe concerning gravity is about its action on macroscopic systems, and there the classical theory is very accurate (in close analogy to our everyday experience that classical electrodynamics/optics is very accurate although here we know QED as the underlying quantum theory). So it's very hard to find specific phenomena where and how GR (or maybe some other classical field theory describing gravitation better, although I don't know of any clear empirical hint that this might be the case) has to be joined with QT.

I don't think that there's anything wrong with QT in the realm where we really need it to describe the phenomena. It's not consistently describing the gravitational interactions, and that's imho the only clear physical hint of failure of QT or rather its incompleteness. What I don't consider real physics problems are these debates about its probabilistic nature and apparently "weird" quantum phenomena like entanglement. We are simply unused to them in our everyday experience, but as weird as they may seem to our (in quantum matters quite untrained) experience, QT has always been found to be correct:

E.g., where QT says we don't know more than the probabilistic content it provides, we have failed to find a deterministic theory describing the phenomenon: so far we don't have any idea how to precisely predict when a specific radioactive nucleus decays. We know what causes the decay (e.g., if it's a ##\beta## emitter, the weak interaction), but the best-working theory (quantum flavor dynamics in this case) cannot provide a precise prediction of the time at which the decay occurs. It only gives a "mean lifetime" of the nucleus, i.e., a probabilistic notion. So far we don't have any hint of whether there is some deterministic explanation for the precise time the decay occurs. As long as this is not the case, I don't see why there's a problem with this probabilistic nature of quantum theory.

The same holds for the so-called "measurement problem". I think there's no measurement problem, because QT accurately describes even the most precise measurements modern technology enables us to achieve, including 100% correlations of completely random properties as described by entanglement in the very highly significant sense of contemporary Bell experiments of various types. Although there seems to be not the slightest hint of a deterministic description. Whether there is none, of course, I cannot say.

Maybe one day somebody comes up with a very clever deterministic non-local theory describing all facts as well as QT does today in its probabilistic sense, but I doubt that we will find it by pondering purely philosophical quibbles about the so-called "measurement problem" or other "foundational problems" of a purely philosophical nature. I think, as with the question about a consistent QT of gravity, we'd need some empirical evidence clearly indicating that there's a real problem in describing an unambiguously observed phenomenon which contradicts QT.
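
To make "only a mean lifetime" concrete (the standard textbook relation for exponential decay, added here only for illustration): the theory fixes the probability distribution of decay times, not the decay time of any individual nucleus,
$$P_{\text{survive}}(t)=e^{-t/\tau},\qquad \langle t_{\text{decay}}\rangle=\int_0^\infty t\,\frac{e^{-t/\tau}}{\tau}\,\mathrm{d}t=\tau .$$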
 
  • #21
vanhees71 said:
So far we don't have any hint of whether there is some deterministic explanation for the precise time the decay occurs. As long as this is not the case, I don't see why there's a problem with this probabilistic nature of quantum theory. The same holds for the so-called "measurement problem". I think there's no measurement problem, because QT accurately describes even the most precise measurements

The measurement problem has nothing to do with the precision of measurements or with the need for classically deterministic predictions. It is a question of an apparent logical contradiction in the traditional axioms of quantum theory. In my opinion, the best statement of the problem is courtesy of David Wallace:

"We cannot consistently understand the state space of quantum theory either as a space of physical states, or as a space of probability distributions. Instead, we have to use one interpretation for microscopic physics and another for macroscopic physics. Furthermore, both the point at which we have to transition between the physical and probabilistic interpretation, and the basis with respect to which the probabilistic interpretation is to be specified, are defined only in an approximate, rough-and-ready way, which seems to make essential use of terms like “macroscopic” which have no place in a fundamental physical theory."

https://arxiv.org/abs/1111.2187
The "realist" interpretations - many worlds, hidden variables, and objective (GRW) collapse -try to find a way to consistently say quantum states are always physical.

The "nonrealist" interpretations - QBism and Copenhagen understood correctly - try to find a way to say quantum states are always probability distributions.
 
  • Like
Likes stevendaryl and Auto-Didact
  • #22
But the quote is obviously wrong, because we very well can use quantum theory to describe real-world experiments, and there's both the notion of the state, described by the statistical operator, with its deterministic (!) time evolution given the Hamiltonian of the system, and its probabilistic interpretation. The state is determined by a preparation procedure, and it implies that not all observables of the system take determined values, but that a measurement of these observables gives random results with probabilities given by the state. There's no contradiction in the sense of logic nor in the empirical evidence for this probabilistic interpretation of the formalism. That's why there is no physical problem with quantum mechanics in this sense.
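
Written out, the two ingredients referred to here are the standard textbook equations (included only for concreteness): the deterministic von Neumann evolution of the statistical operator and the Born rule for measurement outcomes,
$$i\hbar\,\partial_t\hat{\rho}(t)=[\hat{H},\hat{\rho}(t)],\qquad P(o)=\mathrm{Tr}\!\left[\hat{\rho}\,\hat{P}_o\right],$$
where ##\hat{P}_o## is the projector onto the eigenspace of the measured observable belonging to the eigenvalue ##o##.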

As I stressed, I don't believe that QT is "the final theory", but the problems are not the philosophical quibbles usually discussed in these debates about "interpretation"; they are real physics problems, such as the lack of understanding of how gravitation is described consistently with QT.
 
  • #23
vanhees71 said:
The state is determined by a preparation procedure, and it implies that not all observables of the system take determined values, but that a measurement of these observables gives random results with probabilities given by the state. There's no contradiction in the sense of logic nor in the empirical evidence for this probabilistic interpretation of the formalism

What is the nature of this indeterminacy you mention? Is it that the system is physically/ontologically smeared across its possible values, so the value exists in an unsharp way? Is it that the value is sharply defined in an eigenstate, and we just can't say which until measurement? Or is the value truly non-existent until it is measured?
 
  • #24
vanhees71 said:
What I don't consider real physics problems are these debates about its probabilistic nature and apparently "weird" quantum phenomena like entanglement. We are simply unused to them in our everyday experience, but as weird as they may seem to our (in quantum matters quite untrained) experience, QT has always been found to be correct:
This is actually an old and tired argument that uncritical teachers tell their freshmen students in order not to make them run away screaming. I remember making the exact same claim to someone else that you are making right here during my days as a naive undergraduate, i.e. before I was properly exposed to the history or philosophy of physics.
vanhees71 said:
Maybe one day somebody comes up with a very clever deterministic non-local theory describing all facts as well as QT does today in its probabilistic sense, but I doubt that we will find it by pondering purely philosophical quibbles about the so-called "measurement problem" or other "foundational problems" of a purely philosophical nature.
Conceptual matters are always solved by making a philosophical leap against "what is known to be true", i.e. by taking another conceptual stance against some dominant viewpoint and then making this new stance mathematically intricate. Turning a conceptual stance into a mathematically intricate picture is particular to physics, but changing conceptual stances is a purely philosophical endeavor which occurs across all forms of specialized and unspecialized human reasoning.

For example, Newton directly chose to contradict millennia of scientific teachings about physics not because of the results of experiments, but because he deeply believed everyone was wrong and he was right, and he could back up his own ideas by constructing a new mathematicized explanation of the world, superior to the old explanation; in order to carry this project out he had to invent a new form of mathematics to embody his own philosophical conceptualizations (NB: of course today we know he wasn't right, he was merely 'less wrong').

Similarly, Einstein chose to directly contradict the Newtonian conceptualization of the world with his own, basing it on new, unproven mathematics with which he had little experience and which had no known academic precedent nor application in science, let alone in fundamental physics. This of course while all scientists at the time "knew" that Newton was correct, in an essentially uncritical and almost gullible, uninquisitive way of blindly reasoning about what one already knows while being willfully ignorant of all new conceptualizations of the world; this cognitive bias is in no way particular to early 20th century physicists.

Today we are again in the situation that all scientists "know" that QT is correct, and they cook up all kinds of frankly irrelevant justifications (e.g. measurement-precision arguments or quoting the wide range of experiments performed so far). Yet they seem to fail to realize that the mathematics underlying this theory - this conceptualization - has serious (self-)consistency problems and is therefore problematic from a non-formalist pure-mathematics point of view; this has nothing to do with being constructable from axioms, or being definable using some semi-abstract mathematics.

The inconsistency of the theory instead follows from the conceptualization being poorly understood, leading to this conceptualization being equally poorly mathematicized with limitations built in, i.e. forcefully mathematicized within a very definite but approximative limited artificial mathematical framework. It is this limited approximate artificial mathematical framework which prevents QT from being capable of being unified with another conceptualization, GR, because GR is in contrast conceptually extremely coherent and therefore its underlying mathematics is exquisitely clean and completely natural.
vanhees71 said:
I think, as with the question about a consistent QT of gravity, we'd need some empirical evidence clearly indicating that there's a real problem in describing an unambiguously observed phenomenon which contradicts QT.
That would be nice, but it certainly isn't a necessity. Theory usually comes before the experiment, with a myriad of empirically indistinguishable explanations to choose from until some key experiments clearly favor one particular theory. At the moment we aren't yet at the stage where such experiments can be carried out, but most experts agree we might be within a decade or two of being capable of falsifying scientifically legitimate alternatives to QT which are indistinguishable within the current range of experimental validity.
 
  • Like
Likes nnunn
  • #25
vanhees71 said:
The state is determined by a preparation procedure
This only covers states of microscopic systems in the lab.

Which preparation procedure determines the quantum state of the solar system? How is the result of a measurement of some observables (say the mass of the Sun and the major planets) of this quantum state described from first principles (assuming Newtonian gravity, which is fully adequate for this situation)?

The solar system is not coupled to an external measurement device as in the usual analysis of measurements; the measurement is done from within. Without an explanation of how this works, even ordinary quantum mechanics is an incompletely understood (and indeed incomplete) theory.
 
  • Like
Likes Auto-Didact
  • #26
vanhees71 said:
I also don't think that relativistic QFT is "the final theory". It's not even mathematically completely consistent.
No inconsistency has been proved. It is simply unknown whether QED or the standard model are consistent or not. Settling this might have important consequences for the methods to squeeze out predictions from QFT, and hence is at least as important as finding a unified theory of the standard model plus gravity - which is unlikely to have any significant experimental consequences, as you say yourself:
vanhees71 said:
So far, all observations concerning gravity are in accord with GR, but that's likely to be the case, because all we can observe concerning gravity is about its action on macroscopic systems, and there the classical theory is very accurate (in close analogy to our everyday experience that classical electrodynamics/optics is very accurate although here we know QED as the underlying quantum theory). So it's very hard to find specific phenomena where and how GR (or maybe some other classical field theory describing gravitation better, although I don't know of any clear empirical hint that this might be the case) has to be joined with QT.
 
  • Like
Likes vanhees71
  • #27
vanhees71 said:
Yes, sure, but that's not entirely just philosophy.
I'm glad that you think so. But I would like to stress that I arrived to those ideas by starting from philosophical questions. The point is that philosophy may lead to something that is more than philosophy, so one should not ignore ideas which at first look as just philosophy.
 
  • Like
Likes nnunn and Auto-Didact
  • #28
charters said:
What is the nature of this indeterminacy you mention? Is it that the system is physically/ontologically smeared across its possible values, so the value exists in an unsharp way? Is it that the value is sharply defined in an eigenstate, and we just can't say which until measurement? Or is the value truly non-existent until it is measured?
What do you mean by this question? If somebody starts the question with "what's the nature/mechanism...", usually he or she has a conceptual misunderstanding of what the natural sciences are methodologically aiming at. First of all they are empirical sciences, i.e., a phenomenon of nature is investigated by making quantitative observations about it. If the phenomenon is reproducible and shows a clear regularity, one can compare it with the predictions of existing models/theories, as far as one is able to apply the formal, mathematical structure of the model/theory to it (and that's all you need to make a mathematical system a physical theory, and that's it concerning interpretation as far as natural science is concerned). If the observation then agrees, within the accuracy of the observation, you have found "an explanation" in the sense that you can understand the phenomenon in terms of the existing models/theories.

In some sense, that's the "boring" case, because then we haven't learned something new about nature. So an experiment and theoretical analysis becomes interesting if the observation deviates from what's expected from the existing theories. Then, usually, a careful re-examination of the experimental setup starts, and one might figure out errors of this setup, or one tries to find variations of the measurement to see whether everything is reproducible with other methods. If the deviation from theory withstands all this careful testing, then a new model/theory is needed, and then some new model/theory is developed, with the caveat that the new theory has to work in all the cases the old theory worked before. If this is established, usually one can understand the success of the old theory as well as its failure in "explaining" the new phenomenon, in the sense that the old theory can be understood as some approximation of the new theory, valid only in a certain realm and not applicable where the old theory failed.

This is of course a quite complicated mutual process, i.e., theories do not only "explain", in the above summarized sense, a phenomenon or have to be modified due to a newly discovered phenomenon where they fail, but they also provide ideas for more experiments. E.g., quantum theory all of a sudden made it interesting to investigate how electrons behave when shot at a double slit. Before, when everybody thought electrons are just little classical particles, this experiment may never have been realized, because nobody would have thought that it might lead to interesting new phenomena. With de Broglie's hypothesis of "wave-particle duality" this became interesting, and indeed it's among the most interesting early experiments leading to the development of modern quantum mechanics, which finally made the quite inconsistent idea of "wave-particle duality" obsolete and led to the probabilistic standard interpretation.

Like any mathematical model, physical theories start with some postulates, which cannot be reduced further to "simpler" postulates. Here, of course, simplicity is a subjective idea, but what theoretical physicists aim at is to find a minimal set of fundamental postulates for a theory, from which as many phenomena as possible can be "explained" in the above sense. The "fundamental postulates" themselves cannot be "explained" in this sense, but are just compact summaries of the results of sometimes decade-long efforts of experimentalists and theorists.
 
  • #29
vanhees71 said:
I don't think that there's anything wrong with QT in the realm where we really need it to describe the phenomena. It's not consistently describing the gravitational interactions, and that's imho the only clear physical hint of failure of QT or rather its incompleteness.
I'm not sure I understand your point here. There are no gravitational interactions according to GR. Inertia just bends spacetime. GR describes that. Its predictions agree with observations. What else do you want? Any aesthetic considerations should not be relevant to physics, right?
 
  • #30
A. Neumaier said:
No inconsistency has been proved. It is simply unknown whether QED or the standard model are consistent or not. Settling this might have important consequences for the methods to squeeze out predictions from QFT, and hence is at least as important as finding a unified theory of the standard model plus gravity - which is unlikely to have any significant experimental consequences, as you say yourself:
Well, I have some hope that solving the problem of a QT description of gravity might also solve the mathematical problems of QFT as a whole. E.g., for some decades the (failed) hope was that string theory and its various relatives might solve both the problem of quantum gravity and provide a mathematically consistent description of QFT, with the standard model as an approximation in the sense of an "effective theory"; and as such it works pretty well, one could even say too well, because no clear contradicting observation has been made yet, despite its mathematical inconsistency.
 
  • #31
A. Neumaier said:
This only covers states of microscopic systems in the lab.

Which preparation procedure determines the quantum state of the solar system? How is the result of a measurement of some observables (say the mass of the Sun and the major planets) of this quantum state described from first principles (assuming Newtonian gravity, which is fully adequate for this situation)?

The solar system is not coupled to an external measurement device as in the usual analysis of measurements; the measurement is done from within. Without an explanation of how this works, even ordinary quantum mechanics is an incompletely understood (and indeed incomplete) theory.
The preparation procedure of the solar system is not too well known, but the common idea is that stars and planets etc. around them form out of clouds of some material which is denser at some location than on average, and then gravity does its job. All this is of course well described by classical (even Newtonian) physics. A complete microscopic description is neither possible nor necessary.

A simpler example is the coffee in the cup on my desk, which I prepared just some minutes ago. It's pretty well described as a system in local thermal equilibrium, slowly equilibrating further to finally reach the temperature of my office, which is also pretty well described as being in local thermal equilibrium, providing a "heat bath" within which the coffee sits and exchanges energy and water molecules; i.e., it cries out to be described as a grand canonical ensemble close to thermal equilibrium, and thus with classical physics like (viscous) hydrodynamics with heat conduction etc.
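
(For reference, the grand canonical statistical operator alluded to here is the standard textbook one, written for local equilibrium with slowly varying temperature ##T## and chemical potential ##\mu##, with ##k_B=1##:
$$\hat{\rho}=\frac{1}{Z}\exp\!\left[-\frac{\hat{H}-\mu\hat{N}}{T}\right],\qquad Z=\mathrm{Tr}\,\exp\!\left[-\frac{\hat{H}-\mu\hat{N}}{T}\right].)$$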

So preparation procedures need not be very "artificial" as, e.g., at a collider like the LHC, where proton bunches are accelerated and kept at high precision at a certain energy to get something accurately prepared to make experiments, where the much more microscopic detail is resolved. Already substituting the protons by Pb nuclei and performing heavy-ion collisions changes this completely, and again there's no better chance to understand what's going on than to use (semi-)classical methods like kinetic theory, relativistic viscous hydro, Langevin processes, etc. to describe the system. Even in p Pb and pp collisions one observes quite some "collectivity", at least in "high-multiplicity events".

It's a gift and a curse of nature at once that macroscopic, or even "mesoscopic", systems of some 1000s of particles tend to behave according to classical or semi-classical models. It's a gift, because we have the chance to understand more by describing these systems approximately with simpler models, but also a curse, because we cannot so easily observe the (maybe) interesting quantum phenomena we are after.
 
  • Like
Likes DanielMB
  • #32
vanhees71 said:
The preparation procedure of the solar system is not too well known, but the common idea is that stars, and the planets etc. around them, form out of clouds of material that is denser in some places than on average, and then gravity does its job. All this is of course well described by classical (even Newtonian) physics. A complete microscopic description is neither possible nor necessary.
But to say that a complete microscopic description is not possible is to say that quantum theory does not apply to the solar system as a whole. If quantum theory is universally valid, a complete microscopic quantum description should therefore be possible in principle, even though we may never know the exact details. And such a microscopic quantum description would represent a single system only, which cannot be interpreted by a purely statistical interpretation. This is the reason why interpretation questions are still open, and why they may shed light on what is missing for a fundamental description of all of Nature.
 
  • Like
Likes Auto-Didact
  • #33
vanhees71 said:
I have some hope that solving the problem of a QT description of gravity might also solve the mathematical problems of QFT as a whole.
I have the opposite hope that solving the mathematical problems of QFT might also solve the problem of a QT description of gravity. Mathematical problems always point to lack of theoretical understanding, and progress comes through fixing these conceptual issues.
 
  • Like
Likes eloheim and Auto-Didact
  • #34
A. Neumaier said:
But to say that a complete microscopic description is not possible is to say that quantum theory does not apply to the solar system as a whole. If quantum theory is universally valid, a complete microscopic quantum description should therefore be possible in principle, even though we may never know the exact details. And such a microscopic quantum description would represent a single system only, which cannot be interpreted by a purely statistical interpretation. This is the reason why interpretation questions are still open, and why they may shed light on what is missing for a fundamental description of all of Nature.
Well, why do you say it's not possible in principle? The single system then is described in terms of probabilities. The more I think about it, the less I see anything problematic in this. Even on the classical level, the many-body system is described in terms of statistical physics and thus with probabilities, including fluctuations and all that, and that classical description can be derived, at least to some extent, from quantum many-body theory.
 
  • #35
Demystifier said:
I'm glad that you think so. But I would like to stress that I arrived at those ideas by starting from philosophical questions. The point is that philosophy may lead to something that is more than philosophy, so one should not ignore ideas which at first look like just philosophy.
Well, yes. Theory building is an art, and you can be inspired by many non-scientific things ;-)).
 
  • Like
Likes Demystifier
  • #36
vanhees71 said:
The single system then is described in terms of probabilities.
What do probabilities mean for a single system? How does this show up in the postulates of quantum mechanics?
 
  • #37
zonde said:
I'm not sure I understand your point here. There are no gravitational interactions according to GR. Inertia just bends spacetime. GR describes that. Its predictions agree with observations. What else do you want? Any aesthetic considerations should not be relevant to physics, right?
Well, GR has its own failure built in, namely the unavoidable singularities of all physically relevant solutions. When it comes to the singularities of cosmology (the "big bang") and of black holes (the Schwarzschild and Kerr solutions for very compact objects), the physical laws of GR break down, and it's expected that an appropriate quantum treatment "cures" these deficiencies of the classical theory.

Further, there's nothing in GR that forbids thinking of gravity as an interaction. The geometrization can even be derived from this ansatz (e.g., due to Weinberg; see also Feynman's book, "The Feynman Lectures on Gravitation", which is a brilliant textbook on the subject).
 
  • #38
A. Neumaier said:
What do probabilities mean for a single system? How does this show up in the postulates of quantum mechanics?
I have answered this question a zillion times. We don't need to go into this issue again (I know, I shouldn't have answered in this thread...).
 
  • #39
I'm reading Smolin's Einstein's Unfinished Revolution and there is something that perplexed me. Quoting Smolin:

At the same time, there are several reasons pilot wave theory is not entirely convincing as a true theory of nature. One is the empty ghost branches, which are parts of the wave function which have flowed far (in the configuration space) from where the particle is and so likely will never again play a role in guiding the particle. These proliferate as a consequence of Rule 1, but play no role in explaining anything we’ve actually observed in nature. Because the wave function never collapses, we are stuck with a world full of ghost branches. There is one distinguished branch, which is the one guiding the particle, which we may call the occupied branch. Nonetheless, the unoccupied ghost branches are also real. The wave function of which they are branches is a beable.

The ghost branches of pilot wave theory are the same as the branches in the Many Worlds Interpretation. In both cases they are a consequence of having only Rule 1. Unlike the Many Worlds Interpretation, pilot wave theory requires no exotic ontology in terms of many universes, or a splitting of observers, because there is always a single occupied branch where the particle resides. So there is no problem of principle, nor is there a problem of defining what we mean by probabilities. But if one finds it inelegant to have every possible history of the world represented as an actuality, that sin is common to Many Worlds and pilot wave theory.

(Note: Rule 1 is simply "Given the quantum state of an isolated system at one time, there is a law that will predict the precise quantum state of that system at any other time." Smolin calls this law Rule 1. "It is also sometimes called the Schrödinger equation. The principle that there is such a law is called unitarity.")
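
For reference (my own gloss, not Smolin's wording): in standard notation Rule 1 is just the Schrödinger equation together with the unitary time evolution it generates,
$$i\hbar \frac{\mathrm{d}}{\mathrm{d}t} |\psi(t)\rangle = \hat{H}\, |\psi(t)\rangle, \qquad |\psi(t)\rangle = \hat{U}(t,t_0)\, |\psi(t_0)\rangle, \qquad \hat{U}^{\dagger} \hat{U} = \hat{1},$$
so the state at any time is fixed deterministically by the state at any other time.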

Why didn't Demystifier worry about the ghost branches? Are there many kinds of Bohmians with regard to how they treat the wave function? How do Demystifier and other Bohmians treat it compared to Smolin?
 
  • #40
vanhees71 said:
But the quote is obviously wrong, because we can very well use quantum theory to describe real-world experiments, and there's the notion of the state, described by the statistical operator, with its deterministic (!) time evolution given the Hamiltonian of the system, and its probabilistic interpretation.

I think that what David Wallace was saying is obviously right. He didn't say that we can't use quantum theory. He was saying that in practice we treat macroscopic quantities differently from microscopic ones in an ad hoc way. That's true. It doesn't make quantum theory useless, but it makes it "softly inconsistent", to use my own phrase.

The state is determined by a preparation procedure, and it implies that not all observables of the system take determined values; rather, a measurement of these observables gives random results with probabilities given by the state. There's no contradiction in the sense of logic.

I think it is a contradiction. To me, the following two claims are just logically inconsistent (together with the rest of the quantum formalism):
  1. A measurement always produces an eigenvalue of the quantity being measured.
  2. Measurement devices and observers are themselves described by quantum mechanics.
 
  • Like
Likes Auto-Didact
  • #41
vanhees71 said:
Well, GR has built in its own failure, namely the unavoidable singularities of all physically relevant solutions. When it comes to the singularities of cosmology ("big bang") and black holes ("Schwarzschild, Kerr" of very compact objects), the physical laws of GR break down, and it's expected that an appropriate quantum treatment "cures" these deficiencies of the classical theory.
These hypothetical singularities cannot be observed. Why should we care about them?

vanhees71 said:
Further, there's nothing in GR that forbids to think about gravity as an interaction. The geometrization even can be derived from this ansatz (e.g., due to Weinberg; see also Feynman's book, "The Feynman lectures on gravitation", which is a brillant textbook on the subject).
Of course, just because there is one valid theory does not mean there can't be another theory describing the same phenomena. But we have a theory that correctly predicts observable phenomena. So why bother?
 
  • #42
vanhees71 said:
What do you mean by this question? If somebody starts the question with "what's the nature/mechanism...", usually he or she has a conceptual misunderstanding of what the natural sciences are methodologically aiming at.

I am asking if you think the wavefunction of, say, an electron describes 1) a new sort of spatially extended object, 2) a 0D classical point whose position is simply unknown, or 3) nothing but the probability of a classical detector click. I gather from your other comments that your view is 3), in which case the measurement problem is akin to wondering: where did all these classical detectors come from in the first place? Why do we say they are made of electrons, if electrons on the quantum scale have no purchase as objectively existing objects?

My sense is your answer is going to be "who cares, the theory works." That's fine, but don't confuse not being interested in a problem with whether others are wrong to identify the problem as legitimate. Math works very well in practice, but Gödel's incompleteness theorem is still an issue to contend with. The measurement problem is similar in form.
 
  • Like
Likes Auto-Didact
  • #43
stevendaryl said:
I think that what David Wallace was saying is obviously right. He didn't say that we can't use quantum theory. He was saying that in practice we treat macroscopic quantities differently from microscopic ones in an ad hoc way. That's true. It doesn't make quantum theory useless, but it makes it "softly inconsistent", to use my own phrase.
I think it is a contradiction. To me, the following two claims are just logically inconsistent (together with the rest of the quantum formalism):
  1. A measurement always produces an eigenvalue of the quantity being measured.
  2. Measurement devices and observers are themselves described by quantum mechanics.
Why is there a contradiction? Is there any empirical evidence that 1. is wrong? If that were the case, there'd be a big crisis of QT as a whole, and every theorist would struggle to find an alternative theory ;-).

Concerning 2., it's to some degree a matter of taste whether you accept the standard quantum-statistical arguments as a "description" of the measurement devices or not. So again, about this you can fight eternally without ever coming to a conclusion.

That there should be any necessity to describe us as quantum systems too in order to solve the "measurement problem" is somewhat exotic to me, because there's really no direct interaction between us and the measured system (except in the case that you consider our own senses as measurement devices for quantum systems, like the very interesting possibility of using our eyes directly as single-photon detectors, which seems to be possible in principle according to new studies on the subject).
 
  • #44
zonde said:
These hypothetical singularities cannot be observed. Why should we care about them? Of course, just because there is one valid theory does not mean there can't be another theory describing the same phenomena. But we have a theory that correctly predicts observable phenomena. So why bother?
What other theory are you talking about? GR is GR, no matter whether you describe gravity as an interaction or insist on the quite common interpretation that it's entirely a kinematical effect of curved spacetime. For me the interpretation of gravity as an interaction like all the other fundamental interactions (i.e., the electroweak and strong interactions) has some attraction, because it simplifies and unifies the picture, but that's again just a matter of personal taste, of little importance in the sense of science.
 
  • #45
charters said:
I am asking if you think the wavefunction of, say, an electron describes 1) a new sort of spatially extended object, 2) a 0D classical point whose position is simply unknown, or 3) nothing but the probability of a classical detector click. I gather from your other comments that your view is 3), in which case the measurement problem is akin to wondering: where did all these classical detectors come from in the first place? Why do we say they are made of electrons, if electrons on the quantum scale have no purchase as objectively existing objects?

My sense is your answer is going to be "who cares, the theory works." That's fine, but don't confuse not being interested in a problem with whether others are wrong to identify the problem as legitimate. Math works very well in practice, but Gödel's incompleteness theorem is still an issue to contend with. The measurement problem is similar in form.
The wave function describes probabilities for measurement results, no more, no less. It's wrong to say an electron is the wave function (all the more so because at the most fundamental level wave functions do not make much sense but are themselves quantized). So indeed I think 3) describes my point of view best.
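
In the standard textbook form, e.g. for a position measurement: for a (normalized) wave function ##\psi## the Born rule gives the probability to find the particle in a region ##\Delta## as
$$P(\Delta) = \int_{\Delta} \mathrm{d}^3 x\, |\psi(\vec{x},t)|^2,$$
and nothing beyond such probabilities is attributed to ##\psi## on this view.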

The classical detectors come from the physicists' curiosity to learn more about nature. That's why they build, with some effort, ever better ones (and that's often pretty expensive, and we can be lucky to get them financed by taxpayers' money).

That matter around us, and thus also measurement devices, is made of atomic nuclei and electrons is the conclusion from the fact that we understand their properties very well as many-body systems with atomic nuclei and electrons as the relevant degrees of freedom. That's also known as condensed-matter physics, a very successful application of quantum (field) theory. It's so successful that we have all the funny gadgets like the laptop I'm writing this text on, and we can also construct ever better measurement devices for all kinds of measurements on quantum systems, down to the most fundamental building blocks as far as we know them, perhaps one day helping us to find even new ones.
 
  • #46
vanhees71 said:
The wave function describes probabilities for measurement results, no more, no less. It's wrong to say an electron is the wave function (all the more so because at the most fundamental level wave functions do not make much sense but are themselves quantized). So indeed I think 3) describes my point of view best.
vanhees71 said:
That matter around us, and thus also measurement devices, is made of atomic nuclei and electrons is the conclusion from the fact that we understand their properties very well as many-body systems with atomic nuclei and electrons as the relevant degrees of freedom.

So you claim free electrons do not exist: neither as an extended object nor as a classical point, as in options 1 and 2 in post #42. But you also claim electrons suddenly do start to exist when composing macroscopic, many-body systems.
 
  • #47
vanhees71 said:
Why is there a contradiction? Is there any empirical evidence that 1. is wrong? If that were the case, there'd be a big crisis of QT as a whole, and every theorist would struggle to find an alternative theory ;-).

For a while, maybe. But if no satisfactory alternative theory were found, many of them would shift to pretending that the evidence doesn't REALLY show what it seems to show. In other words, many theorists would just live in denial about it. And that's exactly what we do have.

Concerning 2., it's to some degree a matter of taste whether you accept the standard quantum-statistical arguments as a "description" of the measurement devices or not. So again, about this you can fight eternally without ever coming to a conclusion.

The reason there is eternal fighting about it is that 1 & 2 are contradictory, and there is no agreement about how to fix it. On the other hand, there is a "rule of thumb" that lets us get past the contradiction: we have heuristics for when to treat a system as a measurement device obeying 1, and when to treat it as a quantum mechanical system. You can't consistently do both.

Let me sketch a scenario showing that rule 1 is contradictory with rule 2.

Suppose that you have a Stern-Gerlach type situation in which an electron that is spin-up in the z-direction is deflected left, and makes a black spot on the left side of a photographic plate. An electron that is spin-down makes a black spot on the right side.

Now suppose that we set up initial conditions that are precisely left-right symmetric. We send an electron through the Stern-Gerlach device that is spin-up in the x-direction.

According to rule 1, eventually the system evolves into a final state that is not left-right symmetric. Either the electron goes left and makes a spot on the left, or goes right and makes a spot on the right. So the final state does not satisfy the left-right symmetry of the initial state.

That is not possible if everything obeys the laws of quantum mechanics. If the initial state is left-right symmetric, and the Hamiltonian is similarly left-right symmetric, then the final state will be left-right symmetric.
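
To spell the step out schematically for an idealized two-outcome apparatus: by rule 2 the whole measurement is one left-right symmetric unitary evolution ##\hat{U}##, with ##\hat{U}\bigl(|\!\uparrow_z\rangle \otimes |\text{ready}\rangle\bigr) = |\text{spot left}\rangle## and ##\hat{U}\bigl(|\!\downarrow_z\rangle \otimes |\text{ready}\rangle\bigr) = |\text{spot right}\rangle##. Linearity then forces
$$\hat{U}\left(\frac{|\!\uparrow_z\rangle + |\!\downarrow_z\rangle}{\sqrt{2}} \otimes |\text{ready}\rangle\right) = \frac{|\text{spot left}\rangle + |\text{spot right}\rangle}{\sqrt{2}},$$
a left-right symmetric superposition rather than the single definite spot demanded by rule 1.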
 
  • Like
Likes eloheim, Auto-Didact, DanielMB and 2 others
  • #48
vanhees71 said:
I think, as with the question about a consistent QT of gravity, we'd need some empirical evidence clearly indicating that there's a real problem in describing an unanimously observed phenomenon which contradicts QT.
Your quote was specifically about a QT of gravity (I think), but I'd like to say that in my opinion the best thing that could happen to quantum physics overall would be empirical evidence that contradicts QT. If that happened, things might start to go in new, interesting directions. :smile:
 
  • Like
Likes eloheim
  • #49
charters said:
So you claim free electrons do not exist: neither as an extended object nor as a classical point, as in options 1 and 2 in post #42. But you also claim electrons suddenly do start to exist when composing macroscopic, many-body systems.
I do not claim that free electrons do not exist. Where do you get this from? Of course free electrons exist. They are, of course, well described neither as classical point particles (there's no consistent description of a classical point particle anyway) nor as classical extended objects. That's why we use Q(F)T to describe them. According to the standard model they are described as charged leptons (spin-1/2 Dirac fermions), thus carrying electric and weak-isospin charges but no color charge.

Measurement devices are composed of atomic nuclei (protons and neutrons, which themselves are bound states of quarks and gluons) and electrons. That's what you asked about, not the existence or non-existence of free electrons. Why should my statement above imply such obvious nonsense as the non-existence of free electrons?
 
  • #50
stevendaryl said:
For a while, maybe. But if no satisfactory alternative theory were found, many of them would shift to pretending that the evidence doesn't REALLY show what it seems to show. In other words, many theorists would just live in denial about it. And that's exactly what we do have.
The reason there is eternal fighting about it is that 1 & 2 are contradictory, and there is no agreement about how to fix it. On the other hand, there is a "rule of thumb" that lets us get past the contradiction: we have heuristics for when to treat a system as a measurement device obeying 1, and when to treat it as a quantum mechanical system. You can't consistently do both.

Let me sketch a scenario showing that rule 1 is contradictory with rule 2.

Suppose that you have a Stern-Gerlach type situation in which an electron that is spin-up in the z-direction is deflected left, and makes a black spot on the left side of a photographic plate. An electron that is spin-down makes a black spot on the right side.

Now suppose that we set up initial conditions that are precisely left-right symmetric. We send an electron through the Stern-Gerlach device that is spin-up in the x-direction.

According to rule 1, eventually the system evolves into a final state that is not left-right symmetric. Either the electron goes left and makes a spot on the left, or goes right and makes a spot on the right. So the final state does not satisfy the left-right symmetry of the initial state.

That is not possible if everything obeys the laws of quantum mechanics. If the initial state is left-right symmetric, and the Hamiltonian is similarly left-right symmetric, then the final state will be left-right symmetric.
Obviously I'm too stupid to understand why there's a contradiction between 1 and 2. In particular, your example of an apparent paradox is completely incomprehensible to me, because it contradicts the standard interpretation of QT. Of course, as Bohr already analyzed, you cannot perform the SG experiment with free electrons in practice. So let me put an Ag atom instead of an electron (because that is the atom Stern and Gerlach used at the time).

If you send an Ag atom with an indeterminate spin-z component through a Stern-Gerlach apparatus, it is deflected, with the probabilities given by the corresponding (pure or mixed) state, either to the left or to the right. That's the reason why Born introduced the probability interpretation of the quantum state in the first place: you cannot split a single Ag atom into pieces, and you never find a classical-field-like entity smeared out according to the wave-function squared; you always find a single spot on the screen after the atom has run through the SG magnet. It ends up either "to the left" or "to the right", and thus, through the entanglement between position and spin-z component created by running through the magnet, we conclude that the spin component is either up or down, depending on where the atom landed. You can even prepare pure spin-z eigenstates by just keeping the corresponding partial beam (filtering away all atoms running in the other region of space), i.e., here you have a paradigmatic example of a von Neumann filter measurement. Of course, for the single atom there's no way to predict which value will be found; you can only give the probabilities determined by the state the Ag atom is prepared in.

In the case of the original experiment it was a beam of Ag atoms from an oven running through a little aperture, so the state is roughly given by the mixed state (written as a product of the spatial/momentum part and the spin part)
$$\hat{\rho} \propto \int_{\text{aperture}} \mathrm{d}^3 p \exp[-\vec{p}^2/(2m k T)] |\vec{p} \rangle \langle \vec{p}| \otimes \hat{1}_{\text{spin}}/2.$$
For all the Ag atoms, the probability for the spin-z component to come out up or down is 1/2 each, i.e., the symmetric situation you assume. Of course, each single atom will go in one or the other direction, and for the single atom the situation is not symmetric. Only the probability distribution is symmetric, and that's what's predicted by QM. This can even be calculated in good approximation analytically (I've still not found the time to write this up completely, but it's really not too complicated).
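
To make this explicit from the state above (taking ##\hat{\rho}## normalized), the Born rule gives for the two deflections
$$P(\text{left}) = \mathrm{Tr}\!\left[\hat{\rho}\,\bigl(\hat{1}_{\text{space}} \otimes |\!\uparrow_z\rangle\langle\uparrow_z|\bigr)\right] = \frac{1}{2}, \qquad P(\text{right}) = \mathrm{Tr}\!\left[\hat{\rho}\,\bigl(\hat{1}_{\text{space}} \otimes |\!\downarrow_z\rangle\langle\downarrow_z|\bigr)\right] = \frac{1}{2},$$
because the spin part of ##\hat{\rho}## is just ##\hat{1}_{\text{spin}}/2##: the predicted distribution over the two partial beams is exactly left-right symmetric, even though each individual atom ends up in only one of them.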

The final state of the Ag atom is again a quantum state and only describes probabilities, and this distribution obeys the left-right symmetry you rightly assume.

In experimental terms: what's symmetric is the distribution of many Ag atoms, all prepared in the same initial state, running through the setup. Each single atom of course "breaks" the symmetry, landing either "left" or "right" of the symmetry plane.

It's the same with any random experiment. Although a perfectly fair die is symmetric, any single throw shows one of the six faces and thus breaks the cubic symmetry. Only "statistically", i.e., averaged over many outcomes, is the situation symmetric: the probability for each outcome is the same, 1/6.
 
  • Like
Likes DanielMB