The Fundamental Difference in Interpretations of Quantum Mechanics - Comments

In summary, the conversation discusses the fundamental difference between interpretations of quantum mechanics, specifically with regard to the concept of "physically real." The two viewpoints presented are that the quantum state (represented by the wave function in the math) is either physically real, or is not real but simply a tool for making predictions. The conversation also touches on classical mechanics, the difficulty of defining "physically real," and the question of whether there is an actual wave in quantum mechanics under the different interpretations.
  • #71
For #1, the obviously true part is that we can never directly observe the state, and we can never make deterministic predictions about the results of quantum experiments. That makes it seem obvious that the state can’t be the physically real state of the system; if it were, we ought to be able to pin it down and not have to settle for merely probabilistic descriptions. But if we take that idea to its logical conclusion, it implies that QM must be an incomplete theory; there ought to be some more complete description of the system that fills in the gaps and allows us to do better than merely probabilistic predictions. And yet nobody has ever found such a more complete description, and all indications from experiments (at least so far) are that no such description exists; the probabilistic predictions that QM gives us really are the best we can do.

I'm not quite clear on why "we can never make deterministic predictions about the results of quantum experiments" implies non-reality, as opposed, for example, to a system that is chaotic (in the sense of having dynamics that are highly sensitive to slight changes in initial conditions), with sensitivity to differences in initial conditions that aren't merely hard to measure but are inherently impossible to measure, because measurement is theoretically incapable of determining both location and momentum at the scale relevant to the future dynamics of a particle.

Now, I'm not saying that there aren't other aspects of QM that make a chaotic-system-with-real-particles interpretation problematic - I'm thinking of experiments that appear to localize different properties of the same particle in different physical locations, or of tricks like off-shell virtual particles and quantum tunneling. But chaotic yet deterministic systems can look so much like truly random systems phenomenologically, for lots of purposes (which is why people invented tools like dice, lottery ball spinners, slot machines, card decks, and roulette wheels), that you can have deterministic and stochastic conceptions of QM that are indistinguishable experimentally, at least for many purposes, but which have profoundly different theoretical implications. Then again, maybe I'm wrong and there are easy ways to distinguish between the two scenarios.
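Just to illustrate how deterministic dynamics can masquerade as randomness (a toy numerical sketch, nothing more): the logistic map below is completely deterministic, yet two starting points differing by one part in a billion decorrelate within a few dozen steps, and the output looks statistically random.

Python:
# Toy illustration: a deterministic map whose output looks random and whose
# trajectories diverge under a tiny change in initial conditions.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9   # two initial conditions differing by one part in a billion
for step in range(60):
    x, y = logistic(x), logistic(y)
    if step % 10 == 9:
        print(f"step {step + 1:2d}: x={x:.6f}  y={y:.6f}  |x-y|={abs(x - y):.2e}")
# After a few dozen iterations the trajectories are completely decorrelated,
# even though the rule generating them is exactly deterministic.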

Also, I do hear you when you say that the question is whether the "state" is real, and not just whether particles are real, and that the "state" is a much more ephemeral, ghost-like thing than a particle.

I am also unclear with regard to whether the "reality" that you are discussing is the same as the "reality" people are talking about in QM when they state that given quantum entanglement, you can have locality, reality, or causality, but you can't simultaneously have all three, or whether you are talking about something different. To be clear, I'm not asking the more ambitious question of what "reality" means, only the less ambitious question of whether one kind of reality that is hard to define non-mathematically is the same as another kind of reality that is also hard to define non-mathematically. It could be that "reality" is instead two different concepts that happen to share the same name, both of which are hard to define non-mathematically, in which case the term is a "false friend," as they say in foreign language classes.
 
  • Like
Likes Boing3000 and Auto-Didact
  • #72
ohwilleke said:
I'm not quite clear on why "we can never make deterministic predictions about the results of quantum experiments" implies non-reality, as opposed, for example, to a system that is chaotic (in the sense of having dynamics that are highly sensitive to slight changes in initial conditions)

I didn't want to overcomplicate the article, but this is a fair point: there should really be an additional qualifier that the dynamics of QM, the rules for how the quantum state changes with time, are linear, so there is no chaos--i.e., there is no sensitive dependence on initial conditions. You need nonlinear dynamics for that. So whatever is keeping us from making deterministic predictions about the results of quantum experiments, it isn't chaos due to nonlinear dynamics of the quantum state.
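Roughly, the point is this (stated schematically, for a time-independent Hamiltonian): if ##\psi_1## and ##\psi_2## both solve the Schrödinger equation then so does any superposition, and unitary time evolution preserves the distance between any two states, so nearby initial states can never diverge exponentially the way they do in a chaotic system:
$$i\hbar\,\frac{\partial}{\partial t}\bigl(a\psi_1 + b\psi_2\bigr) = \hat H\bigl(a\psi_1 + b\psi_2\bigr), \qquad \bigl\| e^{-i\hat H t/\hbar}\psi_1 - e^{-i\hat H t/\hbar}\psi_2 \bigr\| = \|\psi_1 - \psi_2\|.$$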

ohwilleke said:
I am also unclear with regard to whether the "reality" that you are discussing is the same as the "reality" people are talking about in QM when they state that given quantum entanglement, you can have locality, reality, or causality, but you can't simultaneously have all three, or whether you are talking about something different.

It's basically the same, at least to the extent that the term "reality" has a reasonably well-defined meaning at all in those discussions. The discussions about entanglement that you refer to are more relevant to my case #2: they are discussions of problems you get into if you try to interpret the quantum state in the model as modeling the physically real state of a quantum system. For case #1, entanglement and all of the phenomena connected to it are not an issue, because you don't have to believe that anything "real" happens to the system when your knowledge about it changes.
 
  • Like
Likes ohwilleke
  • #73
PeterDonis said:
I didn't want to overcomplicate the article, but this is a fair point: there should really be an additional qualifier that the dynamics of QM, the rules for how the quantum state changes with time, are linear, so there is no chaos--i.e., there is no sensitive dependence on initial conditions. You need nonlinear dynamics for that. So whatever is keeping us from making deterministic predictions about the results of quantum experiments, it isn't chaos due to nonlinear dynamics of the quantum state.

That is really helpful. I've never heard anyone say that quite that clearly before.

It's basically the same, at least to the extent that the term "reality" has a reasonably well-defined meaning at all in those discussions. The discussions about entanglement that you refer to are more relevant to my case #2: they are discussions of problems you get into if you try to interpret the quantum state in the model as modeling the physically real state of a quantum system. For case #1, entanglement and all of the phenomena connected to it are not an issue, because you don't have to believe that anything "real" happens to the system when your knowledge about it changes.

Thanks again. That makes sense.
 
  • #74
stevendaryl said:
When you're talking about huge numbers, there is the possibility of what I would call a "soft contradiction", which is something that may be false, but you're not likely to ever face the consequences of its falseness. An example from classical thermodynamics might be "Entropy always increases". You're never going to see a macroscopic violation of that claim, but our understanding of statistical mechanics tells us that it can't be literally true; there is a nonzero probability of a macroscopic system making a transition to a lower-entropy state.

I like that terminology. I'll have to file it away for future use.
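For a rough sense of just how "soft" that contradiction is, a back-of-the-envelope estimate (mine, using the Einstein fluctuation formula, so take it only as an order of magnitude): the probability of a spontaneous fluctuation that lowers the entropy by ##\Delta S## scales as
$$P \sim e^{\Delta S/k_B}, \qquad \Delta S = -1\ \mathrm{J/K} \;\Rightarrow\; P \sim e^{-7\times 10^{22}},$$
which is nonzero in principle but zero for every practical purpose.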
 
  • #75
PeterDonis said:
I thought cases 1 and 2 in the article already described that, but I'll give it another shot.

Case 1 says the state is not real; it's just a description of our knowledge of the system, in the same sense that, for example, saying that a coin has a 50-50 chance of coming up heads or tails describes our knowledge of the system--the coin itself isn't a 50-50 mixture of anything, nor is what happens when we flip it, it's just that we don't know--we can't predict--how it is going to land, we can only describe probabilities.

Case 2 says the state is real, in the same sense that, for example, a 3-vector describing the position of an object in Newtonian physics is real: it describes the actual position of the actual object.

So, to give a concrete example. Suppose that we have a top quark and an ultra sensitive gravitational force sensor.

In Case 1, the gravitational force sensor is always going to report a gravitational force consistent with a point particle at all times.

In Case 2, the gravitational force sensor is going to report a gravitational force consistent with having the mass-energy of the top quark smeared predominantly within a volume of space that is not point-like, because the top quark is literally present at all of the places it could be when measured to a greater or lesser degree, simultaneously.

Or have I misunderstood something?
 
  • #76
ohwilleke said:
Suppose that we have a top quark and an ultra sensitive gravitational force sensor.

We don't have a good theory of quantum gravity, so this is not a good example to use, since we have no actual theoretical model on which to base predictions.

Also, if you find yourself thinking that case 1 and case 2 make different predictions, you are doing something wrong. The whole point of different interpretations of QM is that they all use the same underlying mathematical model to make predictions, so they all make the same predictions. If you have something that makes different predictions, it's not an interpretation of QM, it's a different theory.
 
  • #77
vanhees71 said:
Again, a theory book or scientific paper does not have the purpose of telling how something is measured. That's done in experimental-physics textbooks and scientific papers! Of course, if you only read theoretical-physics and math literature you can come to the deformed view of physics that everything should be mathematically defined, but physics is not mathematics. It just uses mathematics as a language to express its findings using real-world devices (from our senses to the most complicated inventions of engineering, like the detectors at the LHC).
If the experimental-physicist tells the theoretical-physicist /how/ to measure something, the latter tells the former /what/ he is actually measuring. :-)
Example: before 1905, experimental physicists performing photoelectric-effect experiments were measuring "light"; after, with A. Einstein, they were measuring "photons".

--
lightarrow
 
  • #78
PeterDonis said:
Also, if you find yourself thinking that case 1 and case 2 make different predictions, you are doing something wrong. The whole point of different interpretations of QM is that they all use the same underlying mathematical model to make predictions, so they all make the same predictions. If you have something that makes different predictions, it's not an interpretation of QM, it's a different theory.

Isn't the reason that people would care about case 1 v. case 2 that, even if they are indistinguishable in practice for the foreseeable future, or even impossible to distinguish in principle, one could imagine some circumstances (either with engineering that is literally impossible in practice, along the lines of Maxwell's demon, or, e.g. with Many Worlds, with an observer located somewhere that no one in our universe long after the Big Bang could ever see) where case 1 and case 2 would imply different things?

Likewise, even if case 1 v. case 2 are indistinguishable in the world of SM + GR core theory, wouldn't a distinction between one and the other have implications for what kind of "new physics" would be more promising to investigate in terms of BSM hypothesis generation?

For example, suppose the "state" of case 2 is "real" (whatever that means). Might that not suggest that brane-like formulations of more fundamental theories might make more sense to investigate than they would if case 1 is a more apt interpretation?

I certainly get the feeling that people who are debating the interpretations feel like they are arguing over more than semantics and terminology. After all, if it really were purely semantics, wouldn't the argument between the interpretations be more like the argument between drill-and-kill vs. New Math ways of teaching math: i.e., over which is easier for physics students to learn and grok quickly in a way that gives them the most accurate intuition when confronted with a novel situation (something that, incidentally, is amenable to empirical determination), rather than over which is right in a philosophical way?
 
  • #79
ohwilleke said:
even if they are indistinguishable in practice

They aren't indistinguishable "in practice"; they're indistinguishable period.

A better way of asking the question you might be trying to ask is, do people care about case 1 vs. case 2 because of the different ways the two cases suggest of looking for a more comprehensive theory of which our current QM would be a special case? The answer to that is yes; case 1 interpretations suggest different possibilities to pursue for a more comprehensive theory than case 2 interpretations do. Such a more comprehensive theory would indeed make different predictions from standard QM for some experiments. But the interpretations themselves are not the more comprehensive theories; they make the same predictions as standard QM, because they are standard QM, not some more comprehensive theory.

ohwilleke said:
I certainly get the feeling that people who are debating the interpretations feel like they are arguing over more than semantics and terminology.

Yes, they do. But unless they draw the key distinction I am drawing between interpretations of an existing theory, standard QM, and more comprehensive theories that include standard QM as a special case, they are highly likely to be talking past each other. Which is indeed one common reason why discussions of QM interpretations go nowhere.
 
  • #80
PeterDonis said:
A better way of asking the question you might be trying to ask is, do people care about case 1 vs. case 2 because of the different ways the two cases suggest of looking for a more comprehensive theory of which our current QM would be a special case? The answer to that is yes; case 1 interpretations suggest different possibilities to pursue for a more comprehensive theory than case 2 interpretations do. Such a more comprehensive theory would indeed make different predictions from standard QM for some experiments. But the interpretations themselves are not the more comprehensive theories; they make the same predictions as standard QM, because they are standard QM, not some more comprehensive theory.

That was part of the question I was trying to ask.

Going back to one of the other issues that doesn't get into a more comprehensive theory, is there any serious research into which interpretation is most effective pedagogically? If not, what is your intuition on that point?
 
  • #81
ohwilleke said:
is there any serious research into which interpretation is most effective pedagogically?

I don't know of any, but I'm not at all up to speed on this kind of research.

ohwilleke said:
what is your intuition on that point?

I don't know that my intuition means much; it's extremely difficult for most people who already know a subject in detail to imagine how people who don't have that knowledge will respond to different ways of conveying it. But FWIW, my intuition is that the "shut up and calculate" interpretation is pedagogically the best place to start, because until you understand the underlying math and predictions of QM, trying to deal with any interpretation on top of that is more likely to confuse you than to help you.
 
  • Like
Likes ohwilleke and bhobba
  • #82
PeterDonis said:
I don't know that my intuition means much; it's extremely difficult for most people who already know a subject in detail to imagine how people who don't have that knowledge will respond to different ways of conveying it. But FWIW, my intuition is that the "shut up and calculate" interpretation is pedagogically the best place to start, because until you understand the underlying math and predictions of QM, trying to deal with any interpretation on top of that is more likely to confuse you than to help you.
Feynman's response to this is extremely apt, dare I say prescient:
 
  • Like
Likes Buzz Bloom, Boing3000, akvadrako and 1 other person
  • #83
ohwilleke said:
I'm not quite clear on why "we can never make deterministic predictions about the results of quantum experiments" implies non-reality
Definition of "reality" It's a humain being viewpoint ... popper's three worlds, Heisenberg's 3 regions of knowledge, ...
Jean Cavaillès said:
If any physical law is merely a bet on action, the scandal of probability ceases: far from being an inadequate substitute for our power to know, it is the springboard of all scientific activity.

It is easier to acknowledge that physical laws are gambles of action when no way of interpreting them as descriptions of an independent, detached, "primary" nature is unanimously accepted ... Instead: an inventory of relations-with/within-nature

[two attached images]
Best regards
Patrick
 

  • Like
Likes AlexCaledin
  • #84
Auto-Didact said:
dare I say prescient:
That hit the bullseye indeed!

(I had actually never seen that particular clip of Feynman before)

/Fredrik
 
  • Like
Likes Auto-Didact
  • #85
ohwilleke said:
That is really helpful. I've never heard anyone say that quite that clearly before.
This surprises me somewhat; I was under the impression that the linearity (or unitarity) of QM was extremely well recognized, i.e. the fact that conventional QM has absolutely nothing to do with nonlinear dynamics. This is exactly why I, for example, believe that QM can at best be a provisional theory: almost all phenomena in nature are inherently non-linear, which is reflected in the fact that historically almost all fundamental theories in physics were special linearized limiting cases which eventually got recast into their more correct non-linear form. This trend continues to this very day; just take a look at condensed matter physics, hydrodynamics, biophysics and so on.
bhobba said:
So I have zero idea where you are getting this from - it's certainly not from textbooks that carefully examine QM. Textbooks at the beginner/intermediate level sometimes have issues - but they are fixed in the better, though unfortunately more advanced, texts
I got this from having read several papers, systematic reviews, conference transcripts and books on the subject. I don't have much time to spare atm but will link to them later if needed.

As for the axiomatic treatment a la Ballentine, I believe the others have answered that adequately, but I will reiterate my own viewpoint: that is a mathematical axiomatization made in a similar vein to measure theoretic probability theory, not a physical derivation from first principles.
bhobba said:
That leads to exactly the same situation as QM - a deterministic equation describing a probabilistic quantity. Is that inconsistent too? Of course not. Inconsistency - definition: If there are inconsistencies in two statements, one cannot be true if the other is true. Obviously it can be true that observations are probabilistic and the equation describing those probabilities deterministic. There is no inconsistency at all,
Your analogy fails because what you are describing there are internal parts, i.e. a hidden variables theory; for QM these types of theories are ruled out experimentally by various inequality theorems (Bell, Leggett), at least assuming locality, but this seems to be an irrelevant digression.

In any case, what you go on to say here makes it clear that you are still missing my point, namely where exactly the self-inconsistency of QM arises, i.e. not merely of measurement or of the Schrodinger equation, but of full QM: the full theory is not self-consistently captured by a single mathematical viewpoint, like say from within analysis or within geometry. It is instead usually analytic and sometimes stochastically discontinuous. I do not believe that Nature is actually schizophrenic in this manner, and I believe that the fact that QM displays these pathological symptoms is a sign that the physical theory is merely some linearized limiting case.

Imagine it like this: say you have an analytic function on some time domain, into which you artificially introduce cuts and stochastic vertical translations at certain points, thereby manually introducing discontinuities. Is this new function with artificially added discontinuities still appropriately something akin to an analytic function? Or is there perhaps a novel kind of mathematical viewpoint/treatment needed to describe such an object, similar to how distribution theory was needed for the Dirac delta? I don't know, although I do have my own suspicions. What I do know is that insisting that 'QM as is is fully adequate by just using the ensemble interpretation; just accept it as the final theory of Nature' does not help us one bit further in going beyond QM.
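To make the analogy concrete, here is a toy numerical sketch (purely illustrative, not a model of anything physical): a smooth function of time into which cuts and random vertical jumps are spliced by hand.

Python:
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0.0, 10.0, 2001)        # time grid
f = np.sin(t)                           # smooth ("analytic") evolution
jump_times = rng.uniform(0.0, 10.0, 5)  # instants where cuts are introduced
jumps = rng.normal(0.0, 1.0, 5)         # random vertical translations

g = f.copy()
for t_j, dj in zip(jump_times, jumps):
    g[t >= t_j] += dj                   # everything after the cut is shifted

# g is smooth between the cuts but discontinuous at them: piecewise-smooth
# evolution interrupted by stochastic jumps, the kind of hybrid object I mean.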

Lastly, I think Penrose says quite a few sensible and extremely relevant things here on exactly the topic at hand:
 
  • Like
Likes junjunjun233
  • #86
ohwilleke said:
I like that terminology. I'll have to file it away for future use.
Just for reference, the more conventional terminology for the various kinds of inference here is deductive vs. inductive inference.

"hard contradictions" are typically what you get in deductive logic, as this deals with propositions that are true or false.
"soft contradictions" are more of the probabilistic kind, where you have various degrees of beliefs or support in certain propositions.

The history of probability theory has its roots in inductive reasoning and its philosophy. The idea was that in order to make inductive reasoning rational and objective, one can simply "count evidence" and construct a formal measure of "degree of belief". That is one way to understand the roots of probability theory. Probability theory is thus one possible mathematical model for rational inference.

Popper was also grossly disturbed by the fact that science seemed to be an inductive process, and he wanted to "cure" this by suppressing the inductive process of how to generate a new hypothesis from a falsified theory in a rational way, and instead focusing on the deductive part: falsification. I.e., his idea was that the progress of science is effectively made at the falsification of a theory (this is also the deductive step, which Popper liked!). But needless to say this analysis is poor and inappropriate.

Obviously deductive reasoning is cleaner and easier. So when it's possible, it's not hard to see the preference. But unfortunately reality is not well described by pure deductive reasoning. Propositions corresponding to processes in nature are rarely easily characterised as true or false.

The interesting part (IMO) is the RELATION between deductive and inductive reasoning WHEN you take into account the physical limits of the "hardware" that executes the inferences. This is exactly my personal focus: how this relates to foundational physics and to the notion of physical law, which is deductive, versus the inductive nature of "measurement", which merely "measures nature" by accounting for evidence in an inductive way.

But almost no one thinks along these lines, I've learned, which is why I am an oddball here.

/Fredrik
 
  • Like
Likes Auto-Didact
  • #87
Fra said:
Just for reference, the more conventional terminology for the various kinds of inference here is deductive vs. inductive inference.
...
The interesting part (IMO) is the RELATION between deductive and inductive reasoning WHEN you take into account the physical limits of the "hardware" that executes the inferences.
...
/Fredrik

My viewpoint is that deduction and induction are a false dichotomy, for there is an excluded middle, namely Peirce's abduction. Abduction has historically gotten a bad reputation due to it actually being an example of fallacious reasoning, but even so, it seems to be an effective way of thinking; only a puritan logicist would insist that fallacious reasoning be outright forbidden, but I digress.

Induction may be necessary to generalize and so generate hypotheses, but inference to the best explanation, i.e. abduction or just bluntly guessing (in perhaps a Bayesian manner) is the only way to actually select a hypothesis from a multitude of hypotheses which can then be compared to experiment; if the guessed hypothesis turns out to be false, just rinse and repeat.

Here is where my viewpoint diverges not just from standard philosophy of science but also from standard philosophy of mathematics: in my view not only is Peirce's abduction necessary to choose scientific hypotheses, it seems to lie more or less at the basis of human reasoning itself. For example, if we observe a dark yellowish transparent liquid in a glass in a kitchen, one is easily tempted to conclude it is apple juice, while it may actually be any of a million other things, i.e. it is possibly any of a multitude of things. Yet our intuition based on everyday experience will tell us that it probably is apple juice; if we for some reason doubt that, we would check by smelling or tasting or some other means, and then update our idea of what it is accordingly. (NB: contrast probability theory and possibility theory.)
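In quasi-Bayesian terms the kitchen example looks something like the following sketch (all the numbers are invented purely for illustration):

Python:
# A toy Bayesian "abduction" about the liquid in the glass.
priors = {"apple juice": 0.6, "iced tea": 0.3, "something else": 0.1}

# P(looks dark-yellow and transparent | hypothesis) -- assumed numbers
likelihood_color = {"apple juice": 0.9, "iced tea": 0.7, "something else": 0.2}

# Posterior after seeing the colour (Bayes' rule: unnormalised, then normalised)
unnorm = {h: priors[h] * likelihood_color[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: p / total for h, p in unnorm.items()}
print(posterior)  # "apple juice" now carries most of the weight

# Smelling or tasting would supply further likelihoods and another update.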

But if you think about this even more carefully, we can step back and ask whether the liquid was even a liquid, whether the cup was even a glass, and so on. In other words, we seem to be constantly abducing without even being aware that we are doing so, or even while mistakenly believing we are deducing; the act of merely describing the things we see in the world around us in words already seems to require abductive reasoning.

Moreover, much of intuition also seems to be the product of abductive reasoning, which would imply that abduction lies at the heart of mathematical reasoning as well. There is actually a modern school of mathematics, namely symbolism, which seems to be arguing as much, although not nearly as explicitly as I am doing here (here is a review paper on mathematical symbolism). In any case, if this is actually true it would mean that the entire Fregean/Russellian logicist and Hilbertian formalist schools and programmes are hopelessly misguided; coincidentally, since @bhobba mentioned him before, Wittgenstein happened to say precisely that logicism/formalism were deeply wrong views of mathematics; after having carefully thought about this issue for years, I tend to agree with Wittgenstein on this.
 
  • #88
Auto-Didact said:
My viewpoint is that deduction and induction are a false dichotomy, for there is an excluded middle, namely Peirce's abduction. Abduction has historically gotten a bad reputation due to it actually being an example of fallacious reasoning, but even so, it seems to be an effective way of thinking; only a puritan logicist would insist that fallacious reasoning be outright forbidden, but I digress.
...
Induction may be necessary to generalize and so generate hypotheses, but inference to the best explanation, i.e. abduction or just bluntly guessing (in perhaps a Bayesian manner) is the only way to actually select a hypothesis from a multitude of hypotheses which can then be compared to experiment; if the guessed hypothesis turns out to be false, just rinse and repeat.
You are right, of course; there is much to elaborate here! (My focus was not on whether the dichotomy is strict, just a quick comment that this question belongs to a general analysis of inference.)

But a more elaborate treatment would risk diverging. In science, abductive reasoning is indeed the right term for "induction of a rule". The exact relation here, and the relevance to physical law and to the interaction between information-processing agents, is exactly my core focus. But that whole discussion would quickly go off topic, and outside the rules.

/Fredrik
 
  • #89
lightarrow said:
If the experimental-physicist tells the theoretical-physicist /how/ to measure something, the latter tells the former /what/ he is actually measuring. :-)
Example: before 1905, experimental physicists performing photoelectric-effect experiments were measuring "light"; after, with A. Einstein, they were measuring "photons".

--
lightarrow
Yes, and in 1926ff we learned that this is a misleading statement. The explanation of the photoelectric effect on the level of Einstein's famous 1905 paper does not necessitate the quantization of the electromagnetic field but only of the (bound) electrons.

https://www.physicsforums.com/insights/sins-physics-didactics/
 
  • Like
Likes bhobba
  • #90
Fra said:
I find the different perspectives interacting here truly entertaining. We can probably agree that physics is not logic, nor mathematics, nor is it philosophy. But all the ingredients are needed; this is why I think physics is so much more fun than pure math.

I think Neumaier said this already elsewhere, but there is also a difference between progressing science or creating new sensible hypotheses, and applying mature science to technology. It's not a coincidence that the founders of quantum theory seemed to be very philosophical, and that the people who some years later formalized and cleaned up the new ideas were less so. I think it is deeply unfair to somehow suggest that founders like Bohr or Heisenberg were somehow inferior physicists compared to those who worked out the mathematical formalism in an almost axiomatic manner. This is not so at all! I think all the ingredients are important. (Even if no one said the word inferior, it's easy to get almost that impression: that the hard-core guys do math, and the others do philosophy.)

On the other hand, RARE are those people that can span the whole range! It takes quite a "dynamic" mindset to not only understand complex mathematics, but also the importance of sound reasoning and how to create feedback between abstraction and fuzzy reality. If you narrow in too much anywhere along this scale you are unavoidably going to miss the big picture.

As for "wild", I think for some pure theorists and philosophers even a soldering iron my be truly wild stuff! Who knows what can go wrong? You burn tables books and fingers. Leave it to experts ;-)

/Fredrik
I'd say Bohr (in 1912) and Heisenberg (in 1925) were very important in discovering the new theory and giving the more mathematically oriented people the ideas to work out. Studying Bohr and Heisenberg is historically very interesting, but it's not a good way to learn quantum theory, since their writings tend to clutter the physics with superfluous philosophical ballast which confuses the subject more than it helps to understand it. Of course, you need all kinds of thinkers to make progress in physics, and the philosophical or more intuitive kind like Bohr and Heisenberg is in no way inferior to the physics/math or more analytical type like Dirac or Pauli. A singular exception is Einstein, who was both: on the one hand he was very intuitive, but he also knew about the necessity of a clear analytical mathematical formulation of the theory, for which part he usually had mathematical help from his collaborators (like Großmann in the case of GR).
 
  • #91
From Physics and Philosophy by Werner Heisenberg

... one of the most important features of the development and the analysis of modern physics is the experience that the concepts of natural language, vaguely defined as they are, seem to be more stable in the expansion of knowledge than the precise terms of scientific language, derived as an idealization from only limited groups of phenomena. This is in fact not surprising since the concepts of natural language are formed by the immediate connection with reality; they represent reality. It is true that they are not very well defined and may therefore also undergo changes in the course of the centuries, just as reality itself did, but they never lose the immediate connection with reality. On the other hand, the scientific concepts are idealizations; they are derived from experience obtained by refined experimental tools, and are precisely defined through axioms and definitions. Only through these precise definitions is it possible to connect the concepts with a mathematical scheme and to derive mathematically the infinite variety of possible phenomena in this field. But through this process of idealization and precise definition the immediate connection with reality is lost. The concepts still correspond very closely to reality in that part of nature which had been the object of the research. But the correspondence may be lost in other parts containing other groups of phenomena.
 
  • Like
Likes Auto-Didact and Lord Jestocost
  • #92
vanhees71 said:
Yes, and in 1926ff we learned that this is a misleading statement. The explanation of the photoelectric effect on the level of Einstein's famous 1905 paper does not necessitate the quantization of the electromagnetic field but only of the (bound) electrons.

https://www.physicsforums.com/insights/sins-physics-didactics/
I didn't want to be the one to say that, and I was pretty sure you or Arnold would have corrected me :smile:
Thanks.
--
lightarrow
 
  • #93
vanhees71 said:
...superfluous philosophical ballast

Bohr and Heisenberg were confronted with simple materialistic views that prevailed in the natural science of the nineteenth century and which were still held during the development of quantum theory by, for example, Einstein. What you call “philosophical ballast“ are at the end nothing else but attempts to explain to Einstein and others that the task of “Physics” is not to promote concepts of materialistic philosophy.
 
  • Like
Likes AlexCaledin
  • #94
ohwilleke said:
I'm not quite clear on why "we can never make deterministic predictions about the results of quantum experiments" implies non-reality, as opposed, for example, to a system that is chaotic (in the sense of having dynamics that are highly sensitive to slight changes in initial conditions), with sensitivity to differences in initial conditions that aren't merely hard to measure but are inherently impossible to measure, because measurement is theoretically incapable of determining both location and momentum at the scale relevant to the future dynamics of a particle.

Well, Bell's theorem was the result of investigating exactly this question: How do we know that the indeterminacy of QM isn't due to some unknown dynamics that are just too complicated to extract deterministic results from? The answer is: As long as the dynamics is local (no instantaneous long-range interactions), you can't reproduce the predictions of QM this way.

The use of the word "realism" is a little confusing and unclear. But if you look at Bell's proof, a local, realistic theory is one satisfying the following conditions:
  1. Every small region in space has a state that possibly changes with time.
  2. If you perform a measurement that involves only a small region in space, the outcome can depend only on the state of that region (or neighboring regions--it can't depend on the state in distant regions)
Why is the word "realism" associated with these assumptions? Well, let's look at a classical example from probability to illustrate:

Suppose you have two balls, a red ball and a black ball. You place each of them into an identical white box and close it. Then you mix up the two boxes. You give one box to Alice, and another box to Bob, and they go far apart to open their boxes. We can summarize the situation as follows:
  • The probability that Alice will find a red ball is 50%.
  • The probability that Alice will find a black ball is 50%.
  • The probability that Bob will find a red ball is 50%.
  • The probability that Bob will find a black ball is 50%.
  • The probability that they both will find a black ball is 0.
  • The probability that they both will find a red ball is 0.
If you consider the probability distribution to be a kind of "state" of the system, then this system violates locality: The probability that Bob will find a red ball depends not only on the state of Bob and his box, but also on whether Alice has already found a red ball or a black ball. So this is a violation of condition 2 in my definition of a local realistic theory.

However, the correct explanation for this violation is that classical probability is not a realistic theory. To say that Bob's box has a 50% probability of producing a red ball is not a statement about the box; it's a statement about Bob's knowledge of the box. A realistic theory of Bob's box would be one that describes what's really in the box, a black ball or a red ball, and not Bob's information about the box. Of course, Bob may have no way of knowing what his box's state is, but after opening his box and seeing that it contains a red ball, Bob can conclude, using a realistic theory, "The box really contained a red ball all along, I just didn't know it until I opened it."

In a realistic theory, systems have properties that exist whether or not anyone has measured them, and measuring just reveals something about their value. (I wouldn't put it as "a measurement reveals the value of the property", because there is no need to assume that the properties are in one-to-one correspondence with measurement results. More generally, the properties influence the measurement results, but may not necessarily determine those results, nor do the results need to uniquely determine the properties).

Bell's notion of realism is sort of the opposite of the idea that our observations create reality. Reality determines our observations, not the other way around.
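A toy simulation of the box example makes the distinction explicit (just an illustration): in the realistic description each box is assigned a definite ball when the boxes are prepared, "opening a box" merely reveals it, and the perfect anticorrelation comes entirely from the preparation, not from anything Alice's opening does to Bob's box.

Python:
import random

def run_trial():
    # A "realistic" description: each box really contains one definite ball,
    # assigned at preparation; opening a box only reveals what is already there.
    balls = ["red", "black"]
    random.shuffle(balls)
    alice_ball, bob_ball = balls
    return alice_ball, bob_ball

trials = [run_trial() for _ in range(100_000)]
p_alice_red = sum(a == "red" for a, _ in trials) / len(trials)
p_both_red = sum(a == "red" and b == "red" for a, b in trials) / len(trials)
print(p_alice_red)  # ~0.5
print(p_both_red)   # exactly 0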
 
  • Like
Likes Buzz Bloom
  • #95
Lord Jestocost said:
Bohr and Heisenberg were confronted with simple materialistic views that prevailed in the natural science of the nineteenth century and which were still held during the development of quantum theory by, for example, Einstein. What you call “philosophical ballast“ are at the end nothing else but attempts to explain to Einstein and others that the task of “Physics” is not to promote concepts of materialistic philosophy.

What's funny (to me) about the anti-philosophy bent of so many physicists is that many of them actually do have deeply-held philosophical beliefs, but they prefer to only use the word "philosophy" to apply to philosophies that are different from their own.
 
  • Like
Likes Euthan, zonde, Lord Jestocost and 2 others
  • #96
AlexCaledin said:
From Physics and Philosophy by Werner Heisenberg

... one of the most important features of the development and the analysis of modern physics is the experience that the concepts of natural language, vaguely defined as they are, seem to be more stable in the expansion of knowledge than the precise terms of scientific language, derived as an idealization from only limited groups of phenomena. This is in fact not surprising since the concepts of natural language are formed by the immediate connection with reality; they represent reality. It is true that they are not very well defined and may therefore also undergo changes in the course of the centuries, just as reality itself did, but they never lose the immediate connection with reality. On the other hand, the scientific concepts are idealizations; they are derived from experience obtained by refined experimental tools, and are precisely defined through axioms and definitions. Only through these precise definitions is it possible to connect the concepts with a mathematical scheme and to derive mathematically the infinite variety of possible phenomena in this field. But through this process of idealization and precise definition the immediate connection with reality is lost. The concepts still correspond very closely to reality in that part of nature which had been the object of the research. But the correspondence may be lost in other parts containing other groups of phenomena.
Well, I think the opposite is true. With the refined means of the scientific effort we come closer and closer to reality. Our senses and "natural language" are optimized to survive under the specific "macroscopic" circumstances on Earth but not necessarily to understand realms of reality which are much different in scale than the one relevant for our survival like the microscopic scale of atoms, atomic nuclei, and subatomic/elementary particles or the very large scale of astronomy and cosmology. It's very natural to expect that our "natural language" is unsuitable to describe, let alone in some sense understand, what's going on at these vastly different scales. As proven by evidence the most efficient way to communicate about and to some extent understand nature on various scales is mathematics, and as with natural languages to learn a new language is never a loss but always a gain in understanding and experience.
 
  • Like
Likes bhobba
  • #97
Auto-Didact said:
As for the axiomatic treatment a la Ballentine, I believe the others have answered that adequately, but I will reiterate my own viewpoint: that is a mathematical axiomatization made in a similar vein to measure theoretic probability theory, not a physical derivation from first principles.

@bhobba I feel I need to expand on this by explaining what exactly the difference is between a formal axiomatization as is customarily used in contemporary mathematics since the late 19th/early 20th century and a derivation from first principles as was invented by Newton and is, practically unaltered, customarily used in physics up to this day. I will once again let Feynman do the talking, so just sit back and relax:


I hope this exposition makes things somewhat clearer. If it doesn't, well... here is just a little more elaboration explaining why physics is not mathematics (NB: especially the point that the measurement process being described is part of physics and a scientific necessity, not just some afterthought).


For those who really can't get enough of this, these videos are part of the lecture The Relation of Mathematics and Physics, which are part of Feynman's seven part Messenger lectures on the Character of Physical Law. These were intended for the public but in my opinion they should be compulsory viewing for all physics students.
 
  • Like
Likes AlexCaledin
  • #98
AlexCaledin said:
... one of the most important features of the development and the analysis of modern physics is the experience that the concepts of natural language, vaguely defined as they are, seem to be more stable in the expansion of knowledge than the precise terms of scientific language, derived as an idealization from only limited groups of phenomena.

To be fair, he did not write that in light of future developments that show the exact opposite to an even greater degree than was then known. But even then they knew the work of Wigner and Noether, which showed it most definitely is NOT true. It can only be expressed in the language of math.

If you think otherwise state Noether's theorem in plain English without resorting to technical concepts that can only be expressed mathematically. Here is the theorem: Noether's first theorem states that every differentiable symmetry of the action of a physical system has a corresponding conservation law.

Mathematical concepts used - differentiable symmetry and action. If you can explain it in plain English - be my guest.
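For reference, even the compact formal statement needs the machinery of the action and its variation. Schematically, in the simplest classical field-theory case (a continuous transformation ##\phi \to \phi + \epsilon\,\delta\phi## that leaves the Lagrangian invariant):
$$S[\phi] = \int L(\phi, \partial_\mu\phi)\,d^4x, \qquad j^\mu = \frac{\partial L}{\partial(\partial_\mu\phi)}\,\delta\phi, \qquad \partial_\mu j^\mu = 0 \ \text{on solutions},$$
so the charge ##Q = \int j^0\,d^3x## is conserved. Try putting that in plain English without smuggling the math back in.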

Thanks
Bill
 
  • #99
Auto-Didact said:
I feel I need to expand on this by explaining what exactly the difference is between a formal axiomatization as is customarily used in contemporary mathematics since the late 19th/early 20th century and a derivation from first principles as was invented by Newton

Newton - first principles - well let's look at those shall we:
Absolute, true and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration: relative, apparent and common time, is some sensible and external (whether accurate or unequable) measure of duration by the means of motion, which is commonly used instead of true time ...

What utter nonsense, and there is zero doubt Feynman would agree. I know those videos you posted very well, and they say nothing of the sort.

With all due respect to Newton, whose path, as Einstein said, was really the only one a man of the highest intellect could take in his time, from our vantage point it's nonsense - things have moved on - a lot.

BTW I know those videos from The Character of Physical Law by Feynman very, very well. But I reviewed them again. He is simply saying what I am saying: you can write QM purely as a mathematical system - no problem. But to apply it you need to map the things in the math to what you are applying it to. In QM it's so easy you don't even have to spell it out in beginner or even intermediate textbooks - it's simply left up in the air. It is assumed you know what an observation is, etc. That's why you can study QM without any of the difficulties of interpretation - i.e. shut up and calculate. You can go a long way doing just that.

But, just like with what probability is in probability theory, if you want to delve deeper and understand what it means when you apply it, you run into a morass. For example, exactly what is an observation? Since observations supposedly occur here in the macro world - it's never spelled out in such treatments - how do you explain that macro world with a theory that assumes its existence in the first place? Those are only some of the many issues if you look deeply enough. To answer them we have interpretations.

But it's wise to study them after you have at least done QM to the intermediate level - e.g. Griffiths would be an example (Vanhees, who teaches this stuff, knows a better book whose name I can't recall, but anyway it's unimportant); the important thing is that you need to study the 'intuitive' treatment before delving into the deep issues. BTW that's one reason Ballentine is so good - he does not skirt those issues. You may or may not agree with him, but he does not skirt them. However, it is an advanced book you study after intermediate books like Griffiths.

Thanks
Bill
 
  • #100
bhobba said:
Newton - first principles - well let's look at those shall we:
Absolute, true and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration: relative, apparent and common time, is some sensible and external (whether accurate or unequable) measure of duration by the means of motion, which is commonly used instead of true time ...

What utter nonsense, and there is zero doubt Feynman would agree. I know those videos you posted very well and they say nothing of the sort.

With all due respect to Newton of course who as Einstein said was really the only path a man of the highest intellect could take in his time.

But things have moved on.

Thanks
Bill
You somehow manage to twist and misunderstand everything I say. Nowhere did I imply that we still use the same first principles that Newton used; I said that physicists still use derivation from first principles as invented by Newton, i.e. we still use the method, which Newton invented, of mathematically deriving physical laws from first principles. Before Newton there simply was no such approach to physics and therefore no true inkling of physical law; in this sense it can be said that Newton invented (mathematical) physics.

In the video Feynman makes it crystal clear that the physics approach to mathematics, and how mathematics is used in physics to derive laws from principles and vice versa, is as far from formalist, mathematician-style axiomatic mathematics as can be.
 
  • Like
Likes Fra
  • #101
Auto-Didact said:
You somehow manage to twist and misunderstand everything I say. Nowhere did I imply that we still use the same first principles that Newton used; I said that physicists still use derivation from first principles as invented by Newton,

And I could also say the same thing.

Didn't you get what was being implied? We do not use the same methods as Newton because they do not work. We can't elucidate those 'first principles' you talk about, even for such a simple thing as what time is.

The modern definition of time is: it's what a clock measures.

Wow, what a great revelation - but as a first principle, well, it's not saying much beyond common sense, is it: basically, things called clocks exist and they measure this thing called time. It does however have some value - it stops people from trying to do what Newton tried, and failed, to do.

Want to know what the 'first principles' of modern classical mechanics are:

1. The principle of least action.
2. The principle of relativity.

Now, if what you say is true then you should be able to state those in your 'first principles' form. I would be very interested in seeing them. BTW 1. follows from QM - but that is just by the by - I even gave a non-rigorous proof - see post 3:
https://www.physicsforums.com/threa...fication-of-principle-of-least-action.881155/

You will find that no matter what you do, you end up at the end of the day with very vague or, when looked at deeply enough, even nonsensical statements. That's why it's expressed in mathematical form, with some terms left to just common sense in how you apply it. In fact that's what Feynman was alluding to at the end of the second video you posted. Physics is not mathematics but is written in the language of math. How do you go from one to the other? Usually common sense. But if you want to go deeper then you end up in a philosophical morass that we do not discuss here.
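Just to be concrete about what principle 1 even says (stated schematically for a single particle):
$$S[q] = \int_{t_1}^{t_2} L(q, \dot q, t)\,dt, \qquad \delta S = 0 \;\Longrightarrow\; \frac{d}{dt}\frac{\partial L}{\partial \dot q} - \frac{\partial L}{\partial q} = 0,$$
and the 'first principle' content is entirely in the mathematics; the English gloss 'the action is stationary' tells you nothing until you say what the action is.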

Thanks
Bill
 
  • #102
bhobba said:
And I could also say the same thing.

Didn't you get what was being implied? We do not use the same methods as Newton because they do not work. We can't elucidate those 'first principles' you talk about, even for such a simple thing as what time is.
...
You will find that no matter what you do, you end up at the end of the day with very vague or, when looked at deeply enough, even nonsensical statements. That's why it's expressed in mathematical form, with some terms left to just common sense in how you apply it.

I'm not sure that you understand that I am saying that our first principles are all literally mathematical statements - all our first principles are observational facts in the form of mathematical statements; the best example is the principle of general covariance.

This changes nothing of the fact that the physicist's method of doing mathematics and of deriving new laws looks almost nothing like the modern mathematician's axiomatic style of doing formal mathematics as is customary since Bourbaki; the gist of mathematics used in physics was maybe 'modern' in the 18th/19th century.

Derivation from first principles of physics is the de facto physicist technique of discovering physical laws and theories from principles based on observation in the form of mathematical statements. Derivation from first principles is however not rigorous mathematical proof based on axioms, nor will it ever be: these are two different methods of using mathematics, that is my point.

tl;dr mathematicians and physicists tend to use mathematics in completely different ways due to different purposes, this is a good thing.
 
  • #103
Auto-Didact said:
the best example is the principle of general covariance.

You know the principle of general covariance is wrong, don't you (or rather is totally vacuous, as first pointed out by Kretschmann to Einstein - and Einstein agreed, but thought it still had heuristic value)? But that is best suited to the relativity forum.

Its modern version is the principle of general invariance: All laws of physics must be invariant under general coordinate transformations.

Is that what you mean?

Then yes I agree. My two examples of the modern version of classical mechanics would fit that as well.

But I am scratching my head about why the principles I gave from Ballentine would not fit that criterion.

Thanks
Bill
 
  • #104
bhobba said:
Its modern version is the principle of general invariance: All laws of physics must be invariant under general coordinate transformations.

Isn't that what "the principle of general covariance" means?
 
  • Like
Likes Auto-Didact and atyy
  • #105
PeterDonis said:
Isn't that what "the principle of general covariance" means?

Not according to Ohanian:
https://www.amazon.com/dp/0393965015/?tag=pfamazon01-20

There has been long debate about it:
http://www.pitt.edu/~jdnorton/papers/decades.pdf
Ohanian (1976, pp. 252-4) uses Anderson's principle of general invariance to respond to Kretschmann's objection that general covariance is physically vacuous. He does insist, however, that the principle is not a relativity principle and that the general theory of relativity is no more relativistic than the special theory (~257). Anderson's ideas seem also to inform Buchdahl's (1981, Lecture 6) notion of 'absolute form invariance'.

It's Anderson's principle of invariance, which technically is: the requirement that the Einstein group is also an invariance group of all physical systems constitutes the principle of general invariance.

I will need to dig up my copy of Ohanian to give his exact definitions of the two, i.e. invariance and covariance. What I posted is his definition of invariance, which I will need to contrast with his definition of covariance. That will take me a couple of minutes - but I need to go to lunch now. I will see if I can do it now; otherwise it will need to wait until I get back.

Added Later:
Found it - from page 373 of Ohanian - the Principle of General Covariance is:
All laws of physics shall be stated as equations covariant with respect to general coordinate transformations.

The difference is, as Kretschmann showed (and others, if I recall), any equation can be brought into covariant form, but invariance is stronger: the content of the law itself must be invariant under coordinate transformations, not just its form.

I think if anyone wants to pursue it further the relativity forum is the best place.

Thanks
Bill
 