A Is quantum theory a microscopic theory?

Quantum theory is often viewed as a framework for understanding the microscopic constituents of matter, yet its minimal instrumental form primarily focuses on predicting macroscopic measurement outcomes rather than providing insights into the microscopic world. This perspective raises questions about whether quantum theory can truly be considered a microscopic theory, as it relies on macroscopic events for detection and measurement. Some argue that without a clear definition of "microscopic," discussions about the nature of quantum theory become tautological and unproductive. The debate highlights the distinction between theories that explain observable phenomena and those that address unobservable entities. Ultimately, the conversation underscores the complexities of defining the boundaries between microscopic and macroscopic realms in quantum mechanics.
  • #91
vanhees71 said:
Now, in view of this lack of any working deterministic theory, my simple question is: why is it considered a problem that nature seems to be "irreducibly probabilistic/random" in the precise sense defined by QT? Shouldn't it be most "realistic" to just accept this irreducible randomness?
Probability exists only subjectively. Randomness is just the absence of knowledge about what will happen when we do something we don't have complete control over. Outcomes are relative to the configuration of the experiment: in UV spectroscopy (definite V, hν; unknown size and position), for the quantum-optics experimenter (no definite frequency; somewhat defined position and size, i.e. localized), for the high-energy experimenter ("all" of these). In any setup, outcomes are incomplete. Randomness is only a feature, a less complete picture than a temporal one.
 
  • Like
Likes artis and Demystifier
  • #92
vanhees71 said:
Now, in view of this lack of any working deterministic theory, my simple question is: why is it considered a problem that nature seems to be "irreducibly probabilistic/random" in the precise sense defined by QT?

First, I would like to ask: if there is no collapse, where is the irreducible randomness in QT?

There is one reason I can think of why irreducible randomness might be problematic, but it's a concern far ahead of our current theorizing. The issue is that computers can't create randomness: they always need an outside source. We can't write a RAND() function from other primitive operations. If we try to figure out how the universe exists, and probably how it created itself, there is no outside source to draw upon. So it's easier to imagine something evolving from nothing if no decision was ever made.
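For illustration, a minimal sketch of the point about RAND() (the generator below is an arbitrary textbook example of my own choosing, not any particular library's implementation):

```python
import os

# A deterministic "RAND()" is a pure function of its seed: same seed in,
# same "random" sequence out. The arithmetic creates no randomness at all.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{k+1} = (a*x_k + c) mod m."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)   # scaled into [0, 1)
    return out

print(lcg(42, 3))   # some fixed triple of values
print(lcg(42, 3))   # exactly the same triple: fully determined by the seed

# Genuine unpredictability has to be injected from outside the program,
# e.g. from hardware entropy collected by the operating system:
print(int.from_bytes(os.urandom(4), "big") / 2**32)
```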
 
  • #93
Mentz114 said:
No. Atoms really do exist!

Maybe, the following comment by @atyy puts it in a nutshell:

"I think most scientists don't care whether atoms exist or not. One just makes a model, uses the model to make predictions, and if the predictions match experimental results closely enough, then the model is accepted as a good approximation of reality. Atoms are just the name for something in some model."

https://www.physicsforums.com/threads/how-do-we-know-atoms-exist.282832/post-2092560
 
  • Like
Likes artis, Auto-Didact, Demystifier and 4 others
  • #94
julcab12 said:
Probability exists only subjectively. Randomness is just the absence of knowledge about what will happen when we do something we don't have complete control over. Outcomes are relative to the configuration of the experiment: in UV spectroscopy (definite V, hν; unknown size and position), for the quantum-optics experimenter (no definite frequency; somewhat defined position and size, i.e. localized), for the high-energy experimenter ("all" of these). In any setup, outcomes are incomplete. Randomness is only a feature, a less complete picture than a temporal one.
That's precisely what I asked! Why do you think that randomness is "just the absence of knowledge"? Why shouldn't nature behave randomly in the way described by QT?
 
  • Like
Likes Lord Jestocost, Schwann and DarMM
  • #95
vanhees71 said:
That's precisely what I asked! Why do you think that randomness is "just the absence of knowledge"? Why shouldn't nature behave randomly in the way described by QT?
I think I have commented on such a question before. We believe in science, and science tells us that there is a reason for everything; randomness with no reason seems utterly illogical. Now, if nature is indeed that way, people want to know why, that's all.
 
Last edited:
  • Like
Likes artis
  • #96
akvadrako said:
First, I would like to ask: if there is no collapse, where is the irreducible randomness in QT?

There is one reason I can think of why irreducible randomness might be problematic, but it's a concern far ahead of our current theorizing. The issue is that computers can't create randomness: they always need an outside source. We can't write a RAND() function from other primitive operations. If we try to figure out how the universe exists, and probably how it created itself, there is no outside source to draw upon. So it's easier to imagine something evolving from nothing if no decision was ever made.
I think this ties in with an issue we see in understanding quantum theory: is the universe an algorithm, or describable by an algorithm, at the fundamental level?
 
  • Like
Likes julcab12
  • #97
Lord Jestocost said:
Maybe, the following comment by @atyy puts it in a nutshell:

"I think most scientists don't care whether atoms exist or not. One just makes a model, uses the model to make predictions, and if the predictions match experimental results closely enough, then the model is accepted as a good approximation of reality. Atoms are just the name for something in some model."

https://www.physicsforums.com/threads/how-do-we-know-atoms-exist.282832/post-2092560
No. Atoms in materials exist. We can image them. Rutherford 'saw' gold atoms deflecting alpha-particles!
The link above is just one person's opinion. But anyone may believe whatever suits their particular view of the universe ...
 
  • #98
What I don't get is why there is structured, non-local randomness at the horizon (at the classical/quantum boundary).

And why is that horizon, though seemingly unavoidable, dependent on some observer's frame of reference... why is it always waiting for some observer to be... picked, but supposedly not involved in the picking? I mean, the problem with the pilot wave, to my mind, is that it suggests the future is mapped by a "wave". A wave isn't so offensive, but how much of my future is determined by that wave? And why then isn't the present and past (mine, for example) more like a wave? What I'm saying is that there is a huge distance between that microscopic description and the classical reality we see. But all that reality is a function of that... thing (microscopic waves). So why the big difference in the description? I mean, that's part of what we are struggling with here... the difference between what's going on down there and what we experience is profound.

Maybe a theory that had a better way of dealing with complicated (i.e. more realistically sized) causality networks could better describe the rubbery space-time horizon, so it's not just a choice between an idealized microscopic (toy) wave or a set of classical objects.

I mean, the cool thing about multifractals that I can't get out of my head is that they give you some math to create really rich mixtures of pure (or nearly pure) periodicity, pure (or nearly pure) randomness, and sets of things that seem more classical (unique-but-self-similar objects). Are there any multifractal models of molecules?
 
Last edited:
  • Like
Likes DanielMB, DarMM and julcab12
  • #99
vanhees71 said:
That's precisely what I asked! Why do you think that randomness is "just the absence of knowledge"? Why shouldn't nature behave randomly in the way described by QT?
Randomness in its natural or mathematical form is a placeholder, almost meaningless. Absence of knowledge is a natural direction. Randomness is always associated with incompleteness in a dynamical sense. Some consider it a placeholder, like flat space in geometry: 'flat' doesn't hold in nature, and neither does randomness. If we narrow it down, the only thing that's meaningful is interactions.
 
  • #100
ftr said:
I think I have commented on such a question before. We believe in science, and science tells us that there is a reason for everything; randomness with no reason seems utterly illogical. Now, if nature is indeed that way, people want to know why, that's all.
It has meaning in its form, no more than 'flat space' in geometry does.
 
  • #101
I don't think there is a "why" for the statistical character of quantum physics. You don't find the need to ask "why" classical physics is deterministic, so why here? All you can ask is, what removes the foundation for a deterministic description? A deterministic description requires the existence of an objective 'state' which can determine the future state. This is not possible anymore because quantum physics shows that you cannot have a space-time and energy-momentum description at the same time. For a deterministic description on the classical lines, you need to be able to have p and x at the same time.
 
  • Like
Likes vanhees71, Jimster41 and Lord Jestocost
  • #102
Mentz114 said:
No. Atoms in materials exist. We can image them. Rutherford 'saw' gold atoms deflecting alpha-particles!

What I wanted to say, expresses Paul Davies in his introduction to Werner Heisenberg’s “Physics and Philosophy” in the following words:

“Thus an electron or an atom cannot be regarded as a little thing in the same sense that a billiard ball is a thing. One cannot meaningfully talk about what an electron is doing between observations because it is the observations alone that create the reality of the electron. Thus a measurement of an electron's position creates an electron-with-a-position; a measurement of its momentum creates an electron-with-a-momentum. But neither entity can be considered already to be in existence prior to the measurement being made.”
 
  • Like
Likes microsansfil and DarMM
  • #103
PrashantGokaraju said:
I don't think there is a "why" for the statistical character of quantum physics. You don't find the need to ask "why" classical physics is deterministic, so why here? All you can ask is, what removes the foundation for a deterministic description? A deterministic description requires the existence of an objective 'state' which can determine the future state. This is not possible anymore because quantum physics shows that you cannot have a space-time and energy-momentum description at the same time. For a deterministic description on the classical lines, you need to be able to have p and x at the same time.

I like this concise reminder that time, microscopically, is... a problem. Things in the past are definitely, in a sense, at (t, x). But nothing in the past has momentum...? Is that right? At least this is what was bugging me yesterday reading this thread. Is there anything in the past that has momentum? Is there anything that has been measured that now has momentum? This connects, in my mind, to the question of Cauchy-surface conservation. If I measure something and put it at (t, x), the momentum of that QM thing is conserved (or the energy involved in it); some thing(s) got it, but they're all back down in QM? I could draw a network, couldn't I, to try to account for it, but then on one side of some line of incidence in that drawing there is a set of enumerable events at t's and x's, culminating in, causing, my event. On the other side of that line it's some nightmarish fuzz of Feynman diagrams?

Something is going to pull more events out of that fuzz. My event detector is part of it. But...
 
Last edited:
  • Like
Likes julcab12
  • #104
The thing is that, in classical physics, momentum can be defined in terms of space-time pictures:

$$p = m\,\frac{dx}{dt}.$$

This is an idealization, and is not exactly valid when seen in the light of the correct quantum mechanical description. The definition p = mv is valid in what is called the geometrical optics limit of quantum theory.
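As a worked sketch of that limit (the standard WKB/eikonal argument; the details here are mine, not spelled out in the post): write the wave function as
$$\psi(x,t) = A(x,t)\, e^{i S(x,t)/\hbar}.$$
Inserting this into the Schrödinger equation and keeping only the leading order in ##\hbar## gives the classical Hamilton-Jacobi equation
$$\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V = 0,$$
whose characteristics are classical trajectories with ##p = \nabla S = m\,dx/dt##. The identification p = mv therefore only holds where the phase S varies much faster than the amplitude A, in just the way ray optics emerges from wave optics.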
 
  • Like
Likes julcab12
  • #105
PrashantGokaraju said:
The thing is that, in classical physics, momentum can be defined in terms of space-time pictures:

$$p = m\,\frac{dx}{dt}.$$

This is an idealization, and is not exactly valid when seen in the light of the correct quantum mechanical description. The definition p = mv is valid in what is called the geometrical optics limit of quantum theory.

v being a directional speed that is a function of t, is that correct? Just want to make sure I followed that.
There was a big discussion about background time in QM, QFT, etc. in a recent thread. It definitely made my head spin. It seems tricky to suggest a background time for QM and QFT formulations when there isn't a fully realized quantum theory of GR, which to my mind is the bar for describing the concept of time.

I will shut up now. Great thread.
 
  • Like
Likes julcab12
  • #106
As an atomic experimentalist, most of my measurement experience relates to measuring the wavelength (or frequency) of laser light used for excitation, counting photons resulting from an event (usually with a photomultiplier tube), or counting electrons (usually with a channeltron or microchannel plate). Occasionally ions were counted and velocities determined by time of flight (the delay between the excitation event, a laser pulse, and distant arrival).

Now I can see how some of these measurements may be characterized as macroscopic, especially when table-top optics are involved and there is enough light intensity to use photodiodes. But when one is counting single photons, electrons, or ions, these seem like fundamentally microscopic events, unless you are using a different definition of microscopic than I am.
 
  • Like
Likes vanhees71, dextercioby and julcab12
  • #107
Dr. Courtney said:
Now I can see how some of these measurements may be characterized as macroscopic, especially when table-top optics are involved and there is enough light intensity to use photodiodes. But when one is counting single photons, electrons, or ions, these seem like fundamentally microscopic events, unless you are using a different definition of microscopic than I am.
The problem is that quantum theory seems to talk about microscopic events only insofar as they are magnified up to the classical level. It doesn't reference microscopic events in and of themselves when no classical devices are around.
 
  • Like
Likes Demystifier, akvadrako and dextercioby
  • #108
Lord Jestocost said:
What I wanted to say, expresses Paul Davies in his introduction to Werner Heisenberg’s “Physics and Philosophy” in the following words:

“Thus an electron or an atom cannot be regarded as a little thing in the same sense that a billiard ball is a thing. One cannot meaningfully talk about what an electron is doing between observations because it is the observations alone that create the reality of the electron. Thus a measurement of an electron's position creates an electron-with-a-position; a measurement of its momentum creates an electron-with-a-momentum. But neither entity can be considered already to be in existence prior to the measurement being made.”
Well, I have read that and similar statements. The statement "Thus an electron or an atom cannot be regarded as a little thing in the same sense that a billiard ball is a thing" is self-evidently true (I believe I said something similar about atoms and baseballs), whereas the assertion "But neither entity can be considered already to be in existence prior to the measurement being made" is not a deduction from anything, merely speculation.

Furthermore, it is scientifically void because its truth cannot be tested by experiment. I see no reason to believe it.
 
Last edited:
  • #109
Dr. Courtney said:
As an atomic experimentalist, most of my measurement experience relates to measuring the wavelength (or frequency) of laser light used for excitation, counting photons resulting from an event (usually with a photomultiplier tube), or counting electrons (usually with a channeltron or microchannel plate). Occasionally ions were counted and velocities determined by time of flight (the delay between the excitation event, a laser pulse, and distant arrival).

Now I can see how some of these measurements may be characterized as macroscopic, especially when table-top optics are involved and there is enough light intensity to use photodiodes. But when one is counting single photons, electrons, or ions, these seem like fundamentally microscopic events, unless you are using a different definition of microscopic than I am.
DarMM said:
The problem is that quantum theory seems to talk about microscopic events only insofar as they are magnified up to the classical level. It doesn't reference microscopic events in and of themselves when no classical devices are around.

Not only that. Each type of experimenter yields a different reading. The photon the spectroscopy experimenter uses to explain how spectra are connected to atoms and molecules is a different concept from the photon quantum-optics experimenters talk about when explaining their experiments. Those are different from the photon the high-energy experimenters talk about, and there are still other photons the high-energy theorists talk about. There are probably even more variants (and countless personal modifications) in use. The definition really varies with each setup and with how the thing reacts to each setup. Even the HEP experimenter's concept behind the readings is of a particle that cannot be observed directly, but is something having position, energy, and momentum that helps explain the interactions of charged particles among themselves and their behavior in an external EM field (Compton effect, pair creation). That's the reason some think that 'maybe' the picture/detection is a phenomenon or mirage of the natural dynamics of interacting things.
 
  • #110
I can't see how a definition of microscopic that excludes all single photon and all single electron observations does not also exclude most (or all) observations of things that are traditionally considered microscopic: bacteria, viruses, cells, organelles, etc.
 
  • #111
Dr. Courtney said:
I can't see how a definition of microscopic that excludes all single photon and all single electron observations does not also exclude most (or all) observations of things that are traditionally considered microscopic: bacteria, viruses, cells, organelles, etc.
In the quantum formalism in its standard reading, photons are not spoken of alone, in and of themselves, when no macro devices are around. It's not so much that something is excluded from being microscopic; it's that the electron in quantum theory is spoken of in terms of micro-macro phenomena.

Viruses, cells, etc. have frameworks that discuss them as they are when no microscopes are around, and so don't suffer from these issues.
 
  • #112
ftr said:
I think I have commented on such question before. Because we believe in science and science tells us that there is a reason for everything. randomness with no reason seems utterly illogical. Now, if it is indeed that way, people want to know why, that's all.
Well, the strength of science is that it tells us first to be open to learning how nature behaves, and that in investigating this, using an interplay of quantitative, reproducible observations and experiments with analytical and mathematical reasoning, we may find that we have to give up prejudices about what we think we know. Nothing in this process is safe from the possibility of needing revision in the light of new discoveries.

This indeed happened twice in the first half of the 20th century. One revision became necessary with the discovery that electromagnetic phenomena are accurately described by Maxwell's theory of the electromagnetic field, yet this theory was inconsistent with the "very sure knowledge" about the mathematical model describing spacetime (or, at that time, the strictly separated time and space) in terms of the Galilei-Newton model. This was such a "sacrosanct" piece of knowledge that it took about 50 years to finally resolve the issue, and it involved the work of some of the most brilliant mathematicians and physicists of the time (Poincaré, Lorentz, Michelson, and finally Einstein). What had to be revised was not the idea of the special principle of relativity, i.e., the invariance of the physical laws under transformations from one inertial reference frame to another, but the very law of how the transformation had to be done: instead of Galilean transformations, the Lorentz transformations (discovered in a predecessor form already by Voigt in the late 1880s), implying a new spacetime model in which space and time are no longer strictly separated but amalgamated into a pseudo-Euclidean affine spacetime manifold. This implied that the laws of mechanics also had to undergo a revision, and the corresponding revisions by Einstein (and corrections thereof by Planck, von Laue, and Einstein himself) were, after some experimental confusion, confirmed by experiments with the then newly discovered electrons in gas-discharge tubes.

This turned out, however, to be a pretty harmless revision. None of the ideas considered really fundamental, which you still insist on more than 100 years later, had to be revised. The physical laws were still strictly deterministic, i.e., by tacit assumption any possible observable of any system always has a determined value, and the principle of causality holds on the fundamental level in a very strong (time-local) form: knowing the initial values of a complete set of observables (that is, a set of observables of which all other observables are functions) at some initial time ##t_0##, their values (and thus those of all possible observables, which are functions thereof) are in principle determined at any later time ##t##.

Now this apparently safe knowledge had to be revised with the advent of problems concerning phenomena that became observable through the ever faster progress of technology. At first these obstacles were considered minor issues. When Planck asked for advice on what to study, a famous physics professor at the university told him that, with his brilliant grades from high school, he shouldn't waste his time on physics, because it was all settled and the "little clouds" on the horizon would simply be resolved by measuring some fundamental constants to ever better precision and by small revisions of the laws of classical physics (at the time consisting of classical (still Newtonian) mechanics, Maxwell electrodynamics, and (phenomenological) thermodynamics).

One of the clouds was not so new to begin with: it was the question of the absolute value of entropy and a theoretical understanding of what's now known as the Gibbs paradox in statistical physics, which was already met with quite some skepticism by the more conservative physicists of the time, since they didn't even believe in the atomistic structure of matter to begin with. With the advent of low-temperature physics (one milestone being Kamerlingh Onnes's achievement of liquefying helium in the early 1900s) the issue became very evident: the specific heat of most substances did not behave as expected at low temperatures. Also, when it became clear that metals have conduction electrons, but that these don't contribute to the specific heat even at room temperature, another "cloud" arose on the horizon.

Then there was the problem of "black-body radiation", which was not describable with classical electrodynamics and thermodynamics/statistical physics. Famously, this was how the quantum revolution started: the accurate measurement of the black-body radiation spectrum as a function of temperature by Rubens and Kurlbaum, originally an attempt to provide a better standard for measuring the efficiency of lighting sources (gas and the new electric light bulbs), led Planck to guess the right law and subsequently to derive it with a brilliant (but at first not really understood) statistical analysis, which only worked under the assumption that each frequency mode of the electromagnetic field could exchange energy only in lumps of ##h \nu=\hbar \omega##. Already this was too much for Planck, but he couldn't find any other way to derive the accurate black-body law, named after him.
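(For reference, the standard textbook form of the law he guessed, stated here for context: the mean energy per field mode of frequency ##\omega## comes out as
$$\langle E \rangle = \frac{\hbar \omega}{e^{\hbar \omega / k_B T} - 1},$$
which reduces to the classical equipartition value ##k_B T## for ##\hbar \omega \ll k_B T## and suppresses the high-frequency modes exponentially, curing the ultraviolet catastrophe.)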

The rest of the story is well known. Einstein came up with his light-quanta idea in 1905, then Bohr (completed by Sommerfeld) with his ad hoc idea to explain the hydrogen spectrum from the atomic model forced on the physics community by Rutherford's findings. The very stability of matter, and the precise indistinguishability of atoms of each kind, became an enigma for classical physics. The ad hoc solution for hydrogen was then found to be flawed, because it only worked for hydrogen. By another happy accident it also works for the harmonic oscillator and thus for the lattice vibrations of solids, which resolved the problem of the specific heat but not yet the question of why the conduction electrons in metals contribute nothing to the specific heat, while the model describing the conduction electrons as a quasi-free gas moving in a positively charged background worked very well in explaining the Wiedemann-Franz law of proportionality between electric and heat conductivity.

The (to this day) "final" resolution was modern quantum mechanics, discovered more or less independently, in parallel, no fewer than three times: by (a) Heisenberg, Born, and Jordan (with some important help from Pauli) in terms of "matrix mechanics", (b) Schrödinger in terms of "wave mechanics", and (c) Dirac in terms of "transformation theory". Very early on it was clear that these are just different mathematical expressions of the (so far one and only) full modern quantum theory. Even Jordan's idea to "quantize" not only the mechanics of particles but also the electromagnetic field (an idea that had to be rediscovered a few years later by Dirac, since it was at first abandoned as "overdoing the quantum revolution somewhat") turned out to be necessary to get the correct kinetic explanation of the Planck law à la Einstein (1917), with the necessity of having not only absorption and induced emission but also spontaneous emission of "light quanta" within the new theory, and up to today nobody has come up with anything better.

Then there was indeed a unique new issue, namely that of "interpretation", and this too was solved very early (at least in my opinion and that of most physicists) by Born, in a footnote of his paper on the important wave-mechanical treatment of particle scattering: Schrödinger's wave function had to be interpreted probabilistically, i.e., not as a classical field describing a single electron, but as a "probability amplitude" for finding the electron at a given position.

The theory thus turned out to be perfectly causal, i.e., the quantum states, described by wave functions (and more generally by statistical operators), evolve according to a causal law (e.g., as given by the Schrödinger equation for the wave function), but the meaning of this state description is completely probabilistic. Observables like position and momentum (and others like energy and angular momentum) are in general not determined; only probabilities can be predicted for which value will be measured, given a state defined by some preparation procedure that determines the wave function at some initial time (and thereby, by solving the Schrödinger equation, how it has to look at any later time).

In my opinion, after all the decades of hard tests of this conjecture of "irreducible randomness" in the behavior of nature, including some of the most "weird-looking" implications (entanglement), it looks as if nature is indeed "random/indeterministic" on a fundamental level.
 
  • Like
  • Love
  • Informative
Likes Klystron, Schwann, Mentz114 and 2 others
  • #113
julcab12 said:
Not only that. Each type of experimenter yield a different reading. The photon that spectroscopy experimenter uses to explain how spectra are connected to the atoms and molecules is a different concept from the photon quantum optics experimenters talk about when explaining their experiments. Those are different from the photon that the high energy experimenters talk about and there are still other photons the high energy theorists talk about. There are probably even more variants (and countless personal modifications) in use. Definition really varies from any setup and how it reacts to different setup. Even in HEP experimenter's concept behind the readings is - a particle that cannot be observed directly, but is something having position, energy and momentum that helps explain interactions of charged particles among themselves and their behavior in external EM field (Compton's effect, pair creation). That the reason some think that 'maybe' the picture/detention is a phenomenon or mirage of a natural dynamics of interacting things.
This is not true at all. A photon is a photon, and it's described by relativistic QFT (in the form of QED, of course). There's no difference in the notion of a single photon (a one-photon Fock state) between HEP and quantum-optics physicists. Only the emphasis of the theoretical treatment is a bit different, but in the end the measurements are pretty much the same: a photon is registered in one or another kind of macroscopic detector, be it the CCD cam of your smartphone or some e.m. calorimeter in one of the big experiments at the LHC.
 
  • Like
Likes Klystron and DarMM
  • #114
vanhees71 said:
Well, the strength of science is that it tells us first to be open to learning how nature behaves, and that in investigating this, using an interplay of quantitative, reproducible observations and experiments with analytical and mathematical reasoning, we may find that we have to give up prejudices about what we think we know. Nothing in this process is safe from the possibility of needing revision in the light of new discoveries.

This indeed happened twice in the first half of the 20th century. One revision became necessary with the discovery that electromagnetic phenomena are accurately described by Maxwell's theory of the electromagnetic field, yet this theory was inconsistent with the "very sure knowledge" about the mathematical model describing spacetime (or, at that time, the strictly separated time and space) in terms of the Galilei-Newton model. This was such a "sacrosanct" piece of knowledge that it took about 50 years to finally resolve the issue, and it involved the work of some of the most brilliant mathematicians and physicists of the time (Poincaré, Lorentz, Michelson, and finally Einstein). What had to be revised was not the idea of the special principle of relativity, i.e., the invariance of the physical laws under transformations from one inertial reference frame to another, but the very law of how the transformation had to be done: instead of Galilean transformations, the Lorentz transformations (discovered in a predecessor form already by Voigt in the late 1880s), implying a new spacetime model in which space and time are no longer strictly separated but amalgamated into a pseudo-Euclidean affine spacetime manifold. This implied that the laws of mechanics also had to undergo a revision, and the corresponding revisions by Einstein (and corrections thereof by Planck, von Laue, and Einstein himself) were, after some experimental confusion, confirmed by experiments with the then newly discovered electrons in gas-discharge tubes.

This turned out, however, to be a pretty harmless revision. None of the ideas considered really fundamental, which you still insist on more than 100 years later, had to be revised. The physical laws were still strictly deterministic, i.e., by tacit assumption any possible observable of any system always has a determined value, and the principle of causality holds on the fundamental level in a very strong (time-local) form: knowing the initial values of a complete set of observables (that is, a set of observables of which all other observables are functions) at some initial time ##t_0##, their values (and thus those of all possible observables, which are functions thereof) are in principle determined at any later time ##t##.

Now this apparently safe knowledge had to be revised with the advent of problems concerning phenomena that became observable through the ever faster progress of technology. At first these obstacles were considered minor issues. When Planck asked for advice on what to study, a famous physics professor at the university told him that, with his brilliant grades from high school, he shouldn't waste his time on physics, because it was all settled and the "little clouds" on the horizon would simply be resolved by measuring some fundamental constants to ever better precision and by small revisions of the laws of classical physics (at the time consisting of classical (still Newtonian) mechanics, Maxwell electrodynamics, and (phenomenological) thermodynamics).

One of the clouds was not so new to begin with: it was the question of the absolute value of entropy and a theoretical understanding of what's now known as the Gibbs paradox in statistical physics, which was already met with quite some skepticism by the more conservative physicists of the time, since they didn't even believe in the atomistic structure of matter to begin with. With the advent of low-temperature physics (one milestone being Kamerlingh Onnes's achievement of liquefying helium in the early 1900s) the issue became very evident: the specific heat of most substances did not behave as expected at low temperatures. Also, when it became clear that metals have conduction electrons, but that these don't contribute to the specific heat even at room temperature, another "cloud" arose on the horizon.

Then there was the problem of "black-body radiation", which was not describable with classical electrodynamics and thermodynamics/statistical physics. Famously, this was how the quantum revolution started: the accurate measurement of the black-body radiation spectrum as a function of temperature by Rubens and Kurlbaum, originally an attempt to provide a better standard for measuring the efficiency of lighting sources (gas and the new electric light bulbs), led Planck to guess the right law and subsequently to derive it with a brilliant (but at first not really understood) statistical analysis, which only worked under the assumption that each frequency mode of the electromagnetic field could exchange energy only in lumps of ##h \nu=\hbar \omega##. Already this was too much for Planck, but he couldn't find any other way to derive the accurate black-body law, named after him.

The rest of the story is well known. Einstein came up with his light-quanta idea in 1905, then Bohr (completed by Sommerfeld) with his ad hoc idea to explain the hydrogen spectrum from the atomic model forced on the physics community by Rutherford's findings. The very stability of matter, and the precise indistinguishability of atoms of each kind, became an enigma for classical physics. The ad hoc solution for hydrogen was then found to be flawed, because it only worked for hydrogen. By another happy accident it also works for the harmonic oscillator and thus for the lattice vibrations of solids, which resolved the problem of the specific heat but not yet the question of why the conduction electrons in metals contribute nothing to the specific heat, while the model describing the conduction electrons as a quasi-free gas moving in a positively charged background worked very well in explaining the Wiedemann-Franz law of proportionality between electric and heat conductivity.

The (to this day) "final" resolution was modern quantum mechanics, discovered more or less independently, in parallel, no fewer than three times: by (a) Heisenberg, Born, and Jordan (with some important help from Pauli) in terms of "matrix mechanics", (b) Schrödinger in terms of "wave mechanics", and (c) Dirac in terms of "transformation theory". Very early on it was clear that these are just different mathematical expressions of the (so far one and only) full modern quantum theory. Even Jordan's idea to "quantize" not only the mechanics of particles but also the electromagnetic field (an idea that had to be rediscovered a few years later by Dirac, since it was at first abandoned as "overdoing the quantum revolution somewhat") turned out to be necessary to get the correct kinetic explanation of the Planck law à la Einstein (1917), with the necessity of having not only absorption and induced emission but also spontaneous emission of "light quanta" within the new theory, and up to today nobody has come up with anything better.

Then there was indeed a unique new issue, namely that of "interpretation", and this too was solved very early (at least in my opinion and that of most physicists) by Born, in a footnote of his paper on the important wave-mechanical treatment of particle scattering: Schrödinger's wave function had to be interpreted probabilistically, i.e., not as a classical field describing a single electron, but as a "probability amplitude" for finding the electron at a given position.

The theory thus turned out to be perfectly causal, i.e., the quantum states, described by wave functions (and more generally by statistical operators), evolve according to a causal law (e.g., as given by the Schrödinger equation for the wave function), but the meaning of this state description is completely probabilistic. Observables like position and momentum (and others like energy and angular momentum) are in general not determined; only probabilities can be predicted for which value will be measured, given a state defined by some preparation procedure that determines the wave function at some initial time (and thereby, by solving the Schrödinger equation, how it has to look at any later time).

In my opinion, after all the decades of hard tests of this conjecture of "irreducible randomness" in the behavior of nature, including some of the most "weird-looking" implications (entanglement), it looks as if nature is indeed "random/indeterministic" on a fundamental level.

vanhees71, did you write all of the above today, spontaneously?

I think you can be a good chronicler or blogger of the next revolution in physics. We are like in 1899 now, before Planck started the quantum revolution. It's déjà vu all over again.
 
  • #115
Well... the task you set out in the question details, that is, to define quantum theory "as pertaining to appreciation by the senses", is quite impossible.

You see, quantum theory describes precisely what matter does when it is not being sensed (by a human or any other classical instrument). That, really, is the essence of quantum theory: that systems, when they do not interact with classical entities, are in states that have no classical equivalent.

The moment you attempt to make sense of a quantum system using the intuition of classical senses, it ceases to be a quantum system. So while I believe it is possible to develop an intuition for quantum physics, this intuition necessarily has to be abstract, not relying on concepts related to our senses.
 
  • #116
vanhees71 said:
In my opinion, after all the decades of hard tests of this conjecture of "irreducible randomness" in the behavior of nature, including some of the most "weird-looking" implications (entanglement), it looks as if nature is indeed "random/indeterministic" on a fundamental level
I think we need to add two elements to this, as fundamental randomness alone would be satisfied by a normal stochastic process. We have to add incompatibility, i.e. the uncertainty principle. And also the requirement of macro devices, as the quantum formalism does not give a probability for a photon to develop, say, a certain spin component without a classical device measuring it.
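As a minimal numerical illustration of the incompatibility point (a sketch of my own, using spin components along x and z as the simplest pair of incompatible observables):

```python
import numpy as np

# Pauli matrices for spin components along x and z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Non-zero commutator: the two observables are incompatible, so no joint
# ("classical") probability distribution reproduces both statistics at once.
print(sx @ sz - sz @ sx)   # [[0,-2],[2,0]] = -2i*sigma_y, not zero

# A normal stochastic process would assign each run a definite value pair
# (sx, sz); quantum theory does not:
state = np.array([1, 0], dtype=complex)   # sz eigenstate, eigenvalue +1
plus_x = np.array([1, 1]) / np.sqrt(2)    # sx eigenstate, eigenvalue +1
p_up_x = abs(plus_x @ state) ** 2         # real amplitudes, no conjugation needed
print(p_up_x)                             # 0.5: sx maximally undetermined
```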
 
  • #117
lucas_ said:
vanhees71, did you write all of the above today, spontaneously?

I think you can be a good chronicler or blogger of the next revolution in physics. We are like in 1899 now, before Planck started the quantum revolution. It's déjà vu all over again.
Yes, I wrote this just spontaneously. That's why for sure it's far from being accurate, but that's the beauty of forums like this. You can just exchange some ideas :-)).
 
  • Like
Likes Jimster41
  • #118
DarMM said:
I think we need to add two elements to this, as fundamental randomness alone would be satisfied by a normal stochastic process. We have to add incompatibility, i.e. the uncertainty principle. And also the requirement of macro devices, as the quantum formalism does not give a probability for a photon to develop, say, a certain spin component without a classical device measuring it.
But all this IS what's described by QT. It's a kind of probability theory adapted to the real world, discovered by the scientific method of observation and mathematical modeling. I don't know what you mean by a photon developing a certain spin. I guess you mean how to get specific polarization states? That's not so difficult as far as photons in the range of visible light are concerned: just take well-known optical devices like polaroid foil to get linearly polarized light, i.e. linearly polarized photons, and then devices like quarter-wave plates etc. to create any polarization state.
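As a sketch of that recipe in Jones-calculus form (the standard polarization formalism; the specific code and ideal-device matrices are my own illustration):

```python
import numpy as np

H = np.array([1, 0], dtype=complex)   # horizontally polarized input

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer with axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def quarter_wave(theta):
    """Jones matrix of a quarter-wave plate with fast axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ np.diag([1, 1j]) @ R.T   # retard the slow axis by 90 degrees

# Polaroid foil at 45 degrees followed by a QWP with horizontal fast axis:
out = quarter_wave(0) @ polarizer(np.pi / 4) @ H
out /= np.linalg.norm(out)
print(out)   # ~[0.707, 0.707j]: circularly polarized light/photons
```

Rotating the polarizer and wave plate then yields any desired (elliptical) polarization state.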
 
  • #119
Demystifier said:
No, I imply that all detections are macroscopic. But the converse is not true, some macro objects may not be detections.
So then, do you agree with @atyy's definition of micro/macro and the notion that a classical microscopic theory is impossible?
 
  • #120
vanhees71 said:
This is not true at all. A photon is a photon, and it's described by relativistic QFT (in the form of QED, of course). There's no difference in the notion of a single photon (a one-photon Fock state) between HEP and quantum-optics physicists. Only the emphasis of the theoretical treatment is a bit different, but in the end the measurements are pretty much the same: a photon is registered in one or another kind of macroscopic detector, be it the CCD cam of your smartphone or some e.m. calorimeter in one of the big experiments at the LHC.
Of course a photon is a photon. In QFT, for instance, it is the 'thing' from the mode expansion of free fields: a free relativistic field that fulfills the Klein-Gordon equation, written as a Fourier transform. The word "photon" is used in quantum theory both colloquially and technically. My point is that there is no ontological cut with that "thing". All the modern variants allow for creation and destruction. We can talk of the same thing yet register different readings with each experimental setup. It's not about a lack of consistency here. We can register position and delocalization in one setting, but the HE experimenter can only read tracks and scattering events.
 
