Nature Physics on quantum foundations

  • #51
A. Neumaier said:
What is the difference between real and objective? How can something nonreal be objective, and how can something nonobjective be real?
Maybe there is ultimately no distinction, in the sense that an objective quantum state is a representation of some objective character about the system. The distinction I had in mind was nomological vs material (something could be objective but not material), similar to the distinction Goldstein et al make here:

"We propose that the wave function belongs to an altogether different category of existence than that of substantive physical entities, and that its existence is nomological rather than material. We propose, in other words, that the wave function is a component of physical law rather than of the reality described by the law"
 
  • Like
Likes Peter Morgan, physika and Demystifier
  • #52
A. Neumaier said:
How can something nonreal be objective
In this context I think a good example is Lagrangian in classical mechanics. It is objective in the sense that it does not depend on the observer, but nonreal in the sense that it is only a mathematical tool to compute properties of real physical classical objects such as particles.
 
  • #53
Demystifier said:
In this context I think a good example is Lagrangian in classical mechanics. It is objective in the sense that it does not depend on the observer, but nonreal in the sense that it is only a mathematical tool to compute properties of real physical classical objects such as particles.
The Lagrangian depends on the observer, as anyone is free to add a total derivative without changing the dynamics. Thus it is like coordinates.
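Concretely: for any smooth ##F(q,t)##, the action built from ##L' = L + \frac{dF}{dt}## differs from the original only by boundary terms,
$$S'[q] = \int_{t_1}^{t_2}\Big(L + \frac{dF}{dt}\Big)\,dt = S[q] + F(q(t_2),t_2) - F(q(t_1),t_1),$$
and the boundary terms are fixed under variations with ##\delta q(t_1) = \delta q(t_2) = 0##, so ##L## and ##L'## yield the same Euler-Lagrange equations.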
 
  • #54
A. Neumaier said:
The Lagrangian depends on the observer, as anyone is free to add a total derivative without changing the dynamics. Thus it is like coordinates.
That's not what "dependence on observer" in physics means. Even dependence on coordinates, in general, does not imply dependence on observer. For example, if something is not invariant under the transformation from Cartesian to spherical coordinates, it has nothing to do with dependence on observer. The dependence on observer refers to transformations that can be interpreted as physical changes of the observer, for example a spatial translation (corresponding to an observer translated in space), a rotation (corresponding to a rotated observer), or a boost (corresponding to an observer moving with a velocity). You can translate, rotate or boost the observer, but you cannot add a total derivative to the observer.
 
  • #55
Demystifier said:
The dependence on observer refers to transformations that can be interpreted as physical changes of the observer, for example a spatial translation (corresponding to an observer translated in space), a rotation (corresponding to a rotated observer), or a boost (corresponding to an observer moving with a velocity). You can translate, rotate or boost the observer, but you cannot add a total derivative to the observer.
Well, much more depends on observers! Different observers do not even get identical measurement results (unless the observers are idealized).
 
  • #56
A. Neumaier said:
Well, much more depends on observers! Different observers do not even get identical measurement results (unless the observers are idealized).
Since we don't know on which specific property of the observer it depends, we usually interpret it as statistical measurement errors.
 
  • #57
Demystifier said:
Since we don't know on which specific property of the observer it depends,
This area you mention is to me one of the open questions in an intrinsic theory of measurement, i.e. how to generalize quantum process tomography while constrained to using only the information at hand to an inside observer. If one elaborates this, the observer choice will influence much more than just spacetime transformations, as the observer's internal structure at depth will influence what is optimally inferred (via the generalization of "process tomography", but this process is not understood yet).

/Fredrik
 
  • #58
I read the Fröhlich paper and I fail to connect to this thinking or choice of analysis in any significant sense. Perhaps I missed something, but I see some vague exceptions...

"We must therefore clarify what should be added to the formalism of QM in order to capture its fundamentally probabilistic nature and to arrive at a mathematical structure that enables one to describe physical phenom-
ena (“events”) in isolated open systems S, without a need to appeal to the intervention of “observers” with “free will” – as is done in the conventional “Copenhagen Interpretation of QM”


Indeed one could ask what the freedom to choose detector settings in Bell-type gedanken experiments translates to when we imagine that the WHOLE system must evolve unitarily. I.e., when we try to include an "agent" in the system, but described from the perspective of another agent, what does the "freedom to choose measurement" correspond to? I share the idea that this is indeed a kind of random process; i.e., the agents making measurements must be a kind of spontaneous and random stochastic process. In my personal view, I see the agent as doing a random walk (or basically throwing dice). So the "free will" is allowed from the perspective of the external agent, but for the agent itself I think it's just doing a random walk. If we label the freedom to make a random step as free will, then it does not take anything else. But of course the random walk could be "guided" by the agent's subjective bias. So to the external agent it does not necessarily appear random, as randomness would be subjective. Randomness just means inability to predict, which may be due to limited information processing capacity, not too dissimilar to pseudorandom generators.
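To illustrate that last point, here is a generic toy sketch (not specific to any quantum model): a linear congruential generator is fully deterministic, yet its output looks random to anyone lacking the seed and update rule, so the "randomness" is a statement about the observer's information.

```python
# A linear congruential generator (standard Numerical Recipes constants):
# fully deterministic, yet unpredictable to an observer who lacks the seed
# and the update rule -- "randomness" as limited information, not ontology.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    x, out = seed, []
    for _ in range(n):
        x = (a * x + c) % m      # deterministic update
        out.append(x / m)        # rescale to [0, 1)
    return out

print(lcg(seed=42, n=5))         # looks random; exactly reproducible
```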

"(H;U) do not tell us anything interesting about the physics of S, beyond spectral properties of the operators U(t; t0)"

If I interpret what they want to say: the Hamiltonian does not say anything about the "internal structure" of S, and thus about the "physics of the internal interactions". I sympathize with this, as the Hamiltonian is inferred "as a whole" from the outside, which is why for complex systems it lacks insight into the origin, and often brings us into a fine-tuning situation. But I do not see how the ETH view solves anything here. I would prefer to phrase this subquestion so that, if S consists of "interacting observers", then to understand the physics of S (and how its parts are put together) we need to understand the physics of interacting observers on a par with any other interaction, and to construct larger systems from parts by allowing the parts to "communicate" and seeing how the Hamiltonian of such a system emerges from its parts. This would give us insight into the physics of S, AND the overall Hamiltonian of the composite S as seen from an external perspective. But this, to me, requires a new theory, and I do not see how their ETH stance helps out in that quest?

The beef about how unitary evolution of the whole system may not be consistent with stepwise evolution with internal measurements, where one assumes that the classical results obey the Bell-type correlations, does not seem like a problem to me: the latter situation is injecting information that does not exist in the original state, so there is no reason why the two expectations should be the same. I don't consider the latter case an isolated system, so there is no paradox. That the "expectations" for an isolated system are violated when the assumption of isolation is broken is not a conceptual problem.

/Fredrik
 
  • #59
Fra said:
Indeed one could ask what the freedom to choose detector settings in Bell-type gedanken experiments translates to when we imagine that the WHOLE system must evolve unitarily. I.e., when we try to include an "agent" in the system, but described from the perspective of another agent, what does the "freedom to choose measurement" correspond to? I share the idea that this is indeed a kind of random process; i.e., the agents making measurements must be a kind of spontaneous and random stochastic process.
I don't like the use of the word "spontaneous" in this context. In a FAPP sense, we do have "freedom to choose measurement" in modern Bell experiments, but not in a "spontaneously random" way. We use our freedom beforehand to decide on a protocol from which to take the randomness. But it is never an instantaneous randomness; there are always processes where some uncertainty in time is present regarding the moment when the decision got determined.

Fra said:
So to the external agent it does not necessarily appear random, as randomness would be subjective. Randomness just means inability to predict, which may be due to limited information processing capacity, not too dissimilar to pseudorandom generators.
I fear saying "randomness is subjective" without also being willing to take and defend some form of Bayesian interpretation is too lazy. It is simply not the same as saying that "randomness is hardly ever absolute".
 
  • #60
A. Neumaier said:
The Lagrangian depends on the observer, as anyone is free to add a total derivative without changing the dynamics. Thus it is like coordinates.
Well, then take the quotient structure given by the equivalence relation that the difference is a total derivative.

OK, I know that taking the quotient is easier said than done. Sometimes it miraculously just works, like for identical particles in Bohmian mechanics. Other times it doesn't "really" work properly, like for spacetime foliations in Bohmian mechanics. And you can never be sure whether other people really agree that taking some quotient is the right thing to do, or even "necessary" in the first place. Especially in cases where the quotient seems to make trouble, like for spacetime foliations, the number of people willing to bite the bullet and claim that there should be one objective preferred foliation (instead of trying to fix the quotient) quickly grows.

But I find it a bit unfair to attack only the Bohmians in this respect. I think the problem already occurs in mathematics itself for the topological quotient space. Sometimes it is a "patchwork construction", for example when arbitrarily gluing different spaces together, or gluing borders of a single space together to get a completely different space. And sometimes it is a "natural construction", like when taking the quotient by a discrete subgroup which operates continuously on the space.
 
  • #61
vanhees71 said:
Quantum theory is NOT weird but the most comprehensive theory about Nature we have today.

Wave-particle duality is no phenomenon but a theoretical concept that's outdated for about 100 years.
Quantum foundations are essential if you think the purpose of science is understanding the world, but getting right what the actual issues are, and why they are even issues, is vital in discussions about it. We have physicists/philosophers like David Wallace who get the problems right in tomes like The Emergent Multiverse. I do not entirely agree with David, but it is an exciting book by someone who understands physics and philosophy (he has PhDs in both physics and philosophy). It is also helpful in understanding other interpretations like Consistent Histories. I agree with Gell-Mann about what many-worlds means:



Gell-Mann's approach, now called Decoherent Histories, has produced some interesting insights into the emergence of a classical world:

https://www.sciencenews.org/blog/context/gell-mann-hartle-spin-quantum-narrative-about-reality.

These are examples of important work in the area, along with things like Bell's Theorem, which many also get wrong. Again I side with Gell-Mann:



It is just that the scholarship of some, IMHO, is not what it should be, because it is pretty hard. I have fallen for it myself in my musings about QM being understood as something where we interact with quantum systems to know about the quantum world. You were correct in pointing out that rapid progress is being made in applications where this may no longer be true, so it fundamentally can't be understood that way. It may be a helpful idea in motivating its formalism as a mathematical model, but as an explanation it is flawed.

Thanks
Bill
 
Last edited:
  • Like
Likes physika, PeroK, Peter Morgan and 1 other person
  • #62
ohwilleke said:
You can't viscerally experience phenomena associated with quantum but not classical physics without scientific instrumentation.

I am not sure that is something fundamental in QM with the rapid advances being made in applications.

Thanks
Bill
 
  • #63
bhobba said:
These are examples of important work in the area, along with things like Bell's Theorem, which many also get wrong. Again I side with Gell-Mann:
Thanks for that link to Gell-Mann, @bhobba. I side with him to a considerable extent, but I think he misses the different way of understanding Bell's theorem, as well as the Gleason and Kochen-Specker theorems, that I suggest in the article in JPhysA 2022 that I link to above: we can take those theorems to prove that Classical Mechanics is incomplete, the opposite of the usual worry that QM is incomplete. If we take that opposite view, I think we have to ask how we can complete Classical Mechanics, to which there is at least one useful answer: we can use the Poisson bracket, in a very natural way, to construct a noncommutative version of CM, which I call CM+ in Annals of Physics 2020, "An algebraic approach to Koopman classical mechanics", https://arxiv.org/abs/1901.00526 (DOI there). The basic idea is expressed fairly concisely in a slide from a talk I gave to a particle physics seminar in Bogotá in May,
[slide from the Bogotá talk]

To make CM as resourceful as QM, we also have to distinguish between quantum noise and thermal noise by considering their different properties under Lorentz symmetries, but, as far as I currently know, those two changes are enough. If CM+ with quantum noise were exactly the same as QM, this would be uninteresting, but they are different enough to illuminate the measurement problem, which I address in JPhysA 2022, "The collapse of a quantum state as a joint probability construction", although, sadly, not as clearly as I'd like. If anyone cares to look, I recently uploaded the whole set of slides for that talk to Academia, https://www.academia.edu/86450002/The_connection_between_QFT_and_random_fields. My feeling is that von Neumann could have done something very like all this in 1932, right after Koopman pointed out to him how to construct a Hilbert space formalism for CM, which von Neumann and Birkhoff used immediately to prove two versions of the ergodic theorem, but then it was mostly forgotten until Sudarshan in 1976. Even after that, the Koopman formalism has been used almost exclusively for chaos-theory applications instead of using its full power to better understand the relationship between classical and quantum mechanics. Ideas about QM are entrenched enough, however, that it's something of an uphill battle to bring even published work in quite good journals to people's attention enough for them to tell me where they think what I'm doing seems helpful or wrong-headed.
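As a minimal toy illustration of Koopman's idea (my own sketch, not taken from the papers above): the Koopman operator ##K## acts linearly on observables by composition with the dynamics, ##(Kf)(x) = f(T(x))##, even when ##T## itself is nonlinear, and for invertible ##T## on a finite state space ##K## is a unitary permutation matrix.

```python
import numpy as np

# Koopman operator on a finite state space: observables are vectors f,
# and K acts by composition with the dynamics T, (K f)[x] = f[T[x]].
n = 8
T = (3 * np.arange(n) + 1) % n      # an invertible map (3 is coprime to 8)

K = np.zeros((n, n))
K[np.arange(n), T] = 1.0            # row x picks out component T[x]

f = np.random.rand(n)               # an arbitrary observable
assert np.allclose(K @ f, f[T])     # linear action = composition with T

# For invertible T, K is a permutation matrix, hence unitary:
assert np.allclose(K @ K.T, np.eye(n))
```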
 
  • Like
  • Informative
Likes Lord Crc, vanhees71 and bhobba
  • #64
gentzen said:
I don't like the use of the word "spontaneous" in this context. In a FAPP sense, we do have "freedom to choose measurement" in modern Bell experiments, but not in a "spontaneously random" way. We use our freedom beforehand to decide on a protocol from which to take the randomness. But it is never an instantaneous randomness; there are always processes where some uncertainty in time is present regarding the moment when the decision got determined.
By spontaneous I meant to imply that there is an arrow of time here; the agent's learning process / nature's self-organisation should be a dual description of the evolution in time. The random walk has a "direction".

An agent has the "freedom" to make bad choices, but are such agents likely to be abundant? The freedom we have in designed experiments is of course not "natural" or "spontaneous" from our own perspective, but in theory, human experimenting must be a part of nature's self-organisation, and thus spontaneous in a larger context.
Put a sales agent into the market: he has the "freedom" to do whatever he wants! But the abundant ones tend to follow the money (on average)! This is how I see "freedom": a freedom subject to soft constraints, as there will be a selection in favour of "constructive" choices. I view the "quantum agents/observers" in the same way. Any observer is "allowed" a priori, but not all are abundant! It's similar to saying that there are many crazy hypothetical particles that could exist, but they just aren't observed, for a reason.

gentzen said:
I fear saying "randomness is subjective" without also being willing to take and defend some form of Bayesian interpretation is too lazy.
Of course I defend the Bayesian stance :)

/Fredrik
 
  • #65
Peter Morgan said:
we can take those theorems to prove that Classical Mechanics is incomplete, the opposite of the usual worry that QM is incomplete.

Interesting. I know Gleason well (KS is a simple corollary) and will need to look into that when I get some time.

Thanks
Bill
 
  • #66
gentzen said:
Sometimes it miraculously just works, like for identical particles in Bohmian mechanics.
Can you elaborate a bit?
 
  • Like
Likes bhobba and vanhees71
  • #67
Demystifier said:
Can you elaborate a bit?
Sure. Perhaps I should have written "indistinguishable particles" instead of "identical particles". I will elaborate it for Bosons, so that I can ignore the wavefunction for the equivalence relation. Let the trajectories of the ##n## indistinguishable Bosons be ##(x_1(t), \ldots, x_n(t))\in\mathbb R^{3n}##. We want (at least) the equivalence relation that ##(x_{\pi(1)}(t), \ldots, x_{\pi(n)}(t))## is equivalent to ##(x_1(t), \ldots, x_n(t))## for each permutation ##\pi\in S_n##. This "just works" if the wavefunction is invariant under those permutations.
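A small numerical sketch of this first equivalence relation (my own toy illustration, under an obvious array encoding): two trajectory arrays represent the same point of the quotient iff a single fixed relabeling of the particles maps one onto the other at all times.

```python
import numpy as np
from itertools import permutations

# Trajectories as arrays of shape (T, n, 3): T time steps, n particles,
# 3 spatial coordinates.  Equivalence: one fixed permutation of the
# particle labels maps traj_a onto traj_b at ALL times.
def equivalent(traj_a, traj_b, tol=1e-12):
    n = traj_a.shape[1]
    return any(np.allclose(traj_a[:, list(pi), :], traj_b, atol=tol)
               for pi in permutations(range(n)))

rng = np.random.default_rng(0)
traj = rng.normal(size=(50, 3, 3))       # 50 time steps, 3 particles
relabeled = traj[:, [2, 0, 1], :]        # same history, labels permuted

assert equivalent(traj, relabeled)       # same point of the quotient
assert not equivalent(traj, traj + 1.0)  # a genuinely different history
```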

A skeptic might object that we can still use the continuous trajectories to identify a specific particle between different points in time ##t_1## and ##t_2##. Good, but we can prevent that too, by using a "bigger" equivalence relation, on a slightly different space. For example, we can interpret the trajectories as a function ##(x_1, \ldots, x_n)(t) : \mathbb R \to \mathbb R^{3n}## and consider "piecewise constant" permutations ##\pi(t):\mathbb R \to S_n## for the equivalence relation. So ##(x_1, \ldots, x_n)(t)## is "declared" equivalent to ##(x_{\pi(t)(1)}, \ldots, x_{\pi(t)(n)})(t)##. No additional restrictions on the wavefunction are needed (besides those already imposed for the simpler equivalence relation), and now even a specific particle can no longer be identified between different points in time.

Of course, a skeptic might have further objections, but they can all be addressed in one way or another, basically because the quotient somehow "miraculously just works" in this case.
 
  • Like
Likes Demystifier
  • #68
Of course, it's much more efficient to simply not use any kind of trajectories as in the dBB interpretation. They do not provide anything physical to QT anyway. You may solve some philosophical quibble but introduce more complication without gaining any new insights from a scientific point of view.
 
  • #69
vanhees71 said:
Of course, it's much more efficient to simply not use any kind of trajectories as in the dBB interpretation. They do not provide anything physical to QT anyway. You may solve some philosophical quibble but introduce more complication without gaining any new insights from a scientific point of view.
Well, the most efficient way to avoid a need for dBB trajectories is to accept the collapse postulate. :-p

But you don't accept the collapse, hence you don't always strive for efficiency.
Which, indeed, is what makes you a scientist, otherwise you would be just an engineer. Inefficiency is therefore scientific, which implies that dBB interpretation is scientific.

Science is a mixture of engineering and philosophy. Sometimes it strives for efficiency (like engineering), and sometimes for conceptual depth (like philosophy). But when in a danger of looking too much like philosophy, science pretends to be engineering; and when in a danger of looking too much like engineering, it pretends to be philosophy.
 
  • Like
Likes physika, ohwilleke and andrew s 1905
  • #70
There's no need for any collapse postulate either. What happens to the measured system in the process of measurement cannot be postulated anyway, since it depends on the measurement apparatus. E.g., when registering a photon via the photoeffect, the photon is absorbed, and for sure its state has not "collapsed" somehow magically to an eigenstate of the measured observable.

Science is a mixture of engineering (preparation and measurements in the lab) and math (theory/model building by a theorist behind his or her desk ;-)).

Philosophy is a method to invent problems which are not there and then to confuse the scientists about their own well-understood work ;-)).
 
  • #71
There's indeed no need for any collapse postulate in an ontological sense. Berthold-Georg Englert, Marlan O. Scully and Herbert Walther in “Quantum erasure in double-slit interferometers with which-way detectors” (American Journal of Physics, 1999):

We recall: The state vector ##|\Psi>(x)## serves the sole purpose of summarizing concisely our knowledge about the entangled atom-and-photon system; in conjunction with the known dynamics, it enables us to make correct predictions about the statistical properties of future measurements. And a state reduction must be performed whenever we wish to account for newly acquired information about the system. This minimalistic interpretation of state vectors and their reduction is common to all interpretations; it is forced upon us by the abundance of empirical facts that show that quantum mechanics works.
Of course, one might try to go beyond the minimalistic interpretation and give additional ontological meaning to
##|\Psi>(x)##, thereby accommodating some philosophical preconceptions or other personal biases. In doing so, one should however remember van Kampen’s caveat: Whoever endows the state vector with more meaning than is needed for computing observable phenomena is responsible for the consequences (Theorem IV in Ref. 7).” [bold by LJ]
 
  • Like
Likes vanhees71 and bhobba
  • #72
Lord Jestocost said:
There's indeed no need for any collapse postulate in an ontological sense.
...
We recall: The state vector ##|\Psi>(x)## serves the sole purpose of summarizing concisely our knowledge
...
And a state reduction must be performed whenever we wish to account for newly acquired information about the system. This minimalistic interpretation of state vectors and their reduction is common to all interpretations; it is forced upon us by the abundance of empirical facts that show that quantum mechanics works.
...
Of course, one might try to go beyond the minimalistic interpretation and give additional ontological meaning
...
more meaning than is needed for computing observable phenomena
This presumes that the CONTEXT of the computation (i.e. the computer hardware) lacks physical basis and thus ontology, i.e. that "information processing" is not a physical phenomenon but a human endeavour. Even if it makes sense to think of the quantum STATE as non-physical, the state itself is defined by the state of the encoding context.

So in order to compute something, the algorithm is not sufficient; one also needs a processing device to run it on.

In the Copenhagen interpretation, this "hardware" is in the effectively classical environment. In other views it may correspond to the shift in the state of the agent (which is part of the environment, anyway). So I think the ontology of the collapse refers to the ontology of the coding context (environment or agent), not the system itself. But this is also important ontology IMO, which is one of the reasons for my own choice of interpretation. Denying the "ontology of the observer" just does not make sense.

/Fredrik
 
  • #73
Lord Jestocost said:
There's indeed no need for any collapse postulate in an ontological sense.

Of course. The quantum state, especially when you consider Gleason, just allows the calculation of probabilities. It is an aid mathematically required by modelling the results of observations as the eigenvalues of an operator (yes, non-contextuality and a few others like the strong law of superposition are required, but they all are quite intuitive). It is like medium-term climate models that face the same issue as longer-term ones. Due to chaos, all you can do is predict the probabilities of, say, a cyclone forming in the Pacific Ocean hitting a particular city, such as where I live in Brisbane. You wake up one morning, open the window and see it did not hit where you live. What collapsed there? There is a genuine issue with the state, as elucidated by the PBR theorem, but it requires its own thread.

Thanks
Bill
 
Last edited:
  • Like
Likes Peter Morgan
  • #74
vanhees71 said:
There's no need for any collapse postulate either.
Of course, everything can be done without collapse, but from a practical point of view it makes things more complicated. The collapse postulate is the most efficient way to do the job. For example, books on quantum computation for engineers all use the collapse postulate.
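For concreteness, here is a minimal simulator sketch of that recipe (a generic textbook-style construction, not taken from any particular book): sample an outcome with Born-rule probabilities, then replace the state by the renormalized projected state.

```python
import numpy as np

def measure_qubit(psi, k, rng):
    """Projectively measure qubit k (counted from the left) of an
    n-qubit state vector psi of length 2**n, in the computational basis."""
    n = int(np.log2(len(psi)))
    bits = (np.arange(len(psi)) >> (n - 1 - k)) & 1   # value of qubit k per basis state
    p1 = np.sum(np.abs(psi[bits == 1]) ** 2)          # Born rule: P(outcome = 1)
    outcome = int(rng.random() < p1)                  # sample the outcome
    post = psi * (bits == outcome)                    # project onto outcome subspace
    return outcome, post / np.linalg.norm(post)       # collapse + renormalize

rng = np.random.default_rng(1)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)            # (|00> + |11>)/sqrt(2)
a, psi = measure_qubit(bell, 0, rng)
b, psi = measure_qubit(psi, 1, rng)
assert a == b                                         # perfectly correlated outcomes
```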
 
  • #75
vanhees71 said:
Science is a mixture of engineering (preparation and measurements in the lab) and math (theory/model building by a theorist behind his or her desk
No, because math is a part of engineering too.
 
  • #76
Lord Jestocost said:
There's indeed no need for any collapse postulate in an ontological sense. Berthold-Georg Englert, Marlan O. Scully and Herbert Walther in “Quantum erasure in double-slit interferometers with which-way detectors” (American Journal of Physics, 1999):

We recall: The state vector ##|\Psi>(x)## serves the sole purpose of summarizing concisely our knowledge about the entangled atom-and-photon system; in conjunction with the known dynamics, it enables us to make correct predictions about the statistical properties of future measurements. And a state reduction must be performed whenever we wish to account for newly acquired information about the system. This minimalistic interpretation of state vectors and their reduction is common to all interpretations; it is forced upon us by the abundance of empirical facts that show that quantum mechanics works.
Of course, one might try to go beyond the minimalistic interpretation and give additional ontological meaning to
##|\Psi>(x)##, thereby accommodating some philosophical preconceptions or other personal biases. In doing so, one should however remember van Kampen’s caveat: Whoever endows the state vector with more meaning than is needed for computing observable phenomena is responsible for the consequences (Theorem IV in Ref. 7).” [bold by LJ]
Although I love all papers by any of those authors, one must stress that the notation ##|\Psi \rangle(x)## is a most serious sin! ##|\Psi \rangle## is a normalized vector in Hilbert space, describing the pure state ##\hat{\rho}=|\Psi \rangle \langle \Psi |##. The wave function is its representation as (generalized) components with respect to (generalized) position eigenvectors, i.e., ##\Psi(x)=\langle x|\Psi \rangle##. Otherwise the quoted statement is, of course, completely right.
 
  • Wow
  • Informative
Likes Peter Morgan, Demystifier and Lord Jestocost
  • #77
vanhees71 said:
the notation ##|\Psi \rangle(x)## is a most serious sin!
Indeed, terrible notation!
 
  • Haha
Likes Peter Morgan
  • #78
bhobba said:
Of course. The quantum state, especially when you consider Gleason, just allows the calculation of probabilities. It is an aid mathematically required by modelling the results of observations as the eigenvalues of an operator (yes, non-contextuality and a few others like the strong law of superposition are required, but they all are quite intuitive). It is like medium-term climate models that face the same issue as longer-term ones. Due to chaos, all you can do is predict the probabilities of, say, a cyclone forming in the Pacific Ocean hitting a particular city, such as where I live in Brisbane. You wake up one morning, open the window and see it did not hit where you live. What collapsed there? There is a genuine issue with the state, as elucidated by the PBR theorem, but it requires its own thread.

Thanks
Bill
May I emphasize, @bhobba, that it's A quantum state that "just allows the calculation of probabilities", not "The quantum state"? In practice, we use a different Hilbert space for every different kind of experiment, not just a different state as a model for a given state preparation. It looks from your writing here as though you might be open to that difference, but to me there's a pragmatic emphasis in saying "A" for which I think it's worth taking the trouble. If this is a difference you think is irrelevant, I'd like to know what your reasons are, because I consider this to be fundamental to my choice of title for my article "The collapse of a quantum state as a joint probability construction" (no emphasis in the original, though I'd like there to have been). The question of the Heisenberg cut, for example, can be thought to be about what Hilbert space we "ought" to use to describe an experiment, and hence what quantum states and measurement operators are to be considered candidate models for our apparatus, with no evidently correct answer except for tractability.
An aspect of quantum (field) theory that has come to seem strange to me is that because measurements at time-like (or light-like) separation in general do not commute, the straightforward calculation of joint probabilities at time-like separation is not possible. Instead, we compute joint probabilities at time-like separation by computing the probability of the earlier measurement results, collapsing the state, and then computing the probability of the later measurement results in that new state. That rigamarole gives us a joint probability, whereas if we allowed ourselves to use operators that commute at time-like separation, which I claim is a conventional choice, we could obtain a joint probability without a collapse.
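In formulas: with a projector ##P_a## for the earlier measurement and ##P_b## for the later one, both in the Heisenberg picture, that procedure amounts to
$$p(a,b) = \mathrm{Tr}(P_a\rho)\,\mathrm{Tr}\!\Big(P_b\,\frac{P_a\rho P_a}{\mathrm{Tr}(P_a\rho)}\Big) = \mathrm{Tr}\big(P_b\,P_a\rho\,P_a\big),$$
and when ##[P_a,P_b]=0## this reduces to the ordinary joint probability ##\mathrm{Tr}(P_a P_b\rho)##, with no state reduction needed.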
I think the later comments you make about chaos are well-observed, but I point out that the academic literature on chaos theory increasingly uses Koopman's Hilbert space formalism. It's interesting that that literature makes only very limited use of noncommutativity (mostly only for what's called the Koopman operator, which is an integrated form of the Liouvillian operator), which I suppose may be because the people who work on chaos theory have a relatively classical mindset and may even be deliberately avoiding the questions raised by quantum measurement theory, but that doesn't have to be so. Noncommutativity can be used in CM as freely as it is used in QM if we decide to allow it. I won't elaborate on it here, but there is a good mathematical reason to use quantum mechanics: it has a mathematically very useful analyticity in its description of the dynamics of probabilities that classical mechanics does not have.
I've never really understood what gets a new thread on PF. Conversations often go away from the starting point so fast it blows my mind. I can't say I help keep things on track with my own preoccupations:frown: The starting point here, "Nature Physics on quantum foundations", initially attracted me because I was annoyed by that editorial enough that I noticed when it came up on PF.
 
  • Like
Likes mattt and Fra
  • #79
Concerning the original post, "Nature Physics on quantum foundations", I was annoyed enough by that editorial that I wrote to Nature Physics that I felt they don't understand the foundations of QM literature at all if they could begin with "Quantum physics is weird". On September 9th, I posted the following to my Facebook page, which for better or worse is what I mostly use for my whiteboarding these days (you can be thankful I mostly don't use PF, right?),

I'm amazed by this, an editorial in Nature Physics, https://rdcu.be/cVglj (that link should give access to the article, which as a DOI link is https://doi.org/10.1038/s41567-022-01766-x ). It begins with "Quantum physics is weird", which any physicist who is involved in the recent literature ought to know is now enough in question that we are into a new era, so much so that even a popular book from Philip Ball, reflecting that literature, is titled "Beyond Weird".​
Here's the paragraph before last [of the editorial in Nature Physics],​
"Although a fresh view can invigorate any field, much of this work also manifests a disregard for the progress that has been made since quantum mechanics was established. The quantum foundations literature is the product of decades of careful thought about the issues involved in understanding and interpreting the physical world. As with any topic, a failure to constructively engage with existing work runs the risk of repeating earlier mistakes and misunderstandings."​
I entirely agree that there has been remarkable progress over the last Century. I leave as an exercise, however, for anyone who has followed my published work of the last few years, "What literature have they failed to engage with?!?" I can forgive the editors of Nature Physics because I know very well that my work is not perfectly clear and needs another few iterations in the literature (and in their last paragraph they almost save themselves, "the maxim “no one understands quantum mechanics” is a little less true than it used to be, at least in a practical sense", so they almost know this is a "clouds on the horizon" editorial), but they do not know the recent literature on QM well enough —and not just mine— to write this editorial.​

At the same time, I wrote to naturephysics@nature.com with a similar degree of intemperateness because I didn't care whether I got a reply (and I didn't get one), which I followed up with an e-mail to one person on the editorial team in particular, Bart Verberck, because I suspected, with no real evidence, that it was his hand at the first writing of the editorial. Also no reply. I followed up that second e-mail a few days ago, because why not, which, astonishingly to me, elicited a brief reply from the chief editor of Nature Physics, yesterday. The last sentence of that reply is of general interest, I think, and to the credit of the journal,
"But I can re-emphasize that the spirit of the editorial – that quantum foundations is an important area of study – is something that we believe rather strongly and we certainly hope to represent it in our pages in the future."​
The first short paragraph of the reply was a polite, rather nicely phrased setting aside of my work. I had forced them to respond to persuade me to go away —which they didn't need to do because there are other places where I can annoy people more productively, so I wasn't going to write a fourth time— but they didn't have time to read my rather obtuse work with enough care to figure out whether it contains something worthwhile.
The social aspects of this are somewhat incomprehensible to me: if there is something transformative in my work (or, to be clear, in someone else's) that really hasn't been said in either the ancient or recent literature on QM, what does it take to persuade people to read it? Three published articles in recent years in Physica Scripta, Annals of Physics, and Journal of Physics A are not enough, but what is? I don't want to over-claim, because I know that my work is less than perfectly clear and in any case I may just be as wrong as the worst crank, but I can see how my work fits into so many threads in the physics literature that I feel confident there is some good in my work, even though I must be wrong about many details (I always have been in the past and it's usually taken me a few years to see why I've been wrong about even tiny details.) My work is research in the raw about something that has resisted much smarter people than I am for most of a Century, so for even the smallest of good to come from my work is unbelievable enough that I can't find it in myself to fault Nature Physics. Fun, yeah.
 
  • #80
Peter Morgan said:
May I emphasize, @bhobba, that it's A quantum state that "just allows the calculation of probabilities", not "The quantum state"? In practice, we use a different Hilbert space for every different kind of experiment, not just a different state as a model for a given state preparation. It looks from your writing here as though you might be open to that difference, but to me there's a pragmatic emphasis in saying "A" for which I think it's worth taking the trouble. If this is a difference you think is irrelevant, I'd like to know what your reasons are, because I consider this to be fundamental to my choice of title for my article "The collapse of a quantum state as a joint probability construction" (no emphasis in the original, though I'd like there to have been). The question of the Heisenberg cut, for example, can be thought to be about what Hilbert space we "ought" to use to describe an experiment, and hence what quantum states and measurement operators are to be considered candidate models for our apparatus, with no evidently correct answer except for tractability.
Good and rarely expressed point! You caught my interest, and I sympathise with some interesting fragments in your paper from post 63, so I will try to read it. I might not be aware of your previous papers, although your name seems familiar; I might have mixed it up with a math teacher I had in an ODE class, who worked on mathematical physics as well, but it must be someone else... The non-commutative way of coding things is a key, but I will read your paper before rambling. If I have any comments, I might start a separate thread.

/Fredrik
 
  • #81
Peter Morgan said:
Concerning the original post, "Nature Physics on quantum foundations", I was annoyed enough by that editorial that I wrote to Nature Physics that I felt they don't understand the foundations of QM literature at all if they could begin with "Quantum physics is weird".

Peter Shor:

“Quantum mechanics is really strange, and I don’t think there’s ever going to be any easy way of understanding it"

...
 
  • Haha
  • Skeptical
Likes bhobba and Peter Morgan
  • #82
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.
 
  • Like
  • Love
Likes Fra, bhobba and Peter Morgan
  • #83
vanhees71 said:
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.
There is something more to it though. Relativity is also different to our intuition, but far fewer physicists complain about it or work on the foundations/interpretations of it.
 
  • #84
I think the reason is that classical (i.e., non-quantum) physics can be formulated relativistically too, and there is no indeterminism in it. The main quibble with QT for the older generation of physicists was that one has to give up determinism, while a change of the space-time model seemed not that revolutionary for physicists. On the other hand, for some philosophers relativity (particularly the new notion of time, relativity of simultaneity, etc.) was as unacceptable as the indeterminism was for some of the physicists. That's the reason why Einstein explicitly did not get the Nobel prize for relativity: for the Nobel committee the quibbles of some philosophers (particularly Bergson) with the relativistic notion of time were too severe to give a Nobel prize for such a "weird theory of time". Rather they cited Einstein's work on the photoelectric effect, i.e., "old quantum theory", as the prize-worthy work, which is ironic, since this is the only one of the famous three topics of Einstein's miracle year 1905 that is not up to date anymore (not to mention that Einstein's greatest work for sure is his general relativity theory).
 
  • Like
Likes bhobba, Peter Morgan and Lord Jestocost
  • #85
vanhees71 said:
I think the reason is that classical (i.e., non-quantum) physics can be formulated relativistically too, and there is no indeterminism in it.

It is worth quoting Misner et al. on the ultimate breakdown of spacetime (in “Gravitation” by Charles W. Misner, Kip S. Thorne, John Archibald Wheeler, 1973 pp. 1182–1183):

The uncertainty principle thus deprives one of any way whatsoever to predict, or even to give meaning to, 'the deterministic classical history of space evolving in time.' No prediction of spacetime, therefore no meaning for spacetime, is the verdict of the quantum principle. That object which is central to all of classical general relativity, the four-dimensional spacetime geometry, simply does not exist, except in a classical approximation.
 
  • #86
vanhees71 said:
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.
If we have CM+ in hand as well as QM and understand the relationship between QM and CM+ better than we understand the relationship between QM and CM, I hope that reduces the distance between QM/CM+ and GR enough to make a difference.
I think we also have to understand interacting QFTs better for us to be able to make progress with GR, which I address in a so far unpublished paper on arXiv, https://arxiv.org/abs/2109.04412, which currently has the title "A source fragmentation approach to interacting quantum field theory". I'm considering changing the title to "Interacting quantum fields as an inverse problem", however. Instead of modeling interactions as happening everywhere between point-like measurements, we encode the effects of the space between measurements by preprocessing the test functions that describe how the different measurements are "focused" (because measurements are never at a point), so that we can then use those preprocessed "fragments" with non-interacting fields between the measurements to compute the n-measurement "unfocused" Wightman functions. All of which is possible because of the Reeh-Schlieder theorem and an analysis of renormalization that I have not seen elsewhere.
vanhees71 said:
I think the reason is that classical (i.e., non-quantum) physics can be formulated relativistically too, and there is no indeterminism in it. The main quibble with QT for the older generation of physicists was that one has to give up determinism, while a change of the space-time model seemed not that revolutionary for physicists.
My view has become that CM is only deterministic if almost all degrees of freedom are included in a model. That seems to me never to be the case, all the more so if the dynamics between the included and excluded degrees of freedom is chaotic. Consequently, I think we have almost no choice but to work with a statistical state and with measurement devices that are either included or excluded in the model.
As soon as we work with statistics and probabilities, Boole already knew in 1854 that sometimes a pair of relative frequencies do not admit a joint relative frequency that has that pair as marginals. With the benefit of hindsight, we can call such a pair of relative frequencies "incompatible", which can be understood to be the mathematics that underlies noncommutativity in QM, but 70 years earlier than QM. To cut a long story short, there are practical benefits to working with incompatible relative frequencies using probability measures, characteristic functions, and Hilbert space methods.
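A tiny worked instance of Boole's point (my own toy check, not from Boole's text): for three ±1-valued variables, every joint assignment satisfies ##ab+bc+ca \ge -1##, so the three pairwise relative frequencies with ##E[AB]=E[BC]=E[CA]=-1## admit no joint distribution, even though each pair on its own is perfectly possible.

```python
import itertools

# Brute-force the Boole-type bound: for a, b, c in {-1, +1},
# ab + bc + ca >= -1, so E[AB] + E[BC] + E[CA] >= -1 for ANY joint law,
# ruling out the pairwise correlations E[AB] = E[BC] = E[CA] = -1.
vals = (-1, +1)
bound = min(a*b + b*c + c*a for a, b, c in itertools.product(vals, repeat=3))
assert bound == -1
```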
Philosophically, I feel that we can never know that we have included all degrees of freedom, even if we have, because we can never be sure that there isn't something happening at smaller scales than we have so far investigated. This is what seems to me an unanswerable question: "Are there turtles all the way down?" If there are different kinds of turtles at every scale and all the way down, then our dynamical models will never include all of the details and we run into some severe mathematical problems because we have to introduce the axiom of choice to fix the initial conditions and there's no guarantee that anything will be continuous or differentiable, pretty much wiping out predictability for classically understandable reasons. That doesn't have to stop us, but if that is the way the world is then I think we have to work with models that are constructed top-down. That seems to me to put us in the realm of thermodynamics more than of statistical mechanics.
 
  • Like
Likes vanhees71 and gentzen
  • #87
vanhees71 said:
It's only strange if you don't accept that Nature behaves differently to our intuition, trained by being immersed in a "classical macroscopic world". In fact it's a very successful description of all (known) matter from the elementary building blocks (as far as we know them), quarks and leptons, gauge bosons, and the Higgs boson, via hadrons, atomic nuclei, atoms, molecules, and condensed matter.

The big enigma is, of course, a satisfactory quantum description of the gravitational interaction.

Better to say that we still don't understand nature, and QM is just an approximation, and maybe it can be superseded by another broader and deeper theory or model.
 
Last edited:
  • #88
physika said:
Better to say that we still don't understand nature, and QM is just an approximation, and maybe it can be superseded by another broader and deeper theory or model.

In that sense, we do not understand anything. They are just models valid in a certain context. Classical Mechanics is a limiting case of QM. QM likely will be supplanted by something more general eventually - but nobody knows. Whatever replaces it will face the same issue. It's the nature of science, as Feynman often pointed out. If we find a theory of everything, that would be great, but if it is like peeling layers of an onion, that is just as interesting and equally as great. I believe there is some profound symmetry principle lying at the foundation of the world - but that is just a belief - nature is as nature is. The enterprise of science is finding out more about it.

Thanks
Bill
 
  • Like
Likes vanhees71 and Lord Jestocost
  • #89
Peter Morgan said:
If we have CM+ in hand as well as QM and understand the relationship between QM and CM+ better than we understand the relationship between QM and CM, I hope that reduces the distance between QM/CM+ and GR enough to make a difference.
I think we also have to understand interacting QFTs better for us to be able to make progress with GR, which I address in a so far unpublished paper on arXiv, https://arxiv.org/abs/2109.04412, which currently has the title "A source fragmentation approach to interacting quantum field theory". I'm considering changing the title to "Interacting quantum fields as an inverse problem", however. Instead of modeling interactions as happening everywhere between point-like measurements, we encode the effects of the space between measurements by preprocessing the test functions that describe how the different measurements are "focused" (because measurements are never at a point), so that we can then use those preprocessed "fragments" with non-interacting fields between the measurements to compute the n-measurement "unfocused" Wightman functions. All of which is possible because of the Reeh-Schlieder theorem and an analysis of renormalization that I have not seen elsewhere.

My view has become that CM is only deterministic if almost all degrees of freedom are included in a model. That seems to me never to be the case, all the more so if the dynamics between the included and excluded degrees of freedom is chaotic. Consequently, I think we have almost no choice but to work with a statistical state and with measurement devices that are either included or excluded in the model.
As soon as we work with statistics and probabilities, Boole already knew in 1854 that sometimes a pair of relative frequencies do not admit a joint relative frequency that has that pair as marginals. With the benefit of hindsight, we can call such a pair of relative frequencies "incompatible", which can be understood to be the mathematics that underlies noncommutativity in QM, but 70 years earlier than QM. To cut a long story short, there are practical benefits to working with incompatible relative frequencies using probability measures, characteristic functions, and Hilbert space methods.
Philosophically, I feel that we can never know that we have included all degrees of freedom, even if we have, because we can never be sure that there isn't something happening at smaller scales than we have so far investigated. This is what seems to me an unanswerable question: "Are there turtles all the way down?" If there are different kinds of turtles at every scale and all the way down, then our dynamical models will never include all of the details and we run into some severe mathematical problems because we have to introduce the axiom of choice to fix the initial conditions and there's no guarantee that anything will be continuous or differentiable, pretty much wiping out predictability for classically understandable reasons. That doesn't have to stop us, but if that is the way the world is then I think we have to work with models that are constructed top-down. That seems to me to put us in the realm of thermodynamics more than of statistical mechanics.
I'm not so sure that an alternative or extended version of classical point-particle mechanics helps to get an idea of how to formulate a "quantum spacetime" or "quantum gravity". For that one needs a field theory. Point particles are ill-defined already in classical relativistic physics, as manifests itself in the notorious radiation-reaction problem for charged point particles in electromagnetic theory and the non-existence of interacting many-body point-particle systems.
 
  • Like
Likes bhobba and Peter Morgan
  • #90
vanhees71 said:
I'm not so sure that an alternative or extended version of classical point-particle mechanics helps to get an idea of how to formulate a "quantum spacetime" or "quantum gravity". For that one needs a field theory. Point particles are ill-defined already in classical relativistic physics, as manifests itself in the notorious radiation-reaction problem for charged point particles in electromagnetic theory and the non-existence of interacting many-body point-particle systems.
I'm not sure, either. As I say in the comment you quote from, it's more a hope. As far as I know, the steps I've taken to rethink the measurement problem are not a commonplace (though there are a few precedents in Belavkin and Tsang&Caves, and I think there are strands in the literature that have been working towards the steps I have taken, particularly in work on Koopman's Hilbert space formalism for classical mechanics), and the steps I've taken to rethink the renormalization "problem" (yeah, I know many physicists think the RG means there isn't a problem) are completely new.

As always, any step taken can turn out to be less worthwhile than its enthusiasts think it is, but any step taken can also allow other steps, either next week or in 50 years, after other developments. Koopman's work on Hilbert space formalism for CM dates from 1931 and had almost zero effect for decades, except that von Neumann and Birkhoff proved two different versions of the ergodic theorem using Koopman's idea, but, for me at least, reading his work through modern literature on quantum measurement theory, noncommutative probability, quantum field theory, et cetera, gives a perspective that is transformative. A few people find my telling of this compelling, but most physicists are not committing themselves. If this in fact transforms our ideas about physics over the next few years, then hesitating for just the right amount of time is rational, but hesitating too long will be a bad idea. If it doesn't, then hesitating was the right thing to do. Time will tell.

As to the field theory, I'm all about that. I don't do it well, but that's where I come from and where I'm going. The CM+ and measurement problem stuff came out of my long-term thinking about the relationship between QFT and random fields. You can see my "Bell inequalities for random fields" in JPhysA 2006, https://arxiv.org/abs/cond-mat/0403692 (DOI there), for example, which is little cited but I have still not seen a discussion of Bell's ideas about beables that better folds in QFT and classical noisy fields (for the latter, correlations at space-like separation are the norm at or near equilibrium, which invalidates Bell's reasoning in his "The theory of local beables" without having to introduce superdeterminism, constrained free will, and all that; I can only understand that this isn't a commonplace by now because everybody is so fixated on how photons are flying around their apparatuses, which seems like proof that field theory is not as much present as I think it should be in most people's thinking.)

I think your focus on the radiation-reaction problem is well-taken. When faced with an ill-defined mathematical model, a standard move is to use a dual construction instead. That's what I think the Wightman axioms do by working with "test functions" instead of with operator-valued distributions. I move further in that direction by working with an "inverse problem" approach. As I say above, I think the "fragmentation" language is OK but not good enough, but I often find that that kind of tiny shift of perspective can take me a year or two to find and get used to.
 
  • Like
Likes bhobba and vanhees71
  • #91
gentzen said:
I will elaborate it for Bosons, so that I can ignore the wavefunction for the equivalence relation.
Even for Fermions, I can just ignore the wavefunction for the equivalence relation. I realized this when the skeptic in me started to ponder over a "serious omission" in the given quotient construction:

If we interpret the trajectories as a function ##(x_1, \ldots, x_n)(t) : \mathbb R \to \mathbb R^{3n}## and consider "piecewise constant" permutations ##\pi(t):\mathbb R \to S_n##, then ##(x_{\pi(t)(1)}, \ldots, x_{\pi(t)(n)})(t)## is only "piecewise continuous". So it is not a "strong" solution of the guiding equation. Weakening the continuity requirements is possible (and needed, because otherwise uniqueness of solution together with continuity allows identification of particles between different times), but it feels very much like a "patchwork construction".

Turns out my attempt to illustrate the character of "unnatural" constructions as "patchwork constructions" in the initial reply to A. Neumaier failed to identify the crucial points. Gluing together the endpoints of a closed interval to get a circle is an "unnatural" construction, but taking an open interval and identifying two small open intervals at both ends (pointwise) with each other to get a circle is not an "unnatural" construction, despite allowing patchwork. (And the "quotient by a discrete subgroup" fails even worse to identify the crucial points: "discrete" is neither necessary nor sufficient, and "subgroup" instead of "group" was superfluous.)

I guess the point of "natural quotient" constructions is rather that at least locally, the real work should already be finished before taking the quotient, so that it doesn't make a difference for "local topological" constructions whether they are applied before or after taking the quotient.

vanhees71 said:
Of course, it's much more efficient to simply not use any kind of trajectories as in the dBB interpretation. They do not provide anything physical to QT anyway. You may solve some philosophical quibble but introduce more complication without gaining any new insights from a scientific point of view.
Of course, it is completely unclear what "much more efficient" is supposed to mean in this context. The meaning of "anything physical" is clearer, but for me it is enough that dBB and quotient constructions provide "something mathematical".

The relationship between "mathematical constructions" and "philosophical quibbles" has always been a complicated one. Legend has it that the Pythagoreans killed the discoverer of irrational numbers. And Zeno of Elea attacked the continuum on philosophical grounds. In both cases, the attacked concepts turned out to harbor hidden complexities and dangers, but the attacks themselves failed to clearly isolate them or show a way forward. Maybe philosophers are better at articulating hidden problems than at helping to overcome them.

At least for me, reading philosophical texts is sometimes both fun and useful. SEP comes to mind, and also:
gentzen said:
..., then I read An Interpretive Introduction to Quantum Field Theory cover to cover. It was easy to read, with a good mix of technical details, explanations, interpretations, and philosophical clarification.
… much of the interpretive work Teller undertakes is to understand the relationship and possible differences between quantum field-theory — i.e., QFT as quantization of classical fields — and quantum-field theory — i.e., a field theory of ‘quanta’ which lack radical individuation, or as Teller says, “primitive thisness.”
Teller made a considerable effort to help his readers grasp how radically different truly "indistinguishable particles" are from anything in our everyday experience. Looking at them in dBB, on the other hand, shows you how they require (anti-)symmetric wavefunctions and some form of discontinuity. As always, the discontinuity required in dBB is "too nonlocal" compared to what you actually need. And of course, lessons from dBB apply primarily to non-relativistic QM.
 
  • #92
martinbn said:
There is something more to it though. Relativity is also contrary to our intuition, but far fewer physicists complain about it or work on its foundations/interpretations.
Here is why we don't have as much work on foundations/interpretations of relativity theory as we do on quantum mechanics per Zeilinger:
Physics in the 20th century is signified by the invention of the theories of special and general relativity and of quantum theory. Of these, both the special and the general theory of relativity are based on firm foundational principles, while quantum mechanics lacks such a principle to this day. By such a principle, I do not mean an axiomatic formalization of the mathematical foundations of quantum mechanics, but a foundational conceptual principle. In the case of the special theory, it is the Principle of Relativity, ... . In the case of the theory of general relativity, we have the Principle of Equivalence ... . Both foundational principles are very simple and intuitively clear. ...
I submit that it is because of the very existence of these fundamental principles and their general acceptance in the physics community that, at present, we do not have a significant debate on the interpretation of the theories of relativity. Indeed, the implications of relativity theory for our basic notions of space and time are broadly accepted.
 
  • Like
  • Love
Likes Lord Crc, bhobba, DrChinese and 3 others
  • #93
RUTA said:
Here is why we don't have as much work on foundations/interpretations of relativity theory as we do on quantum mechanics per Zeilinger:
I agree about the situation. But do we agree on the conclusion from it?

I think, for example, that the question of WHY there is an observer-invariant upper limit on communication speed is an important one to answer. Taking it as an axiom or empirical observation is, I think, not satisfactory. When deriving SR from axioms, the details of the EM field or of light play no distinguished role; just the existence of an invariant common maximum speed (in 3D space) is enough. What is it about the construction or emergence of space that explains this? If we can answer that (and I think we should try), then I think we will get many clues towards unifying GR and QM.

But I still agree there is value in pointing (as you try to do) to potential principle explanations of QM as well. It does not give me peace of mind, though.

/Fredrik
 
  • #94
After all, space is just an "index" of events that gives them relational structure. How is this index built and defined from the observer that distinguishes the events? If we start by thinking of "classical pointers", don't we already, with the word "classical", imply not only "macroscopic" but also embedding in classical 3D space? How can one imagine a "classical observer" prior to spacetime?

/Fredrik
 
  • #95
Fra said:
I agree about the situation. But do we agree on the conclusion from it?

I think, for example, that the question of WHY there is an observer-invariant upper limit on communication speed is an important one to answer. Taking it as an axiom or empirical observation is, I think, not satisfactory. When deriving SR from axioms, the details of the EM field or of light play no distinguished role; just the existence of an invariant common maximum speed (in 3D space) is enough. What is it about the construction or emergence of space that explains this? If we can answer that (and I think we should try), then I think we will get many clues towards unifying GR and QM.

But I still agree there is value in pointing (as you try to do) to potential principle explanations of QM as well. It does not give me peace of mind, though.

/Fredrik
In a principle explanation, the empirically discovered fact is not fundamental; it must be justified by a compelling fundamental principle. For example, in special relativity we explain length contraction as follows:

Relativity principle ##\rightarrow## justifies the light postulate ##\rightarrow## which dictates length contraction.
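For the record, the last arrow is the usual calculation: the light postulate fixes the Lorentz transformation, and applying it to the worldlines of the endpoints of a rod moving with speed ##v## gives

$$L = L_0\sqrt{1 - v^2/c^2}.$$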

So, the "invariant upper limit on communication speed" (light postulate) is not the fundamental explanans in special relativity, the relativity principle is. As Norton notes (https://sites.pitt.edu/~jdnorton/papers/companion.pdf):
Until this electrodynamics emerged, special relativity could not arise; once it had emerged, special relativity could not be stopped.
Wouldn't it be nice if we had a principle explanation of entanglement that was just as compelling?
 
  • Like
Likes bhobba and vanhees71
  • #96
From the relativity principle (i.e., the equivalence of all inertial frames and the assumption of their existence), together with further symmetry assumptions (homogeneity of time, Euclidean geometry of space for all inertial observers), you can derive that there are two spacetime models: Galilei-Newton spacetime (with the Galilei group as its symmetry group) and Einstein-Minkowski spacetime (with the Poincaré group as its symmetry group). Deciding which one describes nature is an empirical matter, i.e., you cannot separate fundamentals from empirics. All fundamental laws must be grounded in solid empirical evidence.
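For readers who haven't seen this derivation: the symmetry assumptions force the boosts into a one-parameter family (this is the classic von Ignatowsky-type result),

$$t' = \frac{t - K v x}{\sqrt{1 - K v^2}}, \qquad x' = \frac{x - v t}{\sqrt{1 - K v^2}},$$

with the same constant ##K \ge 0## for all observers: ##K = 0## gives the Galilei transformations and ##K = 1/c^2 > 0## the Lorentz transformations, and only experiment can fix the value of ##K##.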

I don't see where there should be a general difference between the foundations of quantum theory and the description of space and time. The only difference is the lack of determinism in quantum theory and the possibility of classical, deterministic physics within the spacetime models, which have successfully served as the foundation of both classical and quantum physics.

The reluctance to accept "irreducible randomness" in the (observed!) behavior of Nature seems to me to be just a psychological phenomenon without any objective empirical foundation. At least, there is only empirical evidence for this irreducible randomness, and none against it.
 
  • #97
vanhees71 said:
From the relativity principle (i.e., the equivalence of all inertial frames and the assumption of their existence), together with further symmetry assumptions (homogeneity of time, Euclidean geometry of space for all inertial observers), you can derive that there are two spacetime models: Galilei-Newton spacetime (with the Galilei group as its symmetry group) and Einstein-Minkowski spacetime (with the Poincaré group as its symmetry group). Deciding which one describes nature is an empirical matter, i.e., you cannot separate fundamentals from empirics. All fundamental laws must be grounded in solid empirical evidence.

I don't see where there should be a general difference between the foundations of quantum theory and the description of space and time. The only difference is the lack of determinism in quantum theory and the possibility of classical, deterministic physics within the spacetime models, which have successfully served as the foundation of both classical and quantum physics.

The reluctance to accept "irreducible randomness" in the (observed!) behavior of Nature seems to me to be just a psychological phenomenon without any objective empirical foundation. At least, there is only empirical evidence for this irreducible randomness, and none against it.
So what is the fundamental principle of QM that is similar to the invariance of the speed of light?
 
  • #98
It's the notion of quantum states, represented by statistical operators on a Hilbert space, together with their probabilistic interpretation.
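Spelled out, that probabilistic interpretation is the Born rule: a state is a positive operator ##\hat\rho## with ##\mathrm{Tr}\,\hat\rho = 1##, and the probability for the outcome ##a## of a measurement, represented by the projector ##\hat P_a##, is

$$p(a) = \mathrm{Tr}\bigl(\hat\rho\,\hat P_a\bigr).$$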
 
  • #99
martinbn said:
So what is the fundamental principle of QM that is similar to the invariance of the speed of light?
There are different ways to render a principle account of QM. Bohr did it using the quantum postulate (discontinuous jumps between stationary states) and the correspondence principle (quantum transitions correspond to harmonics of classical motion); Heisenberg used these to generate his matrix formulation of QM. More recently (starting in the 1990s) we have information-theoretic reconstructions of QM: e.g., Rovelli based his on non-commutativity, Bub based his on non-Boolean algebraic structure, and Hardy based his on the continuity of reversible transformations between pure states for the qubit.

That last one leads to Brukner and Zeilinger's fundamental principle of Information Invariance & Continuity, which is the equivalent of the light postulate for SR. In information-theoretic form it's not as transparent as the light postulate, but when instantiated physically it means the measurement of Planck's constant is invariant between inertial reference frames related by spatial rotations or translations. So, the invariant measurement of the speed of light between inertial reference frames related by boosts leads to SR, and the invariant measurement of Planck's constant between inertial reference frames related by spatial rotations and translations leads to QM. The relativity principle is the compelling fundamental principle justifying these empirically-discovered facts in both cases.
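A concrete physical instantiation of that invariance (my illustration here, in the spirit of the Stern-Gerlach examples used in that literature): measuring the spin of a spin-1/2 particle along any unit vector ##\hat n## yields only the two outcomes ##\pm\hbar/2##, with probabilities

$$p(\pm \mid \hat n) = \tfrac{1}{2}\bigl(1 \pm \hat n \cdot \hat a\bigr)$$

for a state polarized along ##\hat a##. Rotating the apparatus changes the probabilities, but never the invariant outcome values ##\pm\hbar/2##.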
 
  • #100
Of course, QT must be compatible with the spacetime model you use, and that's why you come to unitary ray representations of the Galilei or Poincaré group for the Newtonian and special-relativistic spacetime models; ##\hbar## is a scalar parameter, because it's just an arbitrary conversion factor used to define the SI units.
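Concretely, in any such ray representation the position and momentum generators satisfy

$$[\hat x_j, \hat p_k] = \mathrm{i}\hbar\,\delta_{jk},$$

and rescaling ##\hbar## amounts to nothing more than a change of the unit in which momentum (or action) is measured.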
 