I Quantum mechanics is not weird, unless presented as such

  • #351
As you know, though, the thermodynamic limit has in some cases shown noncomputability of the spectral gap in quantum many-body theory, which is even worse than nondeterminism, so it's a double-edged sword.
 
  • #352
stevendaryl said:
If you treat Brownian motion using statistical mechanics, then it's deterministic. If you analyze a dust particle suspended in a liquid, your statistical mechanics will give a probability distribution for the location of the particle as a function of time
That makes it nondeterministic. Once probabilities are the basic quantities, one has a stochastic system. Note that in any classical stochastic system, probabilities have a deterministic dynamics, but they nevertheless describe stochastic, nondeterministic processes.

To go from the probabilities to the actual events is the classical version of collapse; cf. the companion thread. But nobody working on stochastic processes uses that weird language for it.

On the other hand, for a system in equilibrium (which involves a thermodynamic limit), quantum statistical mechanics produces the deterministic equations of equilibrium thermodynamics, where no trace is left of anything probabilistic or stochastic. This is quite unlike Brownian motion, which is about the interaction of a macroscopic fluid and a microscopic 1-particle system, restricted to the microscopic system. Stochasticity characterizes the microscopic world, but is foreign to much of the macroscopic world - even when the latter is described as a quantum system.
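To make the distinction concrete, here is a minimal sketch (a toy one-dimensional random walk, chosen purely for illustration): the probability vector follows a deterministic evolution law, while any single realization is still a random sample path.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: symmetric random walk on sites 0..N-1 with reflecting ends.
N = 11
T = np.zeros((N, N))           # column-stochastic transition matrix
for i in range(N):
    T[max(i - 1, 0), i] += 0.5
    T[min(i + 1, N - 1), i] += 0.5

# Deterministic dynamics of the probabilities: p(t+1) = T p(t).
p = np.zeros(N)
p[N // 2] = 1.0                # start with certainty at the middle site
for _ in range(50):
    p = T @ p                  # probability is conserved exactly at each step

# A single realization, by contrast, is a random sample path.
x = N // 2
for _ in range(50):
    x = min(max(x + rng.choice([-1, 1]), 0), N - 1)
print(p.round(3), x)
```

The deterministic object is the distribution; the stochastic object is the path.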
 
  • #353
ddd123 said:
shown noncomputability [...] which is even worse than nondeterminism
?

We already cannot compute most things about most classical systems with more than a few degrees of freedom, so the whole discussion about theoretical limits of computability is moot.
 
  • #354
In quantum theory the probabilities are also deterministic in the sense that the statistical operator and the operators representing observables follow deterministic equations of motion. That doesn't make quantum theory a deterministic theory in the usually understood sense. Determinism means that, as in classical physics, all observables have a determined value at each time, and these values change via an equation of motion which lets you compute any value at any time ##t>t_0##, if you know these values at a time ##t_0##.
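For contrast, a minimal classical example of determinism in this sense (a harmonic oscillator in units with ##m=\omega=1##; the setup is illustrative only): knowing ##(x,p)## at ##t_0## fixes ##(x,p)## at every later time.

```python
import numpy as np

# Classical determinism: for a harmonic oscillator (units m = omega = 1),
# the state (x, p) at t0 fixes the state at every later time exactly.
def evolve(x0, p0, t):
    x = x0 * np.cos(t) + p0 * np.sin(t)
    p = -x0 * np.sin(t) + p0 * np.cos(t)
    return x, p

# Evolving for 0.5 in one step equals evolving twice for 0.25:
x1, p1 = evolve(1.0, 0.0, 0.5)
xa, pa = evolve(*evolve(1.0, 0.0, 0.25), 0.25)
print(np.isclose(x1, xa) and np.isclose(p1, pa))  # True: no randomness anywhere
```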
 
  • #355
A. Neumaier claims that quantum mechanics has no weirdness, despite demonstrations that objects as small as photons can share properties over more than a kilometer in Bell theorem tests. This sort of fuzzyheaded thinking has led to a "mass boson" called the Higgs which is so massive it cannot exist for a fraction of a second, despite the evidence the Universe has existed for 13 billion years. So the physicists "cook the books" with "virtual particles", and where the claims of "magic" cannot be refuted (as in entanglement), they simply demand it be accepted without explanation. No mechanism, nothing to see here, move along now.

Quantum mechanics isn't weird, but the explanations we have historically accepted are wrong. We will discover better ones.
 
  • #356
A. Neumaier said:
That makes it nondeterministic. Once probabilities are the basic quantities, one has a stochastic system. Note that in any classical stochastic system, probabilities have a deterministic dynamics, but they nevertheless describe stochastic, nondeterministic processes.

Then I misunderstand what you mean about the thermodynamic limit of QFT being deterministic.

To go from the probabilities to the actual events is the classical version of collapse; cf. the companion thread. But nobody working on stochastic processes uses that weird language for it.

That's because it's pretty clear what the relationship is between the actual events and the statistical model: The actual case is one element of an ensemble of cases with the same macroscopic description. The collapse is just a matter of updating knowledge about which case we are in.

On the other hand, for a system in equilibrium (which involves a thermodynamic limit), quantum statistical mechanics produces the deterministic equations of equilibrium thermodynamics, where no trace is left of anything probabilistic or stochastic.

I wouldn't say that. Equilibrium thermodynamics can be interpreted probabilistically: the actual system has a probability ##e^{-\beta E_j}/Z## of being in state ##j##, where ##E_j## is the energy of state ##j##, ##\beta = \frac{1}{kT}##, and ##Z## is the partition function. (Something more complicated has to be done to take into account continuum-many states in classical thermodynamics...)

You can use the equilibrium thermodynamics to compute distributions on particle velocities, and thus to analyze the stochastic behavior of a dust particle suspended in a fluid.
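As a concrete illustration, the quoted formula can be evaluated directly for a toy three-level spectrum (the energies below are made up purely for illustration, in units with ##kT=1##):

```python
import numpy as np

# Gibbs probabilities p_j = exp(-beta*E_j)/Z for a toy three-level system.
E = np.array([0.0, 1.0, 2.0])   # made-up energy levels E_j
beta = 1.0                      # beta = 1/(kT), with kT = 1
w = np.exp(-beta * E)           # Boltzmann weights
Z = w.sum()                     # partition function
p = w / Z                       # probability of being in state j
mean_E = p @ E                  # ensemble-mean energy <E>
print(p, mean_E)                # probabilities sum to 1
```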
 
  • #357
C Davidson said:
A. Neumaier claims that quantum mechanics has no weirdness, despite demonstrations that objects as small as photons can share properties over more than a kilometer in Bell theorem tests. This sort of fuzzyheaded thinking has led to a "mass boson" called the Higgs which is so massive it cannot exist for a fraction of a second, despite the evidence the Universe has existed for 13 billion years. So the physicists "cook the books" with "virtual particles", and where the claims of "magic" cannot be refuted (as in entanglement), they simply demand it be accepted without explanation. No mechanism, nothing to see here, move along now.

Quantum mechanics isn't weird, but the explanations we have historically accepted are wrong. We will discover better ones.

I've been one of the ones arguing on the side of QM being weird (or at least, nonlocal), but the stuff that you're saying about the Higgs isn't really relevant to these foundational issues. There is a distinction between the Higgs "field" and the Higgs "particle". The particle is fluctuations in the field, and those fluctuations might be short-lived. But the field itself is stable over billions of years (if not forever---it may not be forever).

Anyway, I think it's important to distinguish between two different kinds of weirdness:
  1. A topic can seem baffling and weird to a novice, because it involves unfamiliar concepts, or because familiar concepts no longer apply. This is a matter of learning the subject thoroughly. Special Relativity seems bizarre to those first exposed to it, but after you become familiar with it, and understand it, much (all?) of the weirdness disappears.
  2. There can be lingering questions about the foundations of a topic, even after someone has thoroughly mastered the topic.
A. Neumaier is claiming that the only weirdness of QM is of type 1: If you understand it in the right way, then it stops being weird. I claim that there is some type 2 weirdness.

There might be unanswered foundational questions about the Higgs or the use of virtual particles in calculations, but I don't think so. I think that the weirdness there is due to lack of understanding of the (very complicated) subject. I think you're talking about type 1 weirdness.
 
  • #358
vanhees71 said:
In quantum theory the probabilities are also deterministic in the sense that the statistical operator and the operators representing observables follow deterministic equations of motion. That doesn't make quantum theory a deterministic theory in the usually understood sense. Determinism means that, as in classical physics, all observables have a determined value at each time, and these values change via an equation of motion which lets you compute any value at any time ##t>t_0##, if you know these values at a time ##t_0##.

So in what sense is the thermodynamic limit of QFT deterministic?
 
  • #359
stevendaryl said:
That's because it's pretty clear what the relationship is between the actual events and the statistical model: The actual case is one element of an ensemble of cases with the same macroscopic description. The collapse is just a matter of updating knowledge about which case we are in.
Yes, and in the quantum case it is the same, if you drop the word ''macroscopic''.
stevendaryl said:
I wouldn't say that. Equilibrium thermodynamics can be interpreted probabilistically: the actual system has a probability ##e^{-\beta E_j}/Z## of being in state ##j##, where ##E_j## is the energy of state ##j##, ##\beta = \frac{1}{kT}##, and ##Z## is the partition function. (Something more complicated has to be done to take into account continuum-many states in classical thermodynamics...)
Equilibrium thermodynamics doesn't have the concept of a partition function. One needs statistical mechanics to relate the former to a probabilistic view of matter.
stevendaryl said:
You can use the equilibrium thermodynamics to compute distributions on particle velocities, and thus to analyze the stochastic behavior of a dust particle suspended in a fluid.
You can use statistical mechanics to do that, but not equilibrium thermodynamics, which is a 19th century classical theory that doesn't have a notion of particles. Statistical mechanics is much more versatile than thermodynamics, as one isn't limited to locally homogeneous substances.
 
  • #360
A. Neumaier said:
Yes, and in the quantum case it is the same, if you drop the word ''macroscopic''.

But that sounds like a hidden-variables theory of the type that is supposed to not exist.

Equilibrium thermodynamics doesn't have the concept of a partition function. One needs statistical mechanics to relate the former to a probabilistic view of matter.

Okay. I'm lumping thermodynamics and statistical mechanics together.
 
  • #361
stevendaryl said:
So in what sense is the thermodynamic limit of QFT deterministic?
In the sense that it results in 19th century classical thermodynamics. In the latter theory there are known, exact, nonrandom relations between the thermodynamic quantities, and one can predict (from a thermodynamic potential and the values of a few state variables) the results of all reversible changes with certainty. No thermodynamics textbook mentions randomness (unless it refers to an underlying microscopic picture, i.e., to statistical mechanics).
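A trivial illustration of this kind of determinism, using the ideal-gas equation of state as a stand-in for "a thermodynamic potential plus a few state variables":

```python
# Given a handful of state variables, classical thermodynamics returns
# exact, nonrandom predictions. Ideal-gas toy example (SI units):
R = 8.314462618                 # molar gas constant, J/(mol K)
n, T, V = 1.0, 300.0, 0.0224    # amount (mol), temperature (K), volume (m^3)
P = n * R * T / V               # pressure follows with certainty, not as a distribution
print(P)                        # about 1.11e5 Pa
```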
 
Last edited:
  • #362
stevendaryl said:
that sounds like a hidden-variables theory of the type that is supposed to not exist.
Well, I argued that it might be nonlocal hidden variables - namely all those that describe the neglected environment. No Bell-type theorem excludes this possibility, and statistical mechanics demands that these variables must be taken into account. The only open question is whether these abundant hidden variables are enough to explain everything random. My strong suspicion is that they are.
 
  • #363
A. Neumaier said:
Well, I argued that it might be nonlocal hidden variables - namely all those that describe the neglected environment. No Bell-type theorem excludes this possibility, and statistical mechanics demands that these variables must be taken into account. The only open question is whether these abundant hidden variables are enough to explain everything random. My strong suspicion is that they are.

I had to leave the discussion for a while, because I was overly busy with my paying job, so I may have missed something, but it seems to me that taking into account the environment can't possibly resolve the nondeterminism using only unitary evolution. My argument is pretty simple:

Let ##|\psi_U\rangle## be a state (including an electron, a Stern-Gerlach device, and the environment) which leads to measurement outcome "spin-up" for a spin measurement. Let ##|\psi_D\rangle## be a state which leads to measurement outcome "spin-down". Then the state ##|\psi_?\rangle = \alpha |\psi_U\rangle + \beta |\psi_D\rangle## would be a state that would lead to an undetermined outcome of the measurement. Maybe you can argue that there is no way to produce the state ##|\psi_?\rangle##, but it certainly exists in the Hilbert space, and it's not at all obvious to me that it would be unachievable.
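The argument can be sketched numerically in a toy two-dimensional model, with the two pointer states taken as an assumed orthonormal basis:

```python
import numpy as np

# Toy two-outcome model: |psi_U> and |psi_D> as orthonormal pointer states.
psi_U = np.array([1.0, 0.0], dtype=complex)
psi_D = np.array([0.0, 1.0], dtype=complex)

# A normalized superposition alpha|psi_U> + beta|psi_D>.
alpha, beta = 0.6, 0.8
psi = alpha * psi_U + beta * psi_D

# Born-rule probabilities: neither outcome is determined by the state.
p_up = abs(np.vdot(psi_U, psi)) ** 2    # = |alpha|^2 = 0.36
p_down = abs(np.vdot(psi_D, psi)) ** 2  # = |beta|^2 = 0.64
print(p_up, p_down)
```

Unitary evolution preserves superpositions, which is why such a state, if preparable, would have no determined outcome.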
 
  • #364
stevendaryl said:
it seems to me that taking into account the environment can't possibly resolve the nondeterminism using only unitary evolution. My argument is pretty simple:

Let ##|\psi_U\rangle## be a state (including an electron, a Stern-Gerlach device, and the environment) which leads to measurement outcome "spin-up" for a spin measurement. Let ##|\psi_D\rangle## be a state which leads to measurement outcome "spin-down". Then the state ##|\psi_?\rangle = \alpha |\psi_U\rangle + \beta |\psi_D\rangle## would be a state that would lead to an undetermined outcome of the measurement. Maybe you can argue that there is no way to produce the state ##|\psi_?\rangle##, but it certainly exists in the Hilbert space, and it's not at all obvious to me that it would be unachievable.
This is a well-known argument, used already long ago by Wigner, I believe.

But it is not valid in my setting: Here, in the algebra of linear operators of some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are described by the projections of this universal density matrix to the algebra of linear operators of the tiny Hilbert space describing the observable system under investigation.
Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.
 
Last edited:
  • #365
A. Neumaier said:
This is a well-known argument, used already long ago by Wigner, I believe.

But it is not valid in my setting, where, in some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are projections of this universal density matrix to the tiny Hilbert space describing the microscopic system under investigation.
Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.

I think I understand your point, but it still seems like a tremendous leap. The same argument I made earlier can be lifted up to the level of the universal density matrix, I would think. Why does the universal density matrix necessarily lead to definite outcomes for all possible experiments? Is there a way to prove this for a typical density matrix, or are you assuming some kind of "fine-tuning" of the initial density matrix to ensure that it's true?

Mathematically, I think what you're saying might be something along the lines of the following:

Let ##\rho## be the density matrix of the universe at some time (let's pick a frame/coordinate system so that we can talk about the state at one time). Then the claim might be that there is a decomposition of ##\rho## into the form ##\rho = \sum_j p_j |\psi_j\rangle \langle \psi_j|##, where the ##|\psi_j\rangle## form an orthonormal basis such that for each ##j##, all macroscopic quantities (such as the outcomes of measurements) have definite values. I don't see why that should be the case.

(You can always write ##\rho = \sum_j p_j |\psi_j\rangle \langle \psi_j|##, but you are not guaranteed that your favorite set of observables, the macroscopic values of measurement results, will be diagonal in the basis ##|\psi_j\rangle##.)
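The parenthetical point is easy to check numerically: for a generic density matrix, the eigenbasis of ##\rho## need not diagonalize an independently chosen observable. (The random ##3\times 3## matrices below are a toy example, not a physical model.)

```python
import numpy as np

rng = np.random.default_rng(1)

# A random 3x3 density matrix: positive semidefinite with unit trace.
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# Spectral decomposition rho = sum_j p_j |psi_j><psi_j|.
p, V = np.linalg.eigh(rho)
assert np.allclose(V @ np.diag(p) @ V.conj().T, rho)

# An observable M chosen independently of rho is generally NOT diagonal
# in the eigenbasis of rho.
M = np.diag([0.0, 1.0, 2.0])
M_in_rho_basis = V.conj().T @ M @ V
off_diag = M_in_rho_basis - np.diag(np.diag(M_in_rho_basis))
print(np.abs(off_diag).max())   # nonzero: the two bases do not coincide
```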
 
  • #366
stevendaryl said:
Why does the universal density matrix necessarily lead to definite outcomes for all possible experiments? Is there a way to prove this for a typical density matrix, or are you assuming some kind of "fine-tuning" of the initial density matrix to ensure that it's true?
I only need to assume that the observed part of the universe is approximately in local equilibrium. This is amply corroborated by experiment, and provides a very strong constraint on the universal density matrix. Indeed, local equilibrium is just the assumption needed to derive fluid mechanics or elasticity theory from quantum field theory, and for more than a century we have described every macroscopic object in these terms. Thus only those density matrices qualify as typical that satisfy this experimental constraint.
In my book (see post #2 of this thread), I call the corresponding states Gibbs states.
stevendaryl said:
Let ##\rho## be the density matrix of the universe at some time (let's pick a frame/coordinate system so that we can talk about the state at one time). Then the claim might be that there is a decomposition of ##\rho## into the form ##\rho = \sum_j p_j |\psi_j\rangle \langle \psi_j|##, where the ##|\psi_j\rangle## form an orthonormal basis such that for each ##j##, all macroscopic quantities (such as the outcomes of measurements) have definite values. I don't see why that should be the case.
This is obviously not the case but this was not my claim. We do not need definite values but only values accurate enough to match experimental practice. This is a much less severe condition.

We all know from classical nonequilibrium thermodynamics that the macroscopic local observables are a small set of fields (in the simplest case just internal energy density and mass density). We also know from statistical mechanics in the grand canonical ensemble that these are given microscopically not by eigenvalues but by certain well-defined expectations. Under the assumption of local equilibrium, the fluctuations of the corresponding averaged quantum fields around the expectations are negligible. Thus the values of the macroscopic effective fields (obtained by corresponding small-scale averaging in the statistical coarse-graining procedure) are sharp for all practical purposes.

Mathematically, this becomes exact only in the thermodynamic limit. But for observable systems, which have finite extent, one can estimate the uncertainties through the standard fluctuation formulas of statistical mechanics. One finds that for macroscopic observations at the human length and time scale, we typically get engineering accuracy. This is the reason why engineering was already successful long before the advent of quantum mechanics.
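The scaling behind this estimate can be illustrated with a toy sampling experiment (Gaussian single-particle values are an assumption made purely for illustration); the relative fluctuation of an ##N##-particle average shrinks like ##1/\sqrt{N}##:

```python
import numpy as np

rng = np.random.default_rng(2)

# Relative fluctuation of an N-particle average, estimated by sampling.
rels = []
for N in (10, 1_000, 100_000):
    samples = rng.normal(loc=1.0, scale=1.0, size=(100, N))
    means = samples.mean(axis=1)            # coarse-grained average over N values
    rels.append(means.std() / means.mean()) # relative fluctuation of the average
print(rels)  # shrinks by roughly 1/sqrt(100) at each step
```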
 
  • #367
A. Neumaier said:
I only need to assume that the observed part of the universe is approximately in local equilibrium. This is amply corroborated by experiment, and provides a very strong constraint on the universal density matrix. Indeed, local equilibrium is just the assumption needed to derive fluid mechanics or elasticity theory from quantum field theory, and for more than a century we have described every macroscopic object in these terms. Thus only those density matrices qualify as typical that satisfy this experimental constraint.
In my book (see post #2 of this thread), I call the corresponding states Gibbs states.

But to me, the question is about quantum theory, not empirical observations. Does QM predict those observations?
 
  • #368
stevendaryl said:
So in what sense is the thermodynamic limit of QFT deterministic?
Don't ask me. I don't understand this claim at all.
 
  • #369
A. Neumaier said:
This is obviously not the case but this was not my claim. We do not need definite values but only values accurate enough to match experimental practice. This is a much less severe condition.

I think that's just a clarification of what I mean by "macroscopic quantities". I like your suggestion of giving coarse-grained descriptions of the mass-energy density, and field values. If the description is coarse enough, then the uncertainty principle doesn't get in the way of knowing the "macroscopic state of the universe" to that level of accuracy.
 
  • #370
The apparently "deterministic" behavior of macroscopic systems is precisely due to a sufficiently "blurred" view of them. One way to see this is to derive semiclassical transport models from QFT: the (fully quantum) Kadanoff-Baym equations become a Boltzmann equation in the quasiparticle limit, applying a leading-order gradient expansion.
 
  • #371
stevendaryl said:
I think that's just a clarification of what I mean by "macroscopic quantities". I like your suggestion of giving coarse-grained descriptions of the mass-energy density, and field values. If the description is coarse enough, then the uncertainty principle doesn't get in the way of knowing the "macroscopic state of the universe" to that level of accuracy.

The question is: Can the universe be in a superposition of states that have different macroscopic states? If not, why not?
 
  • #372
stevendaryl said:
But to me, the question is about quantum theory, not empirical observations. Does QM predict those observations?
Quantum theory is derived from empirical observations and organizes these into a coherent whole. Quantum field theory predicts - under the usual assumptions of statistical mechanics, which include local equilibrium - hydrodynamics and elasticity theory, and hence everything computable from it.

Of course it predicts only the general theoretical structure, since all the detail depends on the initial conditions. But it predicts in principle all material properties, and quantum chemists are doing precisely that. All these are essentially exact predictions of QFT, with errors dominated by the computational techniques available rather than by the uncertainty due to the averaging. Together with prepared or observed initial conditions it predicts the values of the macroscopic observables at later times. For example, computational fluid dynamics is an essential tool for the optimization of modern aircraft.

Local equilibrium itself is usually justified in an ad hoc way by assuming fast relaxation scales. These can probably be derived, too, but I haven't seen a derivation. But one knows when this condition is not satisfied in practice - namely when the mean free path length is too long. This happens for very dilute gases, where the Boltzmann equation must be used instead of the hydrodynamic equations (and can be derived from QFT).
 
  • #373
stevendaryl said:
The question is: Can the universe be in a superposition of states that have different macroscopic states? If not, why not?
In the view I outlined above, the universe is not in a pure state but in a Gibbs state in which local equilibrium holds to a good approximation: a mixture ##\rho=e^{-S/k}##, where ##S## is an entropy operator and ##k## the Boltzmann constant.

The more precisely one wants to describe the state of the universe, the more complex the form of ##S## becomes. Local equilibrium means that one considers the approximation where ##S## is an integral over local fields; this leads to hydrodynamics. The next, more accurate approximation is microlocal equilibrium, where ##S## is an integral over microlocal (phase-space) fields; this leads to kinetic theory (the Boltzmann equation and the Kadanoff-Baym equations). Critical-point studies go selectively even beyond that, to make predictions of critical exponents.
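A minimal numerical sketch of the Gibbs form ##\rho=e^{-S/k}##, with a random Hermitian toy matrix standing in for the entropy operator ##S## and units in which ##k=1##:

```python
import numpy as np

rng = np.random.default_rng(3)

# A toy Hermitian "entropy operator" S; rho = exp(-S) in units with k = 1.
B = rng.normal(size=(4, 4))
S = (B + B.T) / 2                        # symmetric, hence Hermitian

# Matrix exponential via the spectral decomposition of S.
vals, V = np.linalg.eigh(S)
rho = V @ np.diag(np.exp(-vals)) @ V.T

# Normalize to unit trace (equivalently, shift S by a constant multiple of I).
rho /= np.trace(rho)

eigs = np.linalg.eigvalsh(rho)
print(np.trace(rho), eigs)               # trace 1; all weights positive: a mixture
```

Since every eigenvalue of ##e^{-S/k}## is strictly positive, such a state is necessarily a mixture, never a pure state.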
 
Last edited:
  • #374
A. Neumaier said:
Well, I argued that it might be nonlocal hidden variables - namely all those that describe the neglected environment. No Bell-type theorem excludes this possibility, and statistical mechanics demands that these variables must be taken into account. The only open question is whether these abundant hidden variables are enough to explain everything random. My strong suspicion is that they are.
Interesting how you imagine these "non-local hidden variables" and their effects... In particular, are they actual variables, i.e. do they get changed by some processes? I think this is critical for distinguishing them from LHV models, because constant "variables", even if called "non-local" in some sense, can in my opinion always be modeled by local copies. Only their non-local change - in other words, spooky action at a distance - is what sets a model apart from LHV models and allows Bell violations.
 
  • #375
A. Neumaier said:
This is a well-known argument, used already long ago by Wigner, I believe.

But it is not valid in my setting: Here, in the algebra of linear operators of some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are described by the projections of this universal density matrix to the algebra of linear operators of the tiny Hilbert space describing the observable system under investigation.
Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.

I'd like to have your expert opinion on the Conway-Kochen theorem. http://arxiv.org/pdf/quant-ph/0604079.pdf and http://arxiv.org/pdf/0807.3286.pdf
 
Last edited by a moderator:
  • #376
georgir said:
are they actual variables, i.e. do they get changed by some processes?
They change according to the Schroedinger equation of the universe, which determines how ##\rho(t)## depends on time. The Hamiltonian would be known if we had a common generalization of the standard model and gravitation.
 
  • #377
Hornbein said:
I'd like to have your expert opinion on the Conway-Kochen theorem. http://arxiv.org/pdf/quant-ph/0604079.pdf and http://arxiv.org/pdf/0807.3286.pdf
I don't think the paper has any relevance. The will of the experimenter is not relevant for Bell-type experiments, as all choices can be made by automatic devices. (See https://www.physicsforums.com/posts/5347224/ , especially point 9.)

In particular, the assumption made in their theorem is highly unrealistic. The choices made by an automatic device always depend on its internal state and its input, hence are in some sense determined by the information available to the device.

There is also no reason to believe that things would be different with humans, although here the definition of ''free will'' is beset with philosophical difficulties.
 
Last edited by a moderator:
  • #378
A. Neumaier said:
I don't think the paper has any relevance. The will of the experimenter is not relevant for Bell-type experiments, as all choices can be made by automatic devices. (See https://www.physicsforums.com/posts/5347224/ , especially point 9.)

In particular, the assumption made in their theorem is highly unrealistic. The choices made by an automatic device always depend on its internal state and its input, hence are in some sense determined by the information available to the device.

There is also no reason to believe that things would be different with humans, although here the definition of ''free will'' is beset with philosophical difficulties.
Aha. So you are a superdeterminist, like 't Hooft? You are correct: the theorem does not exclude this possibility.
 
  • #380
I got a pingback on my blog from someone with a question/comment about my blog post concerning 'Wrong idea...' but I can't find the post and I don't know who asked the question. Please feel free to contact me through my blog (there's a 'contact me' option there) if you would like a reply. Thanks.
 
Last edited:
  • #381
rkastner said:
I got a pingback on my blog from someone with a question/comment about my blog post concerning 'Wrong idea...' but I can't find the post and I don't know who asked the question.
Off topic, but I wouldn't post email addresses on a public forum; it invites a flood of spam. Today's services are filtered, but you increase the risk tenfold if not more. I may be wrong.
 
  • #382
ddd123 said:
Off topic, but I wouldn't post email addresses on a public forum; it invites a flood of spam. Today's services are filtered, but you increase the risk tenfold if not more. I may be wrong.
Thanks, fixed it
 
  • #383
A. Neumaier said:
Here, in the algebra of linear operators of some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are described by the projections of this universal density matrix to the algebra of linear operators of the tiny Hilbert space describing the observable system under investigation.
Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.
Further discussion of this part (concerning reality described by a universal density matrix), if any, please in this new thread!
 
  • #384
One offshoot of this discussion (and the twin discussion of an associated experimental setting) is that I arrived at a new, improved understanding of relativistic causality. This settles (for me) all problems with causality in Bell-type theorems, and reduces the weirdness of nonlocality experiments to a problem in the psychology of knowledge. The residual weirdness is only of the same kind as the weirdness of being able to know what happens if some object falls into a classical black hole and when it will hit the singularity, although no information can escape from a black hole.

Thus the quantum case is not really different from the classical case in this respect. This throws light on the true, social role of weirdness in quantum mechanics.

People very experienced in a particular area of real life can easily trick those who don't understand the corresponding matter well enough into believing that seemingly impossible things can happen. This is true in the classical domain, amply documented by magic tricks in which really weird things happen, such as rabbits being pulled out of empty hats.

The art of a magician consists in studying particular potentially weird aspects of Nature and presenting them in a context that emphasizes the weirdness. Part of the art consists of remaining silent about the true reasons why things work rationally, since then the weirdness is gone, and with it the entertainment value.

The same is true in the quantum domain. Apart from being technically very versed experimental physicists, people like Anton Zeilinger are quantum magicians entertaining the world with well-prepared quantum weirdness. And the general public loves it! Judging by its social impact, quantum weirdness will therefore never go away as long as highly reputed scientists are willing to play this role.
 
  • #385
A. Neumaier said:
One offshoot of this discussion (and the twin discussion of an associated experimental setting) is that I arrived at a new, improved understanding of relativistic causality. This settles (for me) all problems with causality in Bell-type theorems, and reduces the weirdness of nonlocality experiments to a problem in the psychology of knowledge. The residual weirdness is only of the same kind as the weirdness of being able to know what happens if some object falls into a classical black hole and when it will hit the singularity, although no information can escape from a black hole.

Honestly, I didn't understand this argument at all. As I said in the thread, the weirdness is in the correlated results themselves. Sure, we can anticipate them due to past experiments, but how is this different from what maline was saying: "QM is not weird because it's correct"? That seems to be your argument, but then you say it isn't. I am at a loss.
 
  • #386
ddd123 said:
the weirdness is in the correlated results themselves.
Similarly, in relativity, the weirdness lies in the fact that different observers measure different clock times. It is weird only until you have a good mental scheme for thinking about it. People coming across relativity for the first time find it weird (and therefore intriguing, since it seems like a magical part of reality), but after getting accustomed to it, they regard it as common sense.

stevendaryl had complained...

stevendaryl said:
The problem that I have with QM is that it is so unclear what its semantics are. Is the wave function a description of the state of the world, or is it a description of our knowledge about the world? Or somehow both? Neither alternative really fits all the facts comfortably. Then there is the discrepancy between the objects described by the mathematical formalism (amplitudes for different possibilities) and what is actually observed (definite values for whatever is measured). Special Relativity similarly shows up a huge difference between what the theory says and what our observations show, but in the SR case, what things look like to an observer can be derived from what they are, at an objective level. In QM, there seems to be a fundamental distinction between observations and the underlying equations of physics, which means that the former is not completely explained by the latter.

...that the weirdness in quantum mechanics is different, since there is no good mental picture (''semantics''), which is why people have struggled with different interpretations for nearly a century now. In the other thread I separated subjective from objective aspects, clarified the semantics of what causality should mean, and showed how the subjective aspects of knowledge create the apparent causality problems. Unfortunately, it didn't seem to help him. But the discussion clarified a lot for me.
 
  • #387
Yes I understood the purpose of your argument, I just don't understand the argument.
 
  • #388
Several argumentative and off-topic posts, and the responses to them, have been removed. I remind all members to please stay on topic and civil in your discussions. Please see PF Terms and Rules for more info.
 
  • #389
One implication of the title of this thread is that in some quarters QM is considered to be weird. Does this suggested weirdness apply to the subject as a whole or only to certain specific aspects of the subject? If the latter is the case, then what parts of QM are supposed to be weird? I'm reasonably familiar with some aspects of so-called quantum weirdness as reported in the non-specialist literature, but I would be interested to know if there are any specialist QM practitioners who find all or parts of the subject to be weird.
Thank you.
 
  • #390
Dadface said:
I would be interested to know if there are any specialist QM practitioners who find all or parts of the subject to be weird.
Popular quantum magicians are at the same time very experienced specialist QM practitioners in quantum optics. They at least like to create for their audience the impression that parts of quantum mechanics are weird. This is common to magicians in any field, and not specific to quantum mechanics.

But since they understand their profession, I don't think any of our quantum magicians thinks that quantum mechanics is truly weird. It is fully rational to the mind sufficiently trained in mathematics and theoretical physics. This is why I think (and expressed in the title of the thread) that it is only the presentation that makes quantum mechanics appear weird.
 
  • #391
A. Neumaier said:
Popular quantum magicians are at the same time very experienced specialist QM practitioners in quantum optics. They at least like to create for their audience the impression that parts of quantum mechanics are weird. This is common to magicians in any field, and not specific to quantum mechanics.

But since they understand their profession, I don't think any of our quantum magicians thinks that quantum mechanics is truly weird. It is fully rational to the mind sufficiently trained in mathematics and theoretical physics. This is why I think (and expressed in the title of the thread) that it is only the presentation that makes quantum mechanics appear weird.

Forgive me if I am wrong about this, but I have the impression that the main target audience for your book is expert QM practitioners and teachers. I think your book might have much wider appeal if you included a brief opening section summarising those aspects of the subject which may appear to be weird.
 
  • #392
Dadface said:
Forgive me if I am wrong about this but I have the impression that the main target audience for your book are the expert QM practitioners and teachers. I think your book might have much wider appeal if you included a brief opening section summarizing those aspects of the subject which may appear to be weird.
My book is for those who (perhaps do not yet but) want to understand quantum mechanics on a serious level and have sufficient background in linear algebra and analysis. The course the book is based on was for mathematics master's students. But most physics students can probably read it too, after they have mastered a course on classical mechanics covering the Lagrangian and Hamiltonian approach and the Poisson bracket.

In the book, I don't even mention weirdness! Thus people can see that one can set up everything of theoretical and practical interest in quantum mechanics without encountering anything weird. Opening the book with a chapter on quantum weirdness would defeat that purpose.

The book is a blueprint for possible courses on weirdless quantum mechanics. But it would be far more work than I can presently afford to actually turn it into a textbook that could replace a standard introduction to quantum mechanics. Thus it is explicitly designed as complementary reading for a standard textbook on quantum mechanics. But those prepared to invest some serious effort can study the book by itself. I even had feedback from several 16-year-old self-learners who profited from the book.
 
  • #393
Neumaier: would you consider the Copenhagen interpretation weird? After all you propose your own thermal interpretation. If so, it may not just be a matter of exposition but of mathematical interpretation (collapse is pretty weird for example).
 
  • #394
A. Neumaier said:
My book is for those who (perhaps do not yet but) want to understand quantum mechanics on a serious level and have sufficient background in linear algebra and analysis. The course the book is based on was for mathematics master's students. But most physics students can probably read it too, after they have mastered a course on classical mechanics covering the Lagrangian and Hamiltonian approach and the Poisson bracket.

In the book, I don't even mention weirdness! Thus people can see that one can set up everything of theoretical and practical interest in quantum mechanics without encountering anything weird. Opening the book with a chapter on quantum weirdness would defeat that purpose.

The book is a blueprint for possible courses on weirdless quantum mechanics. But it would be far more work than I can presently afford to actually turn it into a textbook that could replace a standard introduction to quantum mechanics. Thus it is explicitly designed as complementary reading for a standard textbook on quantum mechanics. But those prepared to invest some serious effort can study the book by itself. I even had feedback from several 16-year-old self-learners who profited from the book.

I understand. Thank you and good luck with your book.:smile:
 
  • #395
ddd123 said:
Neumaier: would you consider the Copenhagen interpretation weird? After all you propose your own thermal interpretation. If so, it may not just be a matter of exposition but of mathematical interpretation (collapse is pretty weird for example).
Yes, the Copenhagen interpretation is weird. Not because of Bell-type experiments but for much more elementary reasons. A particle has no properties unless measured; in particular it has no position and no momentum. Then how can a particle emitted from a source in a particular direction know that it has to appear some roughly predictable time later on the screen on the other side of the room in this direction? How can we analyze any experiment if we do not assume that the particles we prepare in our laboratory are indeed in the laboratory and stay there, so that position makes at least approximately sense? The Copenhagen interpretation is all-or-nothing, which is completely incompatible with how we think about quantum mechanics in actual experiments. It is valid only in very special circumstances where attention is focused exclusively on a few discrete quantum degrees of freedom. The collapse, an integral part of the Copenhagen interpretation, is provably invalid for position measurement. Upon a position measurement, the state of a system never goes into an eigenstate of the position operator since such eigenstates don't exist. And lots of similar things are wrong with the Copenhagen interpretation. It is a can of worms if you open it...

Thus in my view, the Copenhagen interpretation in the form of the traditional textbook postulates is a very idealized approximation to a description relating quantum mechanics and reality. It is a relic of the early days when quantum experiments were restricted to very simple systems and a theory for realistic measurements didn't exist. It survives only because it is in so many textbooks, since it allows writers and teachers to spell out the foundations of quantum mechanics in 3-5 axioms (depending on who formulates the details) together with two standard experiments to make the axioms look plausible - and then never return to it, practicing shut-up-and-calculate instead. The price for this apparent simplicity is that all those who want a better understanding of quantum mechanics are haunted for the rest of their lives by the resulting quantum weirdness.
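The remark above that position eigenstates don't exist can be made explicit. A minimal sketch of the standard argument, in conventional notation (not taken from the post itself): a formal eigenfunction of the position operator ##\hat{x}## at ##x_0## would have to be a Dirac delta, which is not square-integrable and hence not a vector in the Hilbert space ##L^2(\mathbb{R})##.

```latex
\hat{x}\,\psi(x) = x_0\,\psi(x)
\;\Rightarrow\;
\psi(x) = c\,\delta(x - x_0),
\qquad
\|\psi\|^2 = |c|^2 \int_{\mathbb{R}} \delta(x - x_0)^2 \, dx = \infty .
```

The position operator therefore has purely continuous spectrum; the collapse postulate, which demands projection onto an eigenstate, cannot literally apply to it.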
 
  • #396
But Prof. Neumaier, or Arnold, whichever you prefer (I know that German is a much more formal language): we have the concept of so-called unsharp measurements, which are a way to circumvent strict collapse for observables with a (partially) continuous spectrum. The only international (text)book I know of that briefly discusses this is "Quantum Mechanics" by Claude Cohen-Tannoudji, Bernard Diu and Franck Laloë, pp. 263-266 of the 1st edition of the English translation.
 
  • #397
dextercioby said:
we have the concept of the so-called unsharp measurements, that is a way to circumvent strict collapse for observables with (partially) continuous spectrum.
Yes, that's why I called the Copenhagen interpretation
A. Neumaier said:
a relic of the early days when quantum experiments were restricted to very simple systems and a theory for realistic measurements didn't exist.
Unsharp measurements model realistic measurements in a much better way and can account for particles having an unsharp position and momentum. But such measurements flatly contradict the Copenhagen interpretation and at least some formulations of the Born rule, for example the version stated in Wikipedia's article on Born's rule:
Wikipedia said:
The Born rule states that if an observable corresponding to a Hermitian operator ##A## with discrete spectrum is measured in a system with normalized wave function ##|\psi\rangle## (see bra–ket notation), then
  • the measured result will be one of the eigenvalues ##\lambda_i## of ##A##, and
  • the probability of measuring a given eigenvalue ##\lambda_i## will equal ##\langle\psi|P_i|\psi\rangle##, where ##P_i## is the projection onto the eigenspace of ##A## belonging to ##\lambda_i##.
In the case where the spectrum of ##A## is not wholly discrete, the spectral theorem proves the existence of a certain projection-valued measure ##Q##, the spectral measure of ##A##. In this case,
  • the probability that the result of the measurement lies in a measurable set ##M## will be given by ##\langle\psi|Q(M)|\psi\rangle##.
In fact, once one allows for unsharp measurements one is already very close to my thermal interpretation - where position and momentum always exist independent of measurement, except that they are always unsharp. Infinitely precise position and momentum is a classical idealization, convenient when it applies but nowhere needed in physical practice.
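The contrast between a sharp (projective) position measurement and an unsharp one can be illustrated numerically. A minimal sketch, under illustrative assumptions only (a discretized line, a Gaussian wave packet, and a Gaussian detector-response kernel of width `sigma`; none of these choices come from the thread or a specific textbook): the unsharp outcome distribution is the Born density ##|\psi(x)|^2## smeared by the finite detector resolution, and both are genuine probability densities.

```python
# Sketch: sharp Born density vs. an unsharp (POVM-style) position measurement
# on a discretized line.  All numerical choices are illustrative assumptions.
import numpy as np

# Discretized position grid and a normalized Gaussian wave packet
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2.0).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize: sum |psi|^2 dx = 1

# Sharp position density: |psi(x)|^2
sharp_density = np.abs(psi)**2

# Unsharp measurement: the probability of pointer reading y is
# p(y) = integral of a Gaussian detector response (centered at y,
# assumed width sigma) against |psi(x)|^2.
sigma = 1.5   # detector resolution (assumed value)

def povm_prob(y):
    response = np.exp(-(x - y)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return np.sum(response * sharp_density) * dx

unsharp_density = np.array([povm_prob(y) for y in x])

# Both densities integrate to ~1; the unsharp one is broader and lower-peaked,
# since the detector resolution smears the sharp density.
print(np.sum(sharp_density) * dx)     # ~ 1.0
print(np.sum(unsharp_density) * dx)   # ~ 1.0 (up to grid truncation)
```

The design point, in the spirit of the post: the smeared density never requires collapse onto a (nonexistent) position eigenstate; position simply remains unsharp before and after the measurement.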
 
  • #398
A. Neumaier said:
together with two standard experiments to make the axioms look plausible

Which are those?
Are the experiments replicable?
Are the attendant axioms provably untenable?
Are the observations of the experiments' outcome unexplainable?
Are all these, after all, still considered part and parcel of the QM?
If they are, do you have an explanation for the conundrum they pose?
If they are not, can you articulately dismiss them?
If you don't have a cogent explanation for the above, would this turn any theory that doesn't have it - including yours - into just another pastime?

Can you elucidate why in this, or a related thread, you qualify some conjectures/theories, constitutive of your panorama, as out-of-date - but imperturbably proceed to proffer some new ones?
Wouldn't the above feat, coupled with the awareness of its occurrence, generate some hesitation in your pronouncements?

I am merely asking ...
 
  • #399
First you have to be clear about which flavor of "Copenhagen" you mean. I think that the flavors of Copenhagen interpretation that don't invoke the collapse postulate are the least weird interpretations. Among them is the minimal interpretation, which simply takes Born's rule as an additional postulate and takes the meaning of the state to be probabilistic; that is just the vital core of interpretation necessary to apply the formalism to real-world observations.
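The minimal, collapse-free use of Born's rule can be sketched in a few lines. An illustrative example (not from the post): a spin-1/2 particle prepared along +x and measured along z; the rule only assigns outcome probabilities and says nothing about what the state does afterwards.

```python
# Minimal Born-rule sketch (illustrative assumptions: spin-1/2, S_z basis).
import numpy as np

psi = np.array([1.0, 1.0]) / np.sqrt(2.0)   # |+x> written in the S_z basis
up = np.array([1.0, 0.0])                   # |S_z = +hbar/2>
down = np.array([0.0, 1.0])                 # |S_z = -hbar/2>

# Born rule: p(outcome) = |<outcome|psi>|^2 ; no collapse is needed to state it.
p_up = abs(up @ psi)**2
p_down = abs(down @ psi)**2
print(p_up, p_down)  # each ~ 0.5, summing to 1
```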

It's also of course an empty phrase to state that you don't know anything about a particle if nothing about its state is given. Quantum theory, and physics as a whole, is about observations of specific situations in nature. In an experiment, e.g., in particle physics, you pretty carefully prepare particles using an accelerator with a pretty well determined momentum. In fact, in an accelerator like the LHC, particles run in 2808 packets (bunches) per beam, each bunch containing about ##10^{11}## protons. Each bunch is some cm long and about a mm wide. At the collision point it's squeezed to ##\mu\mathrm{m}## size. At each bunch crossing up to 20 collisions occur. So you have a pretty good determination of the protons' location together with a pretty well determined momentum at the interaction point. Without that you'd not be able to get proton collisions in a collider with a sufficiently well defined collision (center of momentum) energy to be meaningful for particle physics. All this is, of course, fully consistent with quantum theory, and for sure there's nothing weird about it, although it is remarkable and amazing to what precision one can construct accelerators and detectors testing the predictions of quantum theory (in this case the Standard Model of elementary particles, i.e., relativistic quantum field theory).

So what's done is indeed to prepare particles (protons) in a well defined state so that they can collide, and then one measures the outcome of such a collision, and you do that many times to "collect statistics". It's precisely what's reflected in the formalism of QT without any weird assumptions about collapses, many worlds, de Broglie-Bohm trajectories (btw. the trajectories of the protons in the accelerator are calculated, at an accuracy enabling the accelerator physicists to design such high-precision machines, with good old classical physics, whose principles you learn in your E&M lecture in the 3rd-4th semester at university, although of course much refined!), QBism, and who knows what other more or less esoteric ideas on the so-called "meaning of quantum mechanics", which in popular culture sometimes even takes on features of a kind of religion rather than good science!
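The bunch numbers quoted above imply an enormous proton count per beam; a back-of-the-envelope check (assumed values taken from the post: 2808 bunches per beam, about ##10^{11}## protons per bunch):

```python
# Back-of-the-envelope check of the LHC numbers quoted in the post
# (assumptions: 2808 bunches per beam, ~1e11 protons per bunch).
bunches_per_beam = 2808
protons_per_bunch = 1e11
protons_per_beam = bunches_per_beam * protons_per_bunch
print(f"protons per beam ~ {protons_per_beam:.2e}")  # ~ 2.81e+14
```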
 
  • #400
A. Neumaier said:
Does quantum mechanics have to be weird?

''It is safe to say that nobody understands quantum mechanics.'' - Richard Feynman.
I think QM will not be so weird once it is as routine a part of school curricula as history lessons. The only problem here is that a lot of people are not ready for this.
 
