Question about discussions around quantum interpretations

  • #51
PeroK said:
The question is whether those uncontrollable stochastic features are fundamental to the statistics of the outcomes? Standard QM allows us to ignore those, focus on the evolution of the isolated quantum state and by those calculations alone obtain the statistical outcomes that match experiment. That's the standard theory as I understand it.
I already said several times, but I will repeat. Those uncontrollable stochastic features are not important for computing the probabilities of the outcomes. Nevertheless, they may be important for explaining randomness, for otherwise it is hard to explain why simple isolated systems don't show randomness. It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
 
  • #52
Demystifier said:
It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
In this case you'd need to quote (in words) a standard reference for the questionable 'only' part of this claim!
 
  • #53
Demystifier said:
I already said several times, but I will repeat. Those uncontrollable stochastic features are not important for computing the probabilities of the outcomes. Nevertheless, they may be important for explaining randomness, for otherwise it is hard to explain why simple isolated systems don't show randomness. It is a part of the standard theory that random outcomes only appear when there is decoherence caused by the environment.
I understand the argument. We possibly risk going round in circles and it becomes a debate about the definition of randomness. Here's an analogy. We put an item on a supermarket shelf with a definite price of $2. (We leave aside the preparation problem in this analogy!) The price evolves so that after one day it is some known distribution of prices from $1 to $3. We take it to the checkout and the price is resolved, into $2.50, say.

Your argument is that it must have been randomness in the checkout process that selected a price from the given distribution. The evolution from a fixed price into a probability distribution does not count as randomness. With that definition, I'm compelled to agree.

But, for me, it's not a satisfactory answer to say that it was all determined until we got to the checkout. I say there already was a bona fide probability distribution in the system before we got to the checkout. The probability distribution evolved - and that is non-determinism. If determinism produces a probability distribution, then it is no longer determinism in the way I would understand it.

And, if we allow the checkout machine to evolve in the same way - into a probability distribution of possible checkout machines, then we cannot tell from the start - by knowing everything about the item on the shelf and everything about the checkout machine - what price will appear at the checkout.

What we can say is that the evolution of the checkout machine doesn't seem to matter, in terms of the specific probabilities of prices that we get. It appears that we only need the probability distribution of the item on the shelf. That's the analogy for standard QM.

Perhaps that is down to interpretation. You can make the maths work either way.
 
  • #54
PeroK said:
If determinism produces a probability distribution, then it is no longer determinism in the way I would understand it.
But mathematicians talk, e.g., about the probability with which a particular digit appears in the deterministic sequence of digits of pi. This is only one of many examples of the use of probabilities in deterministic systems. Whenever one has a sensible measure normalized to 1, one has a probability distribution - this has nothing to do with not being deterministic!
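
As a concrete illustration of this point (a minimal sketch, not anything from the thread itself; it assumes the mpmath library is installed), one can tabulate the empirical frequency of each digit among the first 10,000 digits of pi. The sequence is completely deterministic, yet each frequency comes out close to 0.1:

```python
# Minimal sketch: empirical digit frequencies in a deterministic sequence (pi).
# Assumes the mpmath library is available (pip install mpmath).
from collections import Counter
from mpmath import mp

N = 10_000
mp.dps = N + 10                                  # compute pi with a little margin
digits = mp.nstr(mp.pi, N).replace('.', '')      # "3" followed by the first N-1 decimals

counts = Counter(digits)
for d in '0123456789':
    print(d, counts[d] / len(digits))            # each ratio is close to 0.1
```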
 
  • #55
A. Neumaier said:
But mathematicians talk, e.g., about the probability with which a particular digit appears in the deterministic sequence of digits of pi. This is only one example of many of the use of probabilities in deterministic systems. Whenever one has a sensible measure normalized to 1, one has a probability distribution - this has nothing to do with not being deterministic!
Yes, but those are probabilities that can be resolved by more knowledge. Determinism means you can get rid of the probabilities with enough knowledge. The trillionth digit of pi is definitely one of the ten digits, but without calculation it's equally likely to be any of them.
 
  • #56
PeroK said:
I understand the argument.
I'm not sure you do.

PeroK said:
Your argument is that it must have been randomness in the checkout process that selected a price from the given distribution.
No, that's not my argument. I'm saying that a complex environment is needed to explain randomness, but this environment does not necessarily need to be related to the checkout process. In this analogy, the price is a result of complex processes on the whole market (e.g. changes of supply and demand in the whole country), and not of random changes of the item on the shelf.

PeroK said:
It appears that we only need the probability distribution of the item on the shelf.
If you are an economist who wants to understand it, then you also want to know something about the processes on the whole market.
 
  • #57
PeroK said:
Yes, but those are probabilities that can be resolved by more knowledge.
Not necessarily - it might be that the question whether these probabilities are all equal (and hence equal 0.1) is undecidable in ZFC.
PeroK said:
The trillionth digit of pi is definitely one of the ten digits,
Note that these probabilities are independent of the knowledge of the first trillion digits!
PeroK said:
but without calculation it's equally likely to be any of them.
So you take ignorance to mean equally likely? This means that your probabilities are subjective probabilities.

But the probability of the digit 0 in the ensemble of all digits of pi is not a matter of guesswork but a matter of mathematical proof - it is either objectively determined or undecidable in ZFC.
 
  • #58
A. Neumaier said:
In this case you'd need to quote (in words) a standard reference for the questionable 'only' part of this claim!
The "only" part can be derived from the 7 basic rules of QM that you yourself wrote here
https://www.physicsforums.com/insights/the-7-basic-rules-of-quantum-mechanics/
Rule 3 (isolated system evolves deterministically) implies that isolated systems don't behave randomly, so it follows that only open systems (if any) can behave randomly.
 
  • #59
PeroK said:
Yes, but those are probabilities that can be resolved by more knowledge. Determinism means you can get rid of the probabilities with enough knowledge. The trillionth digit of pi is definitely one of the ten digits, but without calculation it's equally likely to be any of them.
I think I see now where the problem is. Many people don't understand very well how randomness arises in classical deterministic mechanics; they find it confusing. So when someone tells them that quantum randomness is a true randomness that does not arise from a classical-like determinism, they see it as a relief: quantum randomness looks simpler and more intuitive than classical randomness. That's why they prefer interpretations of QM in which randomness is intrinsic and fundamental. For such people, I would like to note that even classical mechanics can be interpreted as a fundamentally probabilistic theory, very similar to quantum mechanics, except that the corresponding "Schrodinger" equation is not linear: https://arxiv.org/abs/quant-ph/0505143#
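
For readers curious what a nonlinear "classical Schrodinger equation" can look like: a commonly quoted form (shown here only as an illustration; details may differ from the linked paper) adds to the usual equation a term that exactly cancels the quantum potential, so that psi = R e^{iS/hbar} reproduces the classical Hamilton-Jacobi and continuity equations:

```latex
% Illustrative nonlinear "classical" Schrodinger equation: the last term cancels
% the quantum potential Q, leaving purely classical statistical dynamics.
i\hbar\,\frac{\partial\psi}{\partial t}
  = \left[-\frac{\hbar^{2}}{2m}\nabla^{2} + V
        + \frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}|\psi|}{|\psi|}\right]\psi,
\qquad
Q \equiv -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}|\psi|}{|\psi|}.
```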
 
  • #60
Demystifier said:
The "only" part can be derived from the 7 basic rules of QM that you yourself wrote here
https://www.physicsforums.com/insights/the-7-basic-rules-of-quantum-mechanics/
Rule 3 (isolated system evolves deterministically) implies that isolated systems don't behave randomly,
Rule 3 only states that the state of an isolated system evolves deterministically.
Demystifier said:
so it follows that only open systems (if any) can behave randomly.
It only follows that the state of an open system (if any) can behave randomly.
But this is quite different from your much stronger claim
Demystifier said:
that random outcomes only appear when there is decoherence caused by the environment.
This rule says nothing at all about outcomes, nor does it imply that decoherence is necessary for random outcomes (which your statement implies).

Rule 3 only states that the state of an isolated system evolves deterministically.

Thus you need to provide better references, or weaken your claim - but in the latter case the conclusions you draw from it are no longer cogent.
 
  • #61
A. Neumaier said:
Rule 3 only states that the state of an isolated system evolves deterministically.
But according to standard QM the state represents a complete description of the system, there is nothing else except the state. Hence the deterministic description of an isolated system by the state is a complete description of an isolated system, at least according to standard QM.
 
  • #62
Demystifier said:
But according to standard QM the state represents a complete description of the system, there is nothing else except the state.
This is not part of the 7 rules - the part common to all interpretations.
It is only part of the Copenhagen interpretation.
 
  • #63
A. Neumaier said:
This is not part of the 7 rules - the part common to all interpretations.
It is only part of the Copenhagen interpretation.
Which is the standard interpretation.

Of course, I am not advocating the standard (Copenhagen) interpretation, but I am pointing out that someone who does should find very plausible the idea that randomness is somehow related to the influence of the environment.
 
  • #64
A. Neumaier said:
Demystifier said:
But according to standard QM the state represents a complete description of the system, there is nothing else except the state.
This is not part of the 7 rules - the part common to all interpretations.
It is only part of the Copenhagen interpretation.
Actually only the first half is part of the Copenhagen interpretation.

The second half is not, since an integral part of the Copenhagen interpretation (not shared by most other interpretations) is that every quantum system must be interpreted in a classical experimental context.

Thus your statement involving outcomes and decoherence has no established source.
Demystifier said:
Which is the standard interpretation.
No. It is the Copenhagen interpretation. It was the standard interpretation until around 1970. Today there is no longer a standard interpretation since a large majority of quantum physicists does not accept one or the other part of the Copenhagen interpretation - in particular the part on the necessity of the classical context.

Only the seven rules are standard.
 
  • #65
A. Neumaier said:
No. It is the Copenhagen interpretation. It was the standard interpretation until around 1970. Today there is no longer a standard interpretation since a large majority of quantum physicists does not accept one or the other part of the Copenhagen interpretation - in particular the part on the necessity of the classical context.
Fair enough. But in that case people like @PeroK should specify which interpretation they have in mind when they argue that the environment is not needed for randomness. Because it looks as if he has some Copenhagen-like interpretation in mind, and yet environment-independent randomness does not seem compatible with Copenhagen.
 
  • #66
Demystifier said:
Fair enough. But in that case people like @PeroK should specify which interpretation they have in mind when they argue
All have their own interpretation in mind, unless they specifically mention a particular interpretation. Even in that case, they have their own interpretation of that particular interpretation in mind, since no interpretation (not even the Copenhagen interpretation) has a standard version accepted by all its adherents.

That's the sad state of affairs....
Demystifier said:
Because it looks as if he has some Copenhagen-like interpretation in mind, and yet environment-independent randomness does not seem compatible with Copenhagen.
In the Copenhagen interpretation, both source and detector are part of the classical environment, each of which gives enough scope for hidden randomness. As I understand him, @PeroK only claims detector-independent randomness of decays, not source-independent randomness.
 
  • #67
Demystifier said:
Fair enough. But in that case people like @PeroK should specify which interpretation they have in mind when they argue that the environment is not needed for randomness. Because it looks as if he has some Copenhagen-like interpretation in mind, and yet environment-independent randomness does not seem compatible with Copenhagen.
I don't have a specific interpretation in mind. My point is that, generally, the evolution of a superposition can be considered the natural evolution of different, probabilistic outcomes. Take, for example, the idealised electron single-slit experiment.

Before the slit, the electron has a state that resembles a classical particle, with low uncertainty in lateral momentum. The interaction with the slit forces the state to evolve into a superposition of states with varying quantities of lateral momentum. It has evolved (deterministically in the key criteria) into a state with considerable uncertainty in lateral momentum.

Finally, the electron interacts with the detector in a position that appears not to be predictable from the initial state. Despite the uncertainties in the initial state, they do not appear to be sufficient to predetermine the electron to impact at a specific lateral position on the detector. Nor does the configuration of the detector appear to determine which cell is illuminated. The variable interaction with the detector appears to be determined, in standard QM fashion, by the intermediate evolution associated with the interaction with the single slit.

My point is simply that the interaction with the slit alone appears sufficient to produce different, probabilistic outcomes. That is sufficient to class the experiment as non-deterministic.

It may be the case that the states of the source and detector determine the outcome, but it does not seem necessary to consider those. Nor the precise state of the intermediate slit.

Especially, and I labour the point, since the probabilities appear to respect the intermediate evolution of the electron state. No additional mathematics is needed to calculate the required probabilities!
 
  • #68
A. Neumaier said:
As I understand him, @PeroK only claims detector-independent randomness of decays, not source-independent randomness.
I thought I was taking more an experimental view of establishing which components appear to influence the probabilities of the outcomes. In the single-slit it appears that the calculation associated with the intermediate interaction with the slit is sufficient to produce the probabilities that describe the outcome.

I acknowledge we can't have an experiment without a source and detector. But, we can calculate the relevant probabilities without reference to the precise state of either.

I also understand that a deeper analysis (especially in trying to make sense of QM fundamentally) might lead one to consider the precise states of the source and/or detector and/or intermediate slit. And that an interpretation might be constructed where the probabilities of the outcomes (although prima facie independent of the experimental apparatus itself) might indeed be alternatively explained through these considerations.
 
  • #69
PeroK said:
Take, for example, the idealised electron single-slit experiment.

Before the slit, the electron has a state that resembles a classical particle, with low uncertainty in lateral momentum.
It is generally assumed to be in a plane wave state with low uncertainty in momentum. This is very far from a classical particle.
PeroK said:
The interaction with the slit forces the state to evolve into a superposition of states with varying quantities of lateral momentum.
No. The filter containing the slit turns most of the plane wave into heat, with the exception of the little part that passes through the slit.
PeroK said:
It has evolved (deterministically in the key criteria) into a state with considerable uncertainty in lateral momentum.
This part is generally modeled classically (like a wave in classical optics), hence the determinism. Probability is not conserved.

After the slit, the surviving wave is still treated classically, resulting in a spherical wave. At best, the spin degrees of freedom receive a quantum treatment.
PeroK said:
Finally, the electron interacts with the detector in a position that appears not to be predictable from the initial state. Despite the uncertainties in the initial state, they do not appear to be sufficient to predetermine the electron to impact at a specific lateral position on the detector.
This is because a spherical wave has no particle character.
PeroK said:
Nor does the configuration of the detector appear to determine which cell is illuminated.
It appears so to you, without any stringent argument. This is your most questionable assumption.
PeroK said:
The variable interaction with the detector appears to be determined, in standard QM fashion, by the intermediate evolution associated with the interaction with the single slit.
The spatial probability distribution is determined by the squared amplitude at each detector position.
PeroK said:
My point is simply that the interaction with the slit alone appears sufficient to produce different, probabilistic outcomes.
The slit only produces the spherical wave, nothing more, which defines a mathematical probability distribution, but no actual outcomes, hence no randomness - the spherical symmetry is preserved before meeting the detector.

But the outcomes are produced by the detector, according to this distribution. The observed emerging detection pattern is fully consistent with the assumption that each local neighborhood of the detector responds independently with a tiny probability proportional to the tiny strength of the impacting wave at that neighborhood. Thus it is natural to assign the randomness to the unknown details of each neighborhood.
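
Here is a minimal numerical sketch of this picture (an illustration only: the sinc-squared profile, cell spacing, and per-arrival response probability below are assumptions, not something specified in the thread). Cells that respond independently, each with a tiny probability proportional to the local intensity, build up the single-slit pattern over many arrivals:

```python
# Minimal sketch: independently responding detector cells reproduce the
# single-slit intensity pattern statistically. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-10, 10, 400)            # detector cell positions (arbitrary units)
intensity = np.sinc(x) ** 2              # Fraunhofer single-slit profile ~ |psi|^2
intensity /= intensity.sum()             # normalize to a probability distribution

eps = 1e-3                               # tiny per-arrival response scale
hits = np.zeros_like(x)

for _ in range(100_000):                 # many independent electron arrivals
    fired = rng.random(x.size) < eps * intensity * x.size
    if fired.any():                      # at most one cell registers this electron
        hits[rng.choice(np.flatnonzero(fired))] += 1

# The accumulated hit pattern tracks the intensity profile.
print(np.corrcoef(hits, intensity)[0, 1])   # correlation close to 1
```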
PeroK said:
That is sufficient to class the experiment as non-deterministic.
But not sufficient to pinpoint the reason.
PeroK said:
It may be the case that the states of the source and detector determine the outcome, but it does not seem necessary to consider those. Nor the precise state of the intermediate slit.
So how do the mere probabilities create the outcomes?
PeroK said:
Especially, and I labour the point, since the probabilities appear to respect the intermediate evolution of the electron state.
Only after the slit and before the detector. There, the probabilities (or rather probability amplitudes) evolve deterministically.
PeroK said:
No additional mathematics is needed to calculate the required probabilities!
But without additional mathematics you don't get any outcomes, hence no randomness.
PeroK said:
I thought I was taking more an experimental view of establishing which components appear to influence the probabilities of the outcomes. In the single-slit it appears that the calculation associated with the intermediate interaction with the slit is sufficient to produce the probabilities that describe the outcome.
There is a difference between producing probabilities and producing outcomes - this is what @Demystifier repeatedly pointed out. An outcome is an actual change in the detector, but a probability distribution is just an idea in our mind, and cannot cause such an outcome. While the interaction with the detector can and does!
PeroK said:
I acknowledge we can't have an experiment without a source and detector. But, we can calculate the relevant probabilities without reference to the precise state of either.
This is just shut-up-and-calculate, about which there was never any dispute. But calculations have no causal power, only interactions have!
 
  • #70
A. Neumaier said:
There is a difference between producing probabilities and producing outcomes - this is what @Demystifier repeatedly pointed out. An outcome is an actual change in the detector, but a probability distribution is just an idea in our mind, and cannot cause such an outcome. While the interaction with the detector can and does!
I don't agree with that at all. And I don't know how many physicists would. In all of physics you do calculations based on certain criteria (whether that's wave mechanics or Feynman diagrams or simple parabolic motion). These calculations produce the predictions of experiments. The predictions are generally not produced by an analysis of the interaction at the detection event.

Feynman diagrams are just abstract calculations. But, that's how scattering cross sections are calculated. Not from a detailed analysis of the state of the detector.

You're a heavyweight physicist and I'm a rank amateur. But, I know enough to be skeptical of what you are saying.
 
  • #71
PeroK said:
In all of physics you do calculations based on certain criteria (whether that's wave mechanics or Feynman diagrams or simple parabolic motion). These calculations produce the predictions of experiments. The predictions are generally not produced by an analysis of the interaction at the detection event.

Feynman diagrams are just abstract calculations. But, that's how scattering cross sections are calculated.
I fully agree with that. But it has nothing to do with what I wrote.

Calculations produce the predictions. But they have no causal physical power.

What happens is not a result of calculations with models of physics, but of interactions of actual physical systems!

To produce a decay (and hence a system to be measured) you need to have a real source whose interactions produce the decay. This is what @Demystifier tries to deny.

And to produce an outcome (and hence a set of random numbers to be compared with probabilistic predictions) you need to have a real detector whose interactions produce the outcome. This is what you try to deny.

No amount of calculated probabilities can change these two general observations.
 
  • #72
PeroK said:
My point is that, generally, the evolution of a superposition can be considered the natural evolution of different, probabilistic outcomes.
Even this is interpretation dependent. In the MWI, for example, the unitary evolution of the state is completely deterministic, even including measuring devices, since those just get included in entangled superpositions as measurement interactions occur. There is no randomness or probabilistic outcome anywhere.
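
A minimal sketch of this point (an idealized two-state pointer, not anyone's specific model): if the measurement interaction is modeled as a CNOT, the joint state evolves unitarily and deterministically into an entangled superposition, with no single outcome ever being selected:

```python
# Minimal sketch: a unitary "measurement" interaction deterministically entangles
# a qubit with a two-state pointer; no collapse, no single outcome.
import numpy as np

# System qubit in a superposition a|0> + b|1>; pointer starts in |ready> = |0>.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
system = np.array([a, b])
pointer = np.array([1.0, 0.0])

# Joint state in the 4-dimensional product space (system tensor pointer).
psi = np.kron(system, pointer)

# Measurement interaction as a CNOT: the pointer flips iff the system is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi_after = CNOT @ psi                 # deterministic unitary evolution

# Result: a|0>|pointer 0> + b|1>|pointer 1>, an entangled superposition.
print(psi_after)                       # [0.7071, 0., 0., 0.7071]
print(np.abs(psi_after) ** 2)          # branch weights [0.5, 0, 0, 0.5]
```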
 
  • #73
A. Neumaier said:
I fully agree with that. But it has nothing to do with what I wrote.

Calculations produce the predictions. But they have no causal physical power.

What happens is not a result of calculations with models of physics, but of interactions of actual physical systems!
Hey, thanks for engaging with the thread! I’m not a physicist, but I’m really enjoying reading the discussion.
I just wanted to make a quick comment: I think I understand your point about how calculations have no causal power, but if calculations and models seem to match the experimental results, and our perceived experience, couldn't we say that these calculations are an accurate enough approach to reality, thus making the assertion that they don't have causal power a diversion that ignores their explanatory power?
 
  • #74
ojitojuntos said:
... if calculations and models seem to match the experimental results, and our perceived experience, couldn’t we say that these calculations are an accurate enough approach to reality ...
Just a note regarding the term 'reality':

One has to differentiate between agreement reality and experiential reality. Agreement reality is that which we consider to be real because we have been told that it is real and everyone seems to agree. Experiential reality is that which we know from actual direct experience itself.
 
  • #75
ojitojuntos said:
Hey, thanks for engaging with the thread! I’m not a physicist, but I’m really enjoying reading the discussion.
I just wanted to make a quick comment: I think I understand your point about how calculations have no causal power, but if calculations and models seem to match the experimental results, and our perceived experience, couldn’t we say that these calculations are an accurate enough approach to reality, thus making the assertion that they don’t have causal power a diversion that ignores their explanatory power?
Good theories (of which good models are an integral part) have a high explanatory power, revealed by appropriate computations (that themselves don't have explanatory power) that can be matched with experiments.

But neither theories nor calculations have causal power since they are just texts on paper, in some electronic medium, or in someone's head.
 
  • #76
Lord Jestocost said:
Just a note regarding the term 'reality':

One has to differentiate between agreement reality and experiential reality. Agreement reality is that which we consider to be real because we have been told that it is real and everyone seems to agree. Experiential reality is that which we know from actual direct experience itself.

Interesting point. The trouble I have with semantic answers to deep questions is whether we have really made progress.

These days, for me, the math is the explanation. QM is outside everyday experience; all we can really do is hope that everyday experience emerges from it:

https://www.sciencenews.org/blog/context/gell-mann-hartle-spin-quantum-narrative-about-reality

I think it does, although questions remain.

I am reminded of the early chapters of Feynman's Lectures on Physics. We think of a table as modelled by Euclidean geometry. But in reality, at the surface, atoms intermix with air, so there is no actual boundary. However, surveyors and carpenters manage just fine with our naive view. Maybe we, as quantum physicists, can take the same view? I don't even know what a particle is, except loosely as some excitation in a quantum field that is described mathematically. We can attempt to go beyond that, but has it really got us anywhere (so far anyway)?

Thanks
Bill
 
  • #77
A. Neumaier said:
But neither theories nor calculations have causal power since they are just texts on paper, in some electronic medium, or in someone's head.

Of course they don't.

But they have the power to suggest insight about the world we inhabit, and, if we are lucky, ways to test that insight.

Thanks
Bill
 
  • #78
PeterDonis said:
the unitary evolution of the state is completely deterministic, even including measuring devices, since those just get included in entangled superpositions as measurement interactions occur.

Not in the assumptions.

However, when relating it to everyday experience, probabilities are imposed on us - we experience only a single world. There is even controversy about whether the worlds we do not experience are of any relevance.

That we experience one world is the basis of Deutsch's betting argument:
https://www.eurandom.tue.nl/reports/2003/028-report.pdf

MWI, when I first heard about it, made my head hurt, and still does.

Thanks
Bill
 
  • #79
bhobba said:
when relating it to everyday experience, probabilities are imposed on us - we experience only a single world.
In the MWI, that doesn't "impose probabilities"--it just means that "we experience" has to be interpreted very...um, carefully.

In fact, explaining probabilities at all--for example, why the Born Rule works--is a key open issue with the MWI.

bhobba said:
There is even controversy about whether the worlds we do not experience are of any relevance
If they're not, then the interpretation you're using isn't the MWI. That's true even if you're Gell-Mann. :wink:
 
  • #80
Having a good map helps us explore the territory more effectively, but we shouldn't confuse the map with the territory. Both are reasonable positions.
 
  • #81
PeterDonis said:
In fact, explaining probabilities at all--for example, why the Born Rule works--is a key open issue with the MWI.

Well said, as usual, Peter.

One of my great joys in life, now that I am older, is engaging in thoughtful discussions. I am a Patreon supporter of Dr Fatima (you can find her content on YouTube), who holds ideas different from mine in several areas. Not in physics - she is pretty conventional - but in other associated stuff we usually do not discuss here, such as the so-called Sokal affair. But when she explains it, I find it reasonable and rational. The same can't be said for some other discourses I have on different forums.

Thanks
Bill
 
  • #82
Thank you all for engaging with this discussion. I find it very refreshing to see the different approaches to interpretations. It’s taken me a while to read and understand the discussion the best that I can.

As a sociologist, I think that it is very interesting to see the shifts and trends in interpretations of QM. I saw that a recent 2025 Nature poll shows that Copenhagen-adjacent interpretations still hold a majority, followed by information-based approaches. MWI is still represented as a deterministic alternative. It catches my interest seeing the MWI gaining traction compared to other polls. Why do you think that is?

From what I've learned, there is no experimental finding that favors it (or any deterministic interpretation, but that is my very layman understanding), so I'd like to know your opinions on the phenomena behind that interpretation's continued popularity.
 
  • #83
ojitojuntos said:
It catches my interest seeing the MWI gaining traction compared to other polls. Why do you think that is?
It might be because the MWI is the predominant no collapse interpretation, and since every attempt to try to pin down collapse as an actual physical process so far has failed, more physicists might be turning to no collapse interpretations to see what they offer.
 
  • #84
PeterDonis said:
It might be because the MWI is the predominant no collapse interpretation, and since every attempt to try to pin down collapse as an actual physical process so far has failed, more physicists might be turning to no collapse interpretations to see what they offer.
Oh, that makes sense. Are there any probabilistic no-collapse interpretations? I also have another question about the implications of determinism in quantum physics: if we interpret quantum physics as inherently deterministic, would that entail that probabilities are only epistemic and that the universe as a whole follows a single "pre-established" outcome? Sorry if my language is too simple for these types of questions.
 
  • #85
ojitojuntos said:
Are there any probabilistic no-collapse interpretations?
Not that I'm aware of.
 
  • #86
ojitojuntos said:
if we interpret quantum physics as inherently deterministic, would that entail that probabilities are only epistemic and that the universe as a whole follows a single "pre-established" outcome?
No. That's not how the MWI works. In the MWI, all possible outcomes occur, each in its own branch of the wave function. It's not even clear how probabilities arise at all in the MWI (for example, accounting for the Born Rule has been an ongoing open issue with the MWI for decades); the standard epistemic interpretation of them that works in classical physics doesn't work in the MWI.

In statistical interpretations, such as the one used in Ballentine's textbook, probabilities are more or less viewed as epistemic (we're forced to use them because we don't know all the microscopic details), but there is no claim made at all about what underlies them (for example, there is no claim that the underlying dynamics, whatever it might be, is deterministic). In such interpretations, the quantum state doesn't describe individual systems anyway (it only describes ensembles), so you can't infer from the fact that the collapse postulate is applied after a measurement that that measurement had a single outcome; but as far as I can tell, measurements having single outcomes is pretty much viewed as a brute fact in such interpretations, and is not something the interpretation tries to account for, it just takes it as given.
 
  • #87
ojitojuntos said:
if we interpret quantum physics as inherently deterministic, would that entail that probabilities are only epistemic and that the universe as a whole follows a single "pre-established" outcome?
Note that, while, as I said in my previous post just now, the MWI doesn't work this way, the Bohmian interpretation, which is also deterministic, does work this way. The underlying deterministic dynamics involves the unobservable particle positions; the probabilities are epistemic in the straightforward way that they are in classical physics (because we don't and can't know the exact microstate).
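
For reference, the standard textbook form of the Bohmian dynamics for a single spinless particle (added here as background; it is not quoted anywhere in the thread): the position Q(t) follows a deterministic guidance equation, and the epistemic probabilities come from the quantum equilibrium distribution over initial positions,

```latex
% Bohmian guidance equation and quantum equilibrium distribution
% (standard single-particle, spinless form).
\frac{d\mathbf{Q}}{dt}
  = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)
    \Bigg|_{\mathbf{x}=\mathbf{Q}(t)},
\qquad
\rho(\mathbf{x},t) = |\psi(\mathbf{x},t)|^{2}.
```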
 
  • #88
bhobba said:
Also, I must mention that the idea that reality is a social construct has very little support among physicists and scientists in general. Weinberg himself wrote an interesting essay about it:

https://web.physics.utah.edu/~detar/phys4910/readings/fundamentals/weinberg.html
Thank you! I'll read this later tonight- I want to pay attention to it. Since I haven't read it yet, I can't say for sure what you mean, but I don't think that most sociologists consider reality a social construct in a strict sense.
I think the more valid criticisms to the relationship between science and society are more about how we interpret scientific results. I'd love to discuss the essay via messages if you're interested- I really enjoy talking with people from different disciplines!

PeterDonis said:
Note that, while, as I said in my previous post just now, the MWI doesn't work this way, the Bohmian interpretation, which is also deterministic, does work this way. The underlying deterministic dynamics involves the unobservable particle positions; the probabilities are epistemic in the straightforward way that they are in classical physics (because we don't and can't know the exact microstate).

Interesting! Why is the Bohmian interpretation less popular than MWI, though? Is it something mathematical or experimental that disfavors it? I see pop-science articles saying multiple times that it has failed haha, but I prefer asking the experts :)
 
  • #89
ojitojuntos said:
As a sociologist, I think that it is very interesting to see the shifts and trends in interpretations of QM. I saw that a recent 2025 Nature poll shows that Copenhagen-adjacent interpretations still hold a majority, followed by information-based approaches.
There is no experimental finding that favours one interpretation over the other.

If there were, it would be a significant breakthrough.

Just keep in mind that non-relativistic QM is only an approximation to a deeper theory, which is the best theory we have today - Quantum Field Theory (QFT). And from Weinberg's folk theorem (meaning we do not have a rigorous mathematical proof - but it is thought to be true), any theory that obeys very general mathematical requirements (e.g. what is called analyticity and unitarity) plus physical requirements nobody really cares to doubt (cluster decomposition and special relativity) must look like a QFT at large enough distances - there is no way out.
https://en.wikipedia.org/wiki/Folk_theorem_(physics)

Are the fields real? Most physicists, including myself, believe that they are (by something called Noether's Theorem, they have energy, and we have E=mc^2; therefore, if they are not real, then mass is not real - such a notion seems unnecessarily strange). However, even what 'reality' is, is a profound philosophical question, and we do not delve into philosophy here. I will mention only that several philosophers, even current ones, remain stuck in the time of the famous Bohr-Einstein debates, but significant advancements have occurred since then.

Also, you will often hear that QM is non-local. That is not true in the usual sense of non-locality. Bell showed it was Bell-Non-Local, which, as a sociologist, you can look into. The confusion about these areas, even from some very famous people such as Tim Maudlin, would be an interesting topic for your profession to explore.

Also, I must mention that the idea that reality is a social construct has very little support among physicists and scientists in general. Weinberg himself wrote an interesting essay about it:

https://web.physics.utah.edu/~detar/phys4910/readings/fundamentals/weinberg.html

The attraction of MWI is that you have the wavefunction, and that's it. Easy peasy. Now getting the everyday world from that alone is, how to put it, far from easy, and the answers are pretty controversial.

As I have mentioned, QFT is the 'correct' theory, and general considerations more or less determine the theory. The downside - it is an effective theory (EFT), and as scientists, we would prefer something better than just 'effective'. An argument can be made that all our theories are just 'effective'; however, that is philosophy territory.

Note to those reading this thread, my post occurred before the one above, but I was editing it via deleting the old one and doing a new one, so it appeared after.

Thanks
Bill
 
  • #90
bhobba said:
Wienberg himself wrote an interesting essay about it:

https://web.physics.utah.edu/~detar/phys4910/readings/fundamentals/weinberg.html
My favorite quote from that essay: "If you have bought one of those T-shirts with Maxwell's equations on the front, you may have to worry about its going out of style, but not about its becoming false." (The only problem with the quote is the implicit assumption that those T-shirts--I used to have one--were ever in style to begin with. :wink:)
 