I Question about discussions around quantum interpretations

ojitojuntos
TL;DR Summary
Question: If experimentally quantum mechanics has yielded more and more evidence of probabilistic behavior, why is it that there are so many interpretations looking to reintroduce fundamental determinism?
I understand that the world of interpretations of quantum mechanics is very complex, as experimental data hasn't completely falsified the main deterministic interpretations (such as Everett) versus non-deterministic ones. However, I read in online sources that Objective Collapse theories are being increasingly challenged. Does this mean that deterministic interpretations are more likely to be true?

I always understood that the "collapse" or "measurement problem" was how we phrased the fact that there is fundamental randomness in the universe, and that the Bell tests and double-slit experiments support this, but if Objective Collapse theories are being tested and refuted, does this mean that the universe is more likely deterministic and the observed randomness is epistemic?
 
  • Like
Likes PeroK
Any interpretation could be true. The reason there are so many is that QM is outside everyday experience, so we have no intuition to fall back on.

That said, my suggestion (and I know beginners and those with an interest in what QM is telling us don't like it) is not to worry about interpretation until you understand QM at the level of a comprehensive textbook, such as Ballentine's QM: A Modern Development. Ballentine develops QM from just two axioms, and the second axiom actually follows from the first by what is called Gleason's Theorem. QM from just one axiom? Well, strictly speaking, there are seven rules (a post on this forum discusses them), but they can be presented in a way that makes the others seem natural, which is what Ballentine does. In fact, an interesting exercise for advanced students is to find where in the textbook he brings those rules in through the back door.
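For reference, the one rule in question is the Born rule. Stated roughly in the density-operator form that Gleason's Theorem singles out (my paraphrase of the standard result, not Ballentine's exact wording):

$$\Pr(a_i) = \mathrm{Tr}(\rho\, P_i)$$

where ##\rho## is the state (a positive operator of unit trace) and ##P_i## is the projector onto the eigenspace belonging to outcome ##a_i## of the measured observable. Gleason's Theorem says that, for Hilbert spaces of dimension three or more, any suitably additive probability assignment on projectors must take this trace form - which is the sense in which the second axiom "follows" from the first.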

However, in understanding interpretations it is essential to realise that QM, at its core, is essentially just one rule (or axiom), and to know what that axiom is. In fact, you may, as I now do, say that QM is that one rule plus some almost inevitable assumptions, and then invoke Newton's 'hypotheses non fingo' - I feign no hypotheses. This becomes more attractive once you realise that every single theory makes assumptions not explained by the theory. I can provide a heuristic justification for that rule in the mathematical-modelling sense (when deciding the assumptions of a model, this sort of thing is often done), but I can't derive it; it must be assumed. Whether that is satisfactory is for you to decide.

The second important thing is that QM is wrong. We know this because it predicts that a hydrogen atom in an energy eigenstate is in a stationary state, meaning it should not absorb or emit photons. But we know it does. This is addressed by applying an extension of QM called Quantum Field Theory (QFT). However, the modern view of QFT is that it is the inevitable low-energy, large-distance approximation of any theory that obeys well-established laws such as Special Relativity. We have no choice. This is called Weinberg's Folk Theorem, and his advanced three-volume tome on QFT develops the subject from this viewpoint, which goes by the name Effective Field Theory (EFT).

Added Later:
For technical details of this view, I found a nice paper:
https://research.engineering.nyu.edu/~jbain/papers/Weinberg.pdf

It even resolves, for energies we can currently probe, the problem of combining QM and GR, which, as you may have read, is a big problem. It is, but since all we know are EFTs, all our theories have the same limitation.

Where does this leave us in terms of interpretation? Basically, our most powerful theory is close to inevitable. To go beyond it, we need information from areas we can't currently probe (directly anyway). Sure, we can hypothesise (String Theory is one such attempt), but as of now, it looks like Einstein was right, QM is incomplete - but not for the reasons he thought. Still, one never knows - it may be the great man has the last laugh.

The answer to your final question is that, since we do not know the theory at the rock bottom of QFT (or even if there is one - it may be turtles all the way down - what a depressing thought - but nature is as nature is), we do not know. Gleason's Theorem hints that the randomness is built into the formalism, but we do not know for sure.

Thanks
Bill
 
Last edited:
  • Like
  • Informative
Likes sbrothy, ojitojuntos and fresh_42
bhobba said:
The second important thing is that QM is wrong.
This is phrased rather confusingly. What you mean is that non-relativistic QM is wrong. But I don't think the OP means to restrict discussion of "quantum mechanics" to just non-relativistic QM.
 
  • Like
Likes sbrothy, ojitojuntos and bhobba
Well picked up, Peter.

Of course, you are correct.

Thanks
Bill
 
  • Like
Likes ojitojuntos
ojitojuntos said:
I always understood that the "collapse" or "measurement problem" was how we phrased the fact that there is fundamental randomness in the universe, and that the Bell tests and double-slit experiments support this, but if Objective Collapse theories are being tested and refuted, does this mean that the universe is more likely deterministic and the observed randomness is epistemic?
There are established probabilistic interpretations distinct from objective collapse, and far more mainstream (to the extent that interpretations can be mainstream). Asher Peres's modern treatise "Quantum Theory: Concepts and Methods", Julian Schwinger's opening essay in "Quantum Mechanics: Symbolism of Atomic Measurements", and Roland Omnes's "Understanding Quantum Mechanics" are some of my recommended reading for probabilistic accounts of quantum theory.

There are presumably philosophical motivations for pursuing a deterministic understanding over a probabilistic one.
 
  • Like
Likes ojitojuntos
Thanks a lot for your replies! Sorry for the delayed response, work has been quite busy.
As Peter said, I didn’t mean to limit the discussion to non-relativistic QM.
I read some articles here and there (pop science, you’d call them), and I understand that deterministic interpretations tend to be preferred by people who take the mathematical formalism at face value (Everettian), or those who support non-local hidden variables.
Is this correct? If so, is there any experimental support for any of these interpretations?

From what I understand, experiments have reaffirmed the inherently probabilistic nature of quantum reality, so reincorporating determinism in those ways feels counter-intuitive. Apologies if my questions are too misguided.
 
ojitojuntos said:
From what I understand, experiments have reaffirmed the inherently probabilistic nature of quantum reality, so reincorporating determinism in those ways feels counter-intuitive. Apologies if my questions are too misguided.
That is the issue. Many people, for whatever reasons, believe that the universe must be fundamentally deterministic. QM challenges that belief. Many physicists accept that QM is evidence that a belief in determinism is not required for physics to make sense. Others will try to find a way to make QM fit into a deterministic model.

Then the question is: who is being counterintuitive here?
 
  • Like
Likes ojitojuntos and bhobba
PeroK said:
That is the issue. Many people, for whatever reasons, believe that the universe must be fundamentally deterministic. QM challenges that belief. Many physicists accept that QM is evidence that a belief in determinism is not required for physics to make sense. Others will try to find a way to make QM fit into a deterministic model.

Then the question is: who is being counterintuitive here?
QM in its usual form seems to be saying that the universe is non-deterministic only when measurements are performed, while the rest of the time it behaves deterministically. For example, a photon isolated from the environment behaves deterministically, until it interacts with a detector. And yet, QM doesn't give a precise definition of "measurement" and "detector". That doesn't seem right from a fundamental point of view. This suggests that non-determinism might be just an effective description emerging from the lack of knowledge of details of complex environments and detectors containing a large number of degrees of freedom, while at the fundamental microscopic level the dynamics is deterministic. That's very intuitive to me.

A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), consistent histories interpretation and Nelson stochastic interpretation, but they are not standard formulations of QM. Objective collapse theories seem to be very ad hoc and they are largely ruled out by experiments. Consistent histories interpretation replaces the dependence on the measurement with the dependence on the framework; very roughly it says that reality exists even if we don't measure it, but this reality depends on our arbitrary choice how we decide to think of it. (For an analogy, that would be like interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of the gauge condition.) The Nelson interpretation is very much like the Bohmian interpretation, except that particles have additional stochastic jiggling.
 
Last edited:
  • Like
Likes syed and ojitojuntos
Demystifier said:
QM in its usual form seems to be saying that the universe is non-deterministic only when measurements are performed, while the rest of the time it behaves deterministically. For example, a photon isolated from the environment behaves deterministically, until it interacts with a detector. And yet, QM doesn't give a precise definition of "measurement" and "detector". That doesn't seem right from a fundamental point of view. This suggests that non-determinism might be just an effective description emerging from the lack of knowledge of details of complex environments and detectors containing a large number of degrees of freedom, while at the fundamental microscopic level the dynamics is deterministic. That's very intuitive to me.

A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), consistent histories interpretation and Nelson stochastic interpretation, but they are not standard formulations of QM. Objective collapse theories seem to be very ad hoc and they are largely ruled out by experiments. Consistent histories interpretation replaces the dependence on the measurement with the dependence on the framework; very roughly it says that reality exists even if we don't measure it, but this reality depends on our arbitrary choice how we decide to think of it. (For an analogy, that would be like interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of the gauge condition.) The Nelson interpretation is very much like the Bohmian interpretation, except that particles have additional stochastic jiggling.
Thanks a lot for replying!
I believe that the fact that QM hasn't provided a satisfactory explanation (in the sense of a canonical one) of what a measurement or a detector is doesn't necessarily suggest that the underlying mechanics must be deterministic, with the randomness due to our lack of information.
We see non-determinism at a fundamental level, but our lack of understanding of this non-determinism is not by itself an argument for determinism, I think.
Of course, I'm aware that, as my background is in the social sciences, I'm more inclined to think that nature has a mix of probabilistic and deterministic aspects (the former not being exclusively due to our lack of knowledge).

I think you've provided a very interesting argument on why it is not obvious that probabilistic quantum mechanics is the way to go, regardless of my inclination. Thanks a lot!
 
  • Like
Likes PeroK and Demystifier
  • #10
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic. The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.

Even though the state of the isolated atom evolves deterministically, that evolution inherently introduces probabilities that cannot be attributed to the detection process. The detection process is a part of nature where the inherent probabilities manifest themselves.
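To make "probabilities inherent in the state" concrete, here is the standard textbook sketch (interpretation-neutral): for a single unstable atom prepared at ##t=0##, the survival probability is

$$P_{\text{survive}}(t) = e^{-t/\tau} = 2^{-t/t_{1/2}}$$

with ##\tau## the mean lifetime and ##t_{1/2} = \tau \ln 2## the half-life. Two atoms prepared in exactly the same state have exactly the same ##P_{\text{survive}}(t)##, yet their individual decay times come out different - identical preparation, different outcomes.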

Interaction, measurement, detection is part of nature. Even if QM struggles to describe what happens in a satisfactory manner, it's stretching a point to say that radioactive decay is inherently deterministic.
 
  • Like
Likes ojitojuntos, Lord Jestocost and martinbn
  • #11
PS we could illustrate the point by borrowing Schrodinger's cat. After each run of the experiment, the cat is either alive or dead. There is no way to determine this from the start by knowing everything there is to know about the cat and the apparatus. It doesn't depend on that. It depends only on the probabilistic decay process.

So, unless the radioactive atom has hidden properties that determine its time of decay, we have an inherently non-deterministic experiment.

That is not to say that by sufficiently clever means a deterministic explanation cannot be found, e.g. many worlds or Bohmian mechanics.
 
  • Like
Likes bhobba and ojitojuntos
  • #12
PeroK said:
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic. The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.

Even though the state of the isolated atom evolves deterministically, that evolution inherently introduces probabilities that cannot be attributed to the detection process. The detection process is a part of nature where the inherent probabilities manifest themselves.

Interaction, measurement, detection is part of nature. Even if QM struggles to describe what happens in a satisfactory manner, it's stretching a point to say that radioactive decay is inherently deterministic.
If you agree that an isolated atom evolves deterministically, then non-determinism must be associated with interaction with the environment, would you agree? So even if each atom is identical, the same cannot be said for each environment. Hence you cannot rule out the possibility that the time of decay is somehow determined by the details of the environment, and looks random to us only because we don't know those details in practice.
 
  • Like
  • Skeptical
Likes syed, PeroK and gentzen
  • #13
PeroK said:
There is no way to determine this from the start by knowing everything there is to know about the cat and the apparatus. It doesn't depend on that.
I don't understand your logic. Sure, a human cannot know everything there is to know about the cat and the apparatus. But just because a human cannot know something doesn't imply that nature doesn't depend on it. Take a classical coin toss, for example: a human cannot know all the fine details of the initial data that determine the result, which is why the toss looks like a random event to us; and yet in classical physics the result of a coin toss is entirely deterministic.
 
  • Like
Likes syed
  • #14
Demystifier said:
If you agree that an isolated atom evolves deterministically, then non-determinism must be associated with interaction with the environment, would you agree? So even if each atom is identical, the same cannot be said for each environment. Hence you cannot rule out the possibility that the time of decay is somehow determined by the details of the environment, and looks random to us only because we don't know those details in practice.
The precise details of the uncertainty are encoded in the atomic state - in the weighting/probability amplitudes associated with the decayed and not-decayed substates. You don't need uncertainty in the environment. If it was uncertainty in the environment, then you could create an environment where the atom had a different half life. Where the environment could override the probability amplitudes of the state.

It is simpler and more intuitive to me to assume that the environment manifests the probabilities inherent in the atomic state.

Ultimately, the weakness of your argument is that you postulate a mechanism that doesn't add anything in terms of experimental results. Moreover, if that mechanism were genuinely part of the model, there should be experiments that defy the simple model that all information about the probability of decay is contained within the atomic state.
 
  • #15
PeroK said:
Take a number of radioactive atoms as an example. If each atom is identical, then they should all decay at the same time - if the decay process were deterministic. The detector isn't causing the decay. The detector cannot be causing the uncertainty in whether an atom decays in a certain time.

Even though the state of the isolated atom evolves deterministically, that evolution inherently introduces probabilities that cannot be attributed to the detection process. The detection process is a part of nature where the inherent probabilities manifest themselves.

Interaction, measurement, detection is part of nature. Even if QM struggles to describe what happens in a satisfactory manner, it's stretching a point to say that radioactive decay is inherently deterministic.
Your first sentence simply does not follow. By that logic, if one tosses the same coin many times, it should always land on heads if the coin toss was deterministic.

A deterministic process does not imply that the same result happens every single time. If it is a chaotic system, like weather patterns, initial conditions can combine together in complicated ways such that each result is different, yet still produce stable rates when looking at the results in aggregate.

Arguably, the very fact that the probabilities stemming from Born's rule, for example, never change implies that there is probably a deeper deterministic process. Otherwise, why would those rates stay the same?
 
  • Skeptical
Likes ojitojuntos, PeroK and weirdoguy
  • #16
syed said:
By that logic, if one tosses the same coin many times, it should always land on heads if the coin toss was deterministic.
This is a faulty analogy. The coins are tossed one at a time, and, as you say, the initial conditions can vary for each toss, so a deterministic model can still give different results for each toss.

But you can prepare a sample of a huge number of identical radioactive atoms all at the same time, yet they don't all decay at the same time. That can't be explained by variation in initial conditions, because there is none, by construction.
 
  • Like
Likes ojitojuntos and PeroK
  • #17
Demystifier said:
I don't understand your logic. Sure, a human cannot know everything there is to know about the cat and the apparatus. But just because a human cannot know something doesn't imply that nature doesn't depend on it. Take a classical coin toss, for example: a human cannot know all the fine details of the initial data that determine the result, which is why the toss looks like a random event to us; and yet in classical physics the result of a coin toss is entirely deterministic.
The difference is that we do know everything about the state of the atom and its decay rate. And that is sufficient to explain the quantitative probabilities. Whereas, there is nothing inherent in the coin that produces the probabilities. The probabilities are created by what you do with the coin.

The analogy fails because the state of the atom evolves spontaneously without any environment and the quantities associated with probabilities emerge spontaneously from nothing but time evolution of the state.

Although the time evolution itself is deterministic, the probability amplitudes evolve, which means the probabilities are predictable. That's how nature works. Probability amplitudes emerge and evolve deterministically. There is no contradiction there.
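For concreteness, the deterministic evolution referred to here is just the Schrödinger equation; schematically, with ##H## the Hamiltonian of the isolated atom,

$$|\psi(t)\rangle = e^{-iHt/\hbar}\,|\psi(0)\rangle, \qquad P_{\text{decayed}}(t) = \big|\langle \text{decayed}|\psi(t)\rangle\big|^2.$$

The state, and with it the amplitudes, evolves deterministically; only the outcome drawn according to those amplitudes is random.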
 
  • #18
PeroK said:
If it was uncertainty in the environment, then you could create an environment where the atom had a different half life. Where the environment could override the probability amplitudes of the state.
You can--by creating an environment where the quantum states of the decay products are already occupied, or partially occupied. (And for the case of electron capture, you can change the half-life of the reaction by ionizing or partly ionizing the atom, so the electron availability for capture is changed.)
 
  • Informative
Likes gentzen and PeroK
  • #19
PeterDonis said:
This is a faulty analogy. The coins are tossed one at a time, and, as you say, the initial conditions can vary for each toss, so a deterministic model can still give different results for each toss.

But you can prepare a sample of a huge number of identical radioactive atoms all at the same time, yet they don't all decay at the same time. That can't be explained by variation in initial conditions, because there is none, by construction.
But they are not exactly identical in the sense that they are still different objects, otherwise we wouldn't say that there are multiple atoms. Each atom is still separated from each other, which is why they have different decay times.

The point of an analogy is not to model the exact process, otherwise it wouldn't be an analogy. The point is to show that there could simply be a process that determines the exact decay time for each atom based upon the details of the atom or the specific environment that it is in that we do not understand. Ideally, this would be shown by a more complete theory in the future
 
  • Skeptical
Likes weirdoguy
  • #20
PeterDonis said:
You can--by creating an environment where the quantum states of the decay products are already occupied, or partially occupied. (And for the case of electron capture, you can change the half-life of the reaction by ionizing or partly ionizing the atom, so the electron availability for capture is changed.)
It's true, those ideas complicate the picture. But, even in that case, the probabilities can be understood through the evolution of the state, and don't need an additional, unspecified process to produce the (expected) probabilities.
 
  • #21
syed said:
they are not exactly identical in the sense that they are still different objects
They are identical in the precise sense of that term in quantum mechanics.

syed said:
Each atom is still separated from each other, which is why they have different decay times.
What is your basis for this claim? It has no basis whatever in standard QM.
 
  • #22
syed said:
The point is to show that there could simply be a process that determines the exact decay time for each atom based upon the details of the atom or the specific environment that it is in that we do not understand. Ideally, this would be shown by a more complete theory in the future
But unless and until someone finds such a theory and makes successful predictions from it, it's vaporware and we can't discuss it here. This subforum is for discussing interpretations of QM, not hypothetical new theories that don't even exist.
 
  • #23
PeroK said:
even in that case, the probabilities can be understood through the evolution of the state
Yes, but now it's the state of atom plus environment, not just the state of the atom.
 
  • #24
To change the subject slightly: I remember learning about neutrino oscillations and thinking what a beautiful, simple model. The neutrino state evolves independently of any detector. A free neutrino is not an eigenstate of flavour, so the probability of detecting a given flavour varies with time (independent of the detector).
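For anyone who hasn't seen it, the two-flavour version of that model (a standard simplification - real experiments involve three flavours and matter effects) gives, in natural units,

$$P(\nu_\mu \to \nu_e) = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)$$

so the probability of detecting a given flavour oscillates with the distance ##L## (equivalently, with time), with no reference to the detector at all.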

To reject that as a sufficient explanation for probabilities seems unintuitive to me. And rejecting the fundamental probability amplitudes associated with QM potentially makes it difficult to make progress in particle physics.

This is the obvious criticism of BM. Instead of accepting nature as fundamentally probabilistic, it gets bogged down in an unnecessary additional layer of detail that inhibits further progress.
 
  • Like
Likes ojitojuntos and martinbn
  • #25
@PeroK

1. In Sec. 4.3 of the paper https://arxiv.org/abs/2010.07575 I have explained why the decay does not depend on the details of the environment, creating the illusion that the environment is not needed for the decay, and yet why there is no decay without the environment.

2. You said yourself that an isolated atom evolves deterministically. Then how can you think that the environment is not needed for the non-deterministic decay?

3. The analogy with coin tossing is useful again. From the symmetric shape of the coin itself we can conclude that p(head)=p(tail)=1/2. We don't need any other variables describing the environment or position and velocity of the coin. And yet, to understand why the outcome of coin tossing appears random we need exactly that. We need the idea that the outcome depends on many fine details that we cannot know in practice to explain the apparent randomness, even though we don't need any of this to determine the probabilities of specific outcomes.
 
  • Like
Likes syed
  • #26
PeterDonis said:
They are identical in the precise sense of that term in quantum mechanics.


What is your basis for this claim? It has no basis whatever in standard QM.
What is the basis for the claim that if there was a deterministic theory, the atoms would not have different decay times? Why didn't you ask the same question of the poster whose claim I was responding to?

If someone makes a speculative claim without basis, it is not inappropriate to discuss the possibility (note: not claim) of an alternative
 
  • Sad
Likes weirdoguy
  • #27
PeroK said:
If it was uncertainty in the environment, then you could create an environment where the atom had a different half life. Where the environment could override the probability amplitudes of the state.
And this is exactly what you can do, by the quantum Zeno effect.
 
  • #28
PeroK said:
This is the obvious criticism of BM. Instead of accepting nature as fundamentally probabilistic, it gets bogged down in an unnecessary additional layer of detail that inhibits further progress.
But that's how the whole of physics, and indeed the whole of science, is organized: into layers. Just one example: fluids are made of atoms, and atomic physicists explore that structure. But fluid mechanics and its applications in the airplane industry do not depend on this atomic structure; fluids are modelled as continuous, and worrying about the atomic structure would slow down progress in fluid mechanics. Nobody says that fluid mechanists should worry about the atomic structure. But that doesn't mean that fluid mechanists should deny that atoms exist. Likewise, a particle physicist does not need to worry about quantum foundations - that would slow him down - but that doesn't mean that a particle physicist should deny that there are unsolved problems in quantum foundations, which some physicists explore.
 
  • #29
My obligatory defense of consistent histories
Demystifier said:
A truly fundamentally non-deterministic formulation of QM should talk about non-determinism without referring to measurements. There are such formulations, e.g. objective collapse theories (such as GRW), consistent histories interpretation and Nelson stochastic interpretation, but they are not standard formulations of QM.
The consistent histories formalism is nonstandard in the sense that the construction of history operators is somewhat novel. But all the underlying machinery is standard QM (Hilbert spaces, projectors, density operators etc).
Demystifier said:
Consistent histories interpretation replaces the dependence on the measurement with the dependence on the framework; very roughly it says that reality exists even if we don't measure it, but this reality depends on our arbitrary choice how we decide to think of it. (For an analogy, that would be like interpretation of electromagnetism claiming that gauge potentials are real, but that this reality is a matter of our arbitrary choice of the gauge condition.)
A choice of framework is a choice of a description of reality, and different descriptions are appropriate for different purposes. But reality isn't contingent on a physicist's choice of description.
 
  • #30
Morbert said:
A choice of framework is a choice of a description of reality, and different descriptions are appropriate for different purposes. But reality isn't contingent on a physicist's choice of description.
Sometimes I think of consistent histories not as one interpretation, but as an infinite class of interpretations, where each framework corresponds to one interpretation. Since all interpretations are compatible with measurable facts, one is free to choose any interpretation (framework) one likes. Does it make sense to you?
 
  • #31
Demystifier said:
But that's how the whole of physics, and indeed the whole of science, is organized: into layers. Just one example: fluids are made of atoms, and atomic physicists explore that structure. But fluid mechanics and its applications in the airplane industry do not depend on this atomic structure; fluids are modelled as continuous, and worrying about the atomic structure would slow down progress in fluid mechanics. Nobody says that fluid mechanists should worry about the atomic structure. But that doesn't mean that fluid mechanists should deny that atoms exist. Likewise, a particle physicist does not need to worry about quantum foundations - that would slow him down - but that doesn't mean that a particle physicist should deny that there are unsolved problems in quantum foundations, which some physicists explore.
I understand your point. Let's assume you are correct.

The detector has trillions of possible states, but these neatly divide into two approximately equal sets: half that inevitably detect decay and half that do not.

However, if we wait two half-lives, then somehow the detector's states split 75-25. And so on. Somehow these random detector states are attuned to the probability amplitudes of the atom?
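(For the arithmetic: the survival probability after ##n## half-lives is ##(1/2)^n##, so after two half-lives ##1/4## of the sample remains and ##3/4## has decayed - hence the 75-25 split.)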

I would imagine that macroscopic devices would suffer from problems in manifesting the simple Born rule as per the state of the atom.

Once you have the state of the detector being relevant, why do we get the Born rule based only on the amplitudes of the atom? Why does your average detector not screw up these probabilities? Wouldn't an average detector have some influence on the probabilities?

The detector would have to be a curious mixture of a huge number of random degrees of freedom, which jointly with the simple state of atom determine a definite result. Yet, the probability that the state of the detector is such that decay is observed is tuned perfectly to the simple state of the atom.
 
  • #32
I'm not saying it's impossible. But, if the state of the detector is irrelevant, then the Born rule makes sense. Only the state of the atom determines the probabilities.

That is, the probabilities come from the state of the atom, rather than from the probability that the detector is in a certain state being aligned with the probability amplitudes in the state of the atom.

PS the special cases, like the quantum Zeno effect, are cases where you do tune the environment deliberately to affect the outcome.
 
Last edited:
  • #33
PeroK said:
Once you have the state of the detector being relevant, why do we get the Born rule based only on the amplitudes of the atom? Why does your average detector not screw up these probabilities? Wouldn't an average detector have some influence on the probabilities?
I have addressed this in post #25. In item 1. I gave a reference where your question is answered in more detail and in item 3. I gave an analogy with coin tossing.
 
  • Informative
Likes PeroK
  • #34
PeroK said:
Only the state of the atom determines the probabilities.
One should distinguish determination of probability from explanation of randomness. In pure math, the Kolmogorov axioms say how to determine probability without any reference to randomness. More physically, as I explained in post #25 item 3., coin tossing is a very simple example where probability is determined by intrinsic properties of the coin itself, while the explanation of randomness depends also on the existence of the environment.
 
  • #35
Demystifier said:
I have addressed this in post #25. In item 1. I gave a reference where your question is answered in more detail and in item 3. I gave an analogy with coin tossing.
Okay. Let me read your paper. I'm just up the road from you, climbing mountains in Slovenia for the past couple of weeks.
 
  • Like
Likes Motore and Demystifier
  • #36
syed said:
What is the basis for the claim that if there was a deterministic theory, the atoms would not have different decay times?
Did you read the quote from your own post that my question was about? That's the claim I'm asking for a basis for. It was a claim you made, not someone else.
 
  • #37
PeterDonis said:
Did you read the quote from your own post that my question was about? That's the claim I'm asking for a basis for. It was a claim you made, not someone else.
Well, you misinterpreted my "claim", since I wasn't making any novel claim. I was making the more trivial point that there is no such thing as two completely identical atoms, or really two completely identical objects, since the very fact that there are two of them means that there is something distinguishing the two objects. The very fact that they have different decay times means they are not the same exact object.

And my more general point was the inconsistency of your questions. The other poster made a claim that if quantum physics was completely deterministic, the atoms would not have different decay times. You did not ask the poster to substantiate this claim
 
  • Skeptical
  • Sad
Likes Motore, weirdoguy and PeroK
  • #38
syed said:
I was making the more trivial point that there is no such thing as two completely identical atoms
Standard QM disagrees with you. Look up the standard QM definition of indistinguishable particles.

You aren't even responding to the question I raised in the last part of post #21.

syed said:
The other poster made a claim that if quantum physics was completely deterministic, the atoms would not have different decay times. You did not ask the poster to substantiate this claim
His claim is obvious once you include the crucial fact that all of the atoms are prepared by the same process at the same time. In standard QM, this is reflected in the fact that a single quantum state is prepared at one time that includes all of the atoms, and this state evolves deterministically as long as no measurement is made.

Your claim that I quoted and asked about in post #21, on the other hand, has no basis whatever in standard QM that I can see, as I said in that post. That's why I asked you what your basis for that claim is. Nothing you have posted so far addresses that question.
 
  • Like
Likes ojitojuntos
  • #39
Demystifier said:
coin tossing is a very simple example where probability is determined by intrinsic properties of the coin itself
I don't think this is true. I believe robotic machines have been developed which can flip coins such that one side is more likely to land up than the other. So at the very least, the process by which the coin is flipped has something to do with the probability. And that process is not intrinsic to the coin.
 
  • Like
Likes PeroK and ojitojuntos
  • #40
PeterDonis said:
Standard QM disagrees with you. Look up the standard QM definition of indistinguishable particles.

You aren't even responding to the question I raised in the last part of post #21.


His claim is obvious once you include the crucial fact that all of the atoms are prepared by the same process at the same time. In standard QM, this is reflected in the fact that a single quantum state is prepared at one time that includes all of the atoms, and this state evolves deterministically as long as no measurement is made.

Your claim that I quoted and asked about in post #21, on the other hand, has no basis whatever in standard QM that I can see, as I said in that post. That's why I asked you what your basis for that claim is. Nothing you have posted so far addresses that question.
If the claim is obvious, why haven't all physicists accepted it? Can you show me anywhere in standard physics that says if QM was inherently deterministic, the decay times would not be different? If this was true, determinism in QM would be completely ruled out. So you are merely speculating and nothing else (after ironically complaining about me speculating).

And no, standard QM does not disagree with the notion that two atoms, even if identical in structure, are still different objects in the sense that they are two (i.e. distinct from each other). This should be fairly trivial
 
  • Skeptical
  • Sad
Likes Motore, weirdoguy and PeroK
  • #41
Demystifier said:
3. The analogy with coin tossing is useful again. From the symmetric shape of the coin itself we can conclude that p(head)=p(tail)=1/2.
But that's only true before the coin is tossed, right?

Before tossing an unbiased coin the probability is 1/2. When we toss it, that probability changes. The environment during the toss continues to change that probability. Finally, the surface where the coin lands also changes the probability completely.

The coin toss is an example where information exists that determines the probability and outcomes, unlike QM. Even if the uncertainty principle holds and QM is probabilistic, the evolution of the probability and the state in QM is deterministic.
 
  • #42
Thread closed for moderation.
 
  • Like
Likes weirdoguy and ojitojuntos