Question on the probabilistic nature of QM

  • Thread starter Wormaldson
  • Start date
  • Tags
    Nature Qm
In summary, the conversation discusses the concept of "genuine randomness" in quantum mechanics and the difficulty of reconciling it with classical notions of causality and determinism. The speakers also mention hidden-variable theories and Bell's theorem, which rules out local hidden-variable versions of quantum mechanics. They also question the use of terms like "fundamental randomness" and "fundamental determinism", arguing that these are not scientifically testable concepts.
  • #36


bhobba said:
I mean it's fundamentally a probabilistic theory and not some kind of deterministic process masquerading as such, as say Bohmian Mechanics does. I firmly believe it is a fundamentally probabilistic theory - I reject completely Einstein's idea that it was incomplete - I have zero problem with God playing dice (although that of course was not Einstein's main objection - he was more concerned with an objective reality independent of observation - but as always his reasoning was subtle). I am simply pointing out that, as a matter of principle, QM may be the limit or approximation or whatever of some deterministic process, and there is no way it can be ruled out. I find it slightly puzzling why anyone would doubt it.

Thanks
Bill

It is only your last statement I am commenting on, as the rest I agree with pretty well.

You might acknowledge that after 80+ years, there has not been the slightest bit of evidence - nor any plausible hypothesis other than perhaps Bohmian class theories - that any underlying deterministic mechanism exists in nature. In that light, I wouldn't find it surprising to doubt it exists. I doubt it, for instance.

So yes, certainly it is possible, no issue there. On the other hand, newer ideas such as the PBR theorem cast significant doubt that there can be a deterministic solution. If the quantum state is fundamental, then there is no determining factor to uncover.
 
  • #37


bhobba said:
Describe to me the test that will prove 100% for sure it was not created by a deterministic process that passes such tests.
I would say you misunderstood what I asked.
In simple words - you provide a hypothetical example that demonstrates non-deterministic randomness, and I try to provide a test that should demonstrate that it is deterministic (according to our view of physical laws).

And please take into account that this example should supposedly work as an explanation for the genuine randomness of QM, i.e. I ask this question in the context of the OP:
Wormaldson said:
Problem is, I can't think of any classical situations in which this notion of genuine randomness actually applies.
 
  • #38


Wormaldson said:
So, finally, the question(s): a good place to start would certainly be, am I just interpreting the information wrong? Do we know for sure that quantum mechanics obeys this genuine-randomness-dependent behaviour? If not, then what do we suppose determines the behaviour of quantum mechanical phenomena? If so, then how is it that the behaviour is determined without a cause?

As always, any insight would be much appreciated. This has me quite puzzled.

Quantum mechanics--at least the most common type students learn--is statistically determinate, not random.

Do we have deterministic objects following statistical laws or statistical objects following deterministic laws? Does atomic structure quantize energy, or does energy quantize atomic structure? It doesn't matter which is which; any event would involve both aspects, so there is no difference.
Wormaldson said:
Problem is, I can't think of any classical situations in which this notion of genuine randomness actually applies.

Randomness is relative. In "classical situations", it's usually so small or uniform that we don't care about it, even though it's there. If the system is sensitive enough, however, we would notice. Of course, it doesn't have to be random in a uniform way, in which case again we might find patterns that appear "deterministic", along with some degree of accompanying "randomness".
 
  • #39


DrChinese said:
It is only your last statement I am commenting on, as the rest I agree with pretty well.

You might acknowledge that after 80+ years, there has not been the slightest bit of evidence - nor any plausible hypothesis other than perhaps Bohmian class theories - that any underlying deterministic mechanism exists in nature. In that light, I wouldn't find it surprising to doubt it exists. I doubt it, for instance.

So yes, certainly it is possible, no issue there. On the other hand, newer ideas such as the PBR theorem cast significant doubt that there can be a deterministic solution. If the quantum state is fundamental, then there is no determining factor to uncover.

I acknowledge and agree with everything you say. I am speaking of a matter of principle - not what I believe. IMHO standard QM is correct - BM etc and other outs are a crock.

Thanks
Bill
 
  • #40


zonde said:
I would say you misunderstood what I asked.
In simple words - you provide a hypothetical example that demonstrates non-deterministic randomness, and I try to provide a test that should demonstrate that it is deterministic (according to our view of physical laws).

And please take into account that this example should supposedly work as an explanation for the genuine randomness of QM, i.e. I ask this question in the context of the OP:

Please be 100% clear about what I am saying. I will repeat it again: I am saying there is no way, by any test currently available, you can tell a random sequence from one created by a well-designed deterministic algorithm. If you want specifics, let's say it was created by the Mersenne Twister algorithm. I give you such a sequence and you are required to tell me how you would determine whether it is genuinely random or made by the twister.

The last part of your requirement - namely that it 'should supposedly work as explanation for genuine randomness of QM' - is trivial, because you can simply postulate that some unknown process at the sub-quantum level mimics that algorithm. Is such a thing likely? Hell no - it would be a totally silly and laughable hypothesis - but again this is a matter of principle, not of reasonableness. Reason, Occam's Razor, all sorts of stuff tells me QM is genuinely random.

If you think the above is outlandish and physically unreasonable, you are correct. If that is your concern about what I am saying, then let's pin it down to something more physically reasonable. I give you the results of a double slit experiment - namely the positions of the detected particles. Tell me how you would tell the difference between it being genuinely random and what is predicted by BM, which is deterministic but where the randomness is a result of factors not under the control of the experimenter, though presumably in principle knowable.
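As a concrete illustration of the point about tests (a sketch not from the original post - Python's standard random module happens to implement the Mersenne Twister):

```python
import random
from collections import Counter

rng = random.Random(12345)            # CPython's Random class is the Mersenne Twister
sample = [rng.randrange(10) for _ in range(100_000)]

# A chi-square frequency test: compare observed digit counts to the
# uniform expectation, as one would for a "genuinely random" source.
counts = Counter(sample)
expected = len(sample) / 10
chi2 = sum((counts[d] - expected) ** 2 / expected for d in range(10))

# With 9 degrees of freedom, a truly random source gives chi2 ~ 9 on
# average; the deterministic generator lands in the same range.
print(f"chi-square statistic: {chi2:.2f}")
```

Passing one such test proves nothing by itself, of course; the point is that no finite battery of statistical tests can certify a sequence as "genuinely" random rather than well-designed pseudorandom.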

Thanks
Bill
 
  • #41


Quantum randomness is tied strongly to wavefunction collapse. It is in the same can of worms. "No-collapse" interpretations such as MWI or BM are automatically deterministic. The "appearance of collapse FAPP" naturally translates into "appearance of randomness FAPP" (where the "apparent randomness FAPP" is indistinguishable from "genuine randomness" by any experimental test, a notion I can comfortably live with). True 'genuine randomness' is equivalent to objective collapse. "Consciousness causes collapse" is translated into "consciousness is the source of randomness", etc. So by making a statement about the nature of randomness one implicitly adopts or rejects a particular interpretation. Choose your poison.

Personally I don't see what the fuss is about. We know that quantum randomness only appears during the measurement process. We also know that this process necessarily involves the interaction of one microscopic system being measured with a huge number of interacting microscopic systems making up the measuring apparatus and its environment. It is only natural to expect that the initial state of the apparatus and/or the environment influences the outcome. Since we do not know the initial state (and cannot possibly know it all even if we tried, due to the no-cloning theorem), it should be no surprise that the outcome appears random.
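A toy model of this idea (an illustration, not Delta Kilo's own construction): let the outcome be a deterministic function of an apparatus microstate the experimenter never sees, and the record of outcomes is indistinguishable from coin flips.

```python
import random

def measure(system_bit, apparatus_microstate):
    # Fully deterministic rule: the outcome depends on a hidden detail of
    # the apparatus - here, just the parity of its microstate.
    return system_bit ^ (apparatus_microstate & 1)

environment = random.Random(0)   # deterministic stand-in for the unknowable initial state
outcomes = [measure(0, environment.getrandbits(32)) for _ in range(20)]
print(outcomes)   # looks like fair coin flips, yet nothing above is random
```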
 
  • #42


bhobba said:
Please be 100% clear about what I am saying. I will repeat it again: I am saying there is no way, by any test currently available, you can tell a random sequence from one created by a well-designed deterministic algorithm. If you want specifics, let's say it was created by the Mersenne Twister algorithm. I give you such a sequence and you are required to tell me how you would determine whether it is genuinely random or made by the twister.
Well that's trivial - take the algorithm, take the same seed and you get the same result.
This of course is not genuine randomness, as we can clearly identify the cause: it's the seed. And with the same seed (the same cause) the algorithm is always going to give the same result, i.e. no randomness.
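In code, the point is immediate (a minimal sketch using Python's seeded generator):

```python
import random

a = random.Random(42)     # same seed...
b = random.Random(42)     # ...same "cause"
print([a.random() for _ in range(3)])
print([b.random() for _ in range(3)])   # identical output: no randomness left
```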
 
  • #43


Delta Kilo said:
It is only natural to expect that the initial state of the apparatus and/or the environment influences the outcome. Since we do not know the initial state (and cannot possibly know it all even if we tried, due to no-cloning theorem), it should be no surprise that the outcome appears random.

Indeed, that's very natural, and I guess nobody has a problem with this. However, the resulting randomness breaks the linearity of the evolution. There is no way a linear evolution can create outcomes that depend on the magnitude of components.

So the problem is quite a bit deeper than just identifying a source of randomness. You have to explain the nonlinearity of the observation and the exact distribution of the random outcomes.
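The linearity point can be made concrete in a few lines (a sketch of the general argument, not Jazzdude's notation):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # any unitary will do; Hadamard here

e0 = np.array([1.0, 0.0])                      # basis state |0>
e1 = np.array([0.0, 1.0])                      # basis state |1>
psi = (e0 + e1) / np.sqrt(2)                   # superposition of the two

# Linearity: U acting on a superposition equals the superposition of U
# acting on each branch, so unitary evolution alone never selects a branch.
print(np.allclose(H @ psi, (H @ e0 + H @ e1) / np.sqrt(2)))   # True
```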
 
  • #44


zonde said:
The scientific method (testing in particular) is based on the concept of causation. As a result, anything that can't be interpreted from the perspective of causation is non-scientific.
This is such an interesting and important issue that it probably calls for its own thread, but I'll just answer briefly that it is highly debatable that the concept of cause is at all important in physics. I would go so far as to argue that the concept of a cause is not even definable in physics; the definition appears more at the level of human interaction with our environment, which is well separated from the laws themselves.

One simple reason for this is the tendency for the laws of physics to be time-reversible. One key ramification of this is that "what causes what" is very much a kind of sociological construct, that has a lot more to do with what we use science for than it has to do with the laws of physics. So I would agree that "causation is important in science", but that's because human interaction with, and involvement in, our environment is indeed important in science. Science is a human endeavor. But the laws can still be expressed in language that is completely devoid of "causes", and the laws are still the same laws-- it is just a popular way of interpreting the laws because it gibes well with what we use science to do.
 
  • #45


ThomasT said:
But the fact that many macroscopic processes on many scales are trackable and in accordance with deterministic laws would seem to indicate that the underlying processes are also deterministic (ie., lawful). Unless there's some reason to believe that the reality underlying instrumental behavior is essentially different from the macroscopic reality of our senses, and that our ignorance thereof is not just a matter of the limitations of our sensory capabilities.
But there is a very good reason to believe that-- it is almost inevitably true! Why on Earth would our senses, which are presumably derived from a huge amalgamation of microscopic processes that we are trying to understand, not be essentially different from those processes? Are not the actions of an ant colony essentially different from what an individual ant is doing? Is not what a violinist is doing essentially different from what the particles in a violin are doing? I disagree with the implication that the default assumption is that our way of thinking about and interacting with reality should be the same as what reality is "actually doing"; it seems clear to me that the default assumption should be that we are filtering reality to get it to serve our needs, needs that are extremely dependent on what humans are and what we want to do.

So when our filters give us results that allow deterministic interpretations of macro phenomena, we should always expect that to be emergent behavior, just as we expect the way a fluid flows through a nozzle to be emergent from what the atoms are actually doing, and what atoms are actually doing to be emergent from what quarks and fields are doing, and so on ad infinitum (and I say this without necessarily committing to the idea that the universe is built entirely bottom-up). We don't get to know what it is "emergent" from, because even that could also be emergent. We just have to recast what it is we are trying to know about reality.

The key point is that we have no difficulty interpreting seemingly deterministic behavior as emergent from random behavior, that's pretty much the field of statistical mechanics. Also, we have no difficulty interpreting seemingly random behavior as emergent from deterministic behavior, that is what Delta Kilo described so succinctly. These are all just interpretations, but we can't "reason by interpretation." Reality is just not going to give up these secrets, all we can do is make good models and interpret them however it works for us. Sometimes that leads to a consensus interpretation, sometimes it doesn't, but reality is not beholden to our interpretations, any more than you are limited to be what your dog thinks you are.
 
  • #46


Ken G said:
But there is a very good reason to believe that-- it is almost inevitably true! Why on Earth would our senses, which are presumably derived from a huge amalgamation of microscopic processes that we are trying to understand, not be essentially different from those processes? Are not the actions of an ant colony essentially different from what an individual ant is doing? Is not what a violinist is doing essentially different from what the particles in a violin are doing?
It depends on what one is referring to by "essentially". In the context of this thread, I'm supposing that "essentially different" refers to lawful vs nonlawful (ie., deterministic vs nondeterministic) processes or evolutions. Ants, ant colonies, violins, violinists, orchestras, and everything else I can think of, all seem to evolve deterministically.

Beyond that, quantum experimental phenomena, and the theories and models associated with them, seem to me to indicate that the underlying physical world is composed of a vast hierarchy of particulate media. Since I can characterize the macroscopic world of my sensory experience in that way also, and since our sensory machinery is, afaik, vibratory (that is, we detect frequencies wrt various media), and since there are so many examples of strikingly similar phenomena on so many different scales, then it seems logical to me to suppose that any and all behavior at any and all scales has a common ancestor or fundamental dynamical law(s) governing everything.

Ken G said:
... when our filters give us results that allow deterministic interpretations of macro phenomena, we should always expect that to be emergent behavior ...
I agree, and the notion of encompassing fundamental laws (ie., a fundamentally deterministic universe) is compatible with emergence.

Ken G said:
... I say this without necessarily committing to the idea that the universe is built entirely bottom-up ...
If you mean from small to large, then I agree. But the bottom, ie., the most fundamental, might also refer to behavioral principles or dynamical laws.

Ken G said:
The key point is that we have no difficulty interpreting seemingly deterministic behavior as emergent from random behavior, that's pretty much the field of statistical mechanics. Also, we have no difficulty interpreting seemingly random behavior as emergent from deterministic behavior, that is what Delta Kilo described so succinctly.
Yes, that seems to be the case.

Ken G said:
These are all just interpretations, but we can't "reason by interpretation." Reality is just not going to give up these secrets, all we can do is make good models and interpret them however it works for us.
So, aren't we reasoning, regarding the nature of reality, via interpretation?
 
  • #47


Ken G said:
This is such an interesting and important issue that it probably calls for its own thread, but I'll just answer briefly that it is highly debatable that the concept of cause is at all important in physics. I would go so far as to argue that the concept of a cause is not even definable in physics; the definition appears more at the level of human interaction with our environment, which is well separated from the laws themselves.
Basic concepts are not definable. Are you familiar with axiomatic systems and the undefined terms in them?

Ken G said:
One simple reason for this is the tendency for the laws of physics to be time-reversible.
This is because we use math a lot for the formulation of laws. Math works when quantities are conserved. When quantities are not conserved, we combine different quantities so that the combination is conserved. This is the bias introduced by the extensive usage of math.

Ken G said:
One key ramification of this is that "what causes what" is very much a kind of sociological construct, that has a lot more to do with what we use science for than it has to do with the laws of physics. So I would agree that "causation is important in science", but that's because human interaction with, and involvement in, our environment is indeed important in science. Science is a human endeavor.
Any experimental test starts with things that we can do (cause); from this point we can go further. So it's not just important, it's the basis of science.

Ken G said:
But the laws can still be expressed in language that is completely devoid of "causes", and the laws are still the same laws-- it is just a popular way of interpreting the laws because it gibes well with what we use science to do.
Some simple example, please.
 
  • #48


Jazzdude said:
Indeed, that's very natural, and I guess nobody has a problem with this. However, the resulting randomness breaks the linearity of the evolution. There is no way a linear evolution can create outcomes that depend on the magnitude of components.

So the problem is quite a bit deeper than just identifying a source of randomness. You have to explain the nonlinearity of the observation and the exact distribution of the random outcomes.
I would agree that the problem is a bit deeper.
I would say that it's a certain lack of randomness that is puzzling when we speak about interference, rather than excess randomness. And it's similar with entanglement.
 
  • #49


ThomasT said:
It depends on what one is referring to by "essentially". In the context of this thread, I'm supposing that "essentially different" refers to lawful vs nonlawful (ie., deterministic vs nondeterministic) processes or evolutions. Ants, ant colonies, violins, violinists, orchestras, and everything else I can think of, all seem to evolve deterministically.
I see you are not a fan of "systems" thinking, but rather are a strict reductionist? For myself, I see a lot of value in the "systems" viewpoint (that the action of complex systems is better understood as an interplay between top-down coupling constraints and bottom-up independent processes than with a purely reductionist approach in which the whole is understood purely by considering the elementary parts). But more to the point, I would certainly not say that what an orchestra is doing is strictly deterministic! It certainly cannot be demonstrated in detail to be deterministic, nor precisely predicted as a deterministic process, so the issue must boil down to whichever one views as the "default" assumption. I think many physicists are way too quick to picture determinism as the default; there really aren't any solid reasons to adopt that stance-- it's simple overinterpretation, in my view.
ThomasT said:
Beyond that, quantum experimental phenomena, and the theories and models associated with them, seem to me to indicate that the underlying physical world is composed of a vast hierarchy of particulate media.
But what do we mean "composed of"? Strictly composed of that? There's no question the particulate model is vastly important and successful, but so is the fields model, so at the very least we might wish to say the physical world is composed of particles and fields. But I wouldn't even say that-- I would just say our models invoke particles and fields, and what the "underlying physical world" is composed of is simply not a concept that physics needs, and we never get to know that, not even using physics.

ThomasT said:
Since I can characterize the macroscopic world of my sensory experience in that way also, and since our sensory machinery is, afaik, vibratory (that is, we detect frequencies wrt various media), and since there are so many examples of strikingly similar phenomena on so many different scales, then it seems logical to me to suppose that any and all behavior at any and all scales has a common ancestor or fundamental dynamical law(s) governing everything.
Yes, the rationalistic view that laws "govern" reality, rather than reality "governs" what we will interpret as laws. That debate has raged as long as there has been thought about our environment; let me just say it is an extremely unlikely proposition, and it has never stood the test of time, a fact we all too easily overlook.
ThomasT said:
I agree, and the notion of encompassing fundamental laws (ie., a fundamentally deterministic universe) is compatible with emergence.
Not really-- not unless you think that some phenomena emerge and other, more fundamental ones, don't. But if you hold, as I do, that all phenomena are emergent, and that there is never going to be any such thing as a fundamental process (nor does there need to be to do physics exactly as we do it), then the notion of encompassing fundamental laws is not compatible with emergence, because even the laws must emerge from something else (given that no law deals in the currency of something fundamental, but rather only in emergent phenomena). It seems a more natural "default" assumption, being the only one that actually has stood the test of time!

ThomasT said:
If you mean from small to large, then I agree.
I do, the common idea is that large phenomena emerge from small phenomena. But I'm not claiming that to be true, I think emergence can also cascade from large to small (as in the case of a violinist manipulating the instrument in a way that ultimately affects its atoms). But it is no longer important to specify what emerges from what if there is nothing fundamental that is "at the bottom" anyway.
ThomasT said:
So, aren't we reasoning, regarding the nature of reality, via interpretation?
I would argue no-- not if we are being precise about what we are doing. When we get a little casual about expressing what physics does, we often frame it as reasoning about the nature of reality, but Bohr had it right-- physics is what we can say about nature. I believe he meant that this means physics is not about nature herself, it is about our interaction with nature. We can interpret what we are doing around our interaction with nature, because we need to interpret our goals and objectives, but we are not interpreting the "nature of reality"-- as soon as you interpret that, it ain't the nature of reality any more.
 
  • #50


zonde said:
Basic concepts are not definable. Are you familiar with axiomatic systems and the undefined terms in them?
If you hold that a "cause" is an axiom in physics, please specify a theory, any theory, that requires that in its axiomatic structure. I'm not aware of any, causes are sociological constructs we add on top of our theories to help us interpret them, no laws of physics refer to causes that I've ever heard of. This is clear from the simple fact that you would need to immediately remove from consideration any laws that are time reversible, so gone are Newton's laws, the Schroedinger equation, and general relativity.
zonde said:
Any experimental test starts with things that we can do (cause); from this point we can go further. So it's not just important, it's the basis of science.
No, you don't need to imagine you are causing something to do a scientific experiment. That we often do that is indeed our sociology, but it's not a requirement. If I drop a mass in my experiment, I never need to imagine that I "caused the mass to fall", or that gravity did, I am just setting up an experiment and watching what happens. No causation necessary, indeed causation brings in significant philosophical difficulties (around free will and so on). But I agree that we do invoke causation concepts constantly when we do science, and that's because science is a human endeavor, and humans use causation concepts in our daily lives all the time-- it's part of our sociology.
zonde said:
Some simple example, please.
Give me any phenomenon of your choosing that you feel must be described in terms of causes and effects, and I will offer a perfectly successful way to describe that same phenomenon without invoking those concepts at all.
 
  • #51


Ken G said:
I see you are not a fan of "systems" thinking ...
I think "systems" thinking is very appropriate and useful. But I think it reasonable to suppose that systems emerge from more fundamental, underlying, dynamical laws.

Ken G said:
... but rather are a strict reductionist?
Only in the behavioral (ie., wrt dynamical law) sense. Not wrt scales of size.

Ken G said:
For myself, I see a lot of value in the "systems" viewpoint (that the action of complex systems is better understood as an interplay between top-down coupling constraints and bottom-up independent processes than with a purely reductionist approach in which the whole is understood purely by considering the elementary parts).
I agree. Just that, since I think it reasonable to assume the existence of a fundamental dynamics (ie., fundamental dynamical laws/constraints) applicable to any behavioral scale, then I also suppose that no viable ontology or epistemology can be independent from the fundamental dynamical laws/constraints.

Ken G said:
But more to the point, I would certainly not say that what an orchestra is doing is strictly deterministic!
There isn't anything that I can think of that can be said to be strictly deterministic on the macroscopic level of our sensory experience, in the sense of being devoid of unpredictable occurrences. But that doesn't contradict the inference of an underlying determinism.

Ken G said:
It certainly cannot be demonstrated in detail to be deterministic, nor precisely predicted as a deterministic process, so the issue must boil down to whichever one views as the "default" assumption.
I think what it boils down to is the preponderance of evidence, which, imo, leads to the assumption of a fundamental determinism (ie., a universe evolving in accordance with fundamental dynamical law(s)).

Ken G said:
I think many physicists are way too quick to picture determinism as the default; there really aren't any solid reasons to adopt that stance-- it's simple overinterpretation, in my view.
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful). If the latter, then how are we to understand the emergence of physical laws at the level of our sensory apprehension?

Ken G said:
But what do we mean "composed of"? Strictly composed of that?
Yes. Media, at any scale, which can be analysed in terms of their particular particulate constituents, but disturbances in which seem to be governed by fundamental dynamical law(s).

Ken G said:
There's no question the particulate model is vastly important and successful, but so is the fields model, so at the very least we might wish to say the physical world is composed of particles and fields.
Fields are just groupings of particles endowed with certain properties. Physical science hasn't yet gotten to explaining things in terms of, or positing, fundamental dynamical law(s).

Ken G said:
... I would just say our models invoke particles and fields, and what the "underlying physical world" is composed of is simply not a concept that physics needs, and we never get to know that, not even using physics.
I think that certain things can be inferred from the extant physics, and that as the field of instrumentation and detection advances, then even more will be able to be inferred about the reality underlying instrumental behavior.

Ken G said:
Yes, the rationalistic view that laws "govern" reality, rather than reality "governs" what we will interpret as laws. That debate has raged as long as there has been thought about our environment; let me just say it is an extremely unlikely proposition, and it has never stood the test of time, a fact we all too easily overlook.
What's wrong with the view that reality, and the limitations of our sensory capabilities, govern what we will interpret as laws, and that, also, there are laws that govern reality?

Ken G said:
Not really-- not unless you think that some phenomena emerge and other, more fundamental ones, don't. But if you hold, as I do, that all phenomena are emergent, and that there is never going to be any such thing as a fundamental process (nor does there need to be to do physics exactly as we do it), then the notion of encompassing fundamental laws is not compatible with emergence, because even the laws must emerge from something else (given that no law deals in the currency of something fundamental, but rather only in emergent phenomena). It seems a more natural "default" assumption, being the only one that actually has stood the test of time!
This doesn't make any sense to me. I'm not saying that you can fashion a workable physics based on the assumption of the existence of a fundamental dynamic(s), but only that this assumption is compatible with the exercise of scientific inquiry and the preponderance of physical evidence, and that the assumption that our world, our universe, is evolving fundamentally randomly isn't.

Ken G said:
... the common idea is that large phenomena emerge from small phenomena. But I'm not claiming that to be true, I think emergence can also cascade from large to small (as in the case of a violinist manipulating the instrument in a way that ultimately affects its atoms). But it is no longer important to specify what emerges from what if there is nothing fundamental that is "at the bottom" anyway.
I think it reasonable to suppose that there is something fundamental, and that it has nothing to do with size.

Ken G said:
I would argue no-- not if we are being precise about what we are doing. When we get a little casual about expressing what physics does, we often frame it as reasoning about the nature of reality, but Bohr had it right-- physics is what we can say about nature. I believe he meant that this means physics is not about nature herself, it is about our interaction with nature. We can interpret what we are doing around our interaction with nature, because we need to interpret our goals and objectives, but we are not interpreting the "nature of reality"-- as soon as you interpret that, it ain't the nature of reality any more.
Well, I disagree. I think that modern physical science has revealed certain things about the underlying reality, and that future science, assuming advances in technology, will reveal more. And of course, it's all subject to interpretation.
 
  • #52


zonde said:
Yes, cause is part of interpretation.


Let's say I do not believe you that it is possible, namely that a physical phenomenon can be accurately predicted without the concept of causation.

The scientific method (testing in particular) is based on the concept of causation. As a result, anything that can't be interpreted from the perspective of causation is non-scientific.

I agree.
 
  • #53


ThomasT said:
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful). If the latter, then how are we to understand the emergence of physical laws at the level of our sensory apprehension?
I assume you're asking how underlying nondeterministic laws of physics lead to us experiencing a world that seems to conform quite well to deterministic laws. Well, the answer to that is well-known. Decoherence explains how the randomness of quantum mechanics gives rise to the appearance that the macroscopic world conforms to classical physics.
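The standard toy calculation behind that claim (a sketch, assuming the usual model in which each of n environment qubits imperfectly records the system state, with per-qubit overlap <e0|e1> = c) shows the interference terms dying exponentially:

```python
import numpy as np

# System qubit starts in (|0> + |1>)/sqrt(2). After entangling with n
# environment qubits, its reduced density matrix is
#   [[1/2, c**n / 2], [c**n / 2, 1/2]]
# so the off-diagonal (interference) term is suppressed by <E0|E1> = c**n.
c = 0.7
for n in (0, 5, 20, 50):
    rho = np.array([[0.5, 0.5 * c**n], [0.5 * c**n, 0.5]])
    print(f"n = {n:2d} environment qubits: coherence = {rho[0, 1]:.3e}")
```

The diagonal that survives is exactly a classical probability table, which is why the macroscopic world looks classical.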
 
  • #54


lugita15 said:
I assume you're asking how underlying nondeterministic laws of physics lead to us experiencing a world that seems to conform quite well to deterministic laws. Well, the answer to that is well-known. Decoherence explains how the randomness of quantum mechanics gives rise to the appearance that the macroscopic world conforms to classical physics.
Decoherence is not enough to explain or justify macroreality, i.e. classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence:
"What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory."
 
  • #55


yoda jedi said:
Decoherence is not enough to explain or justify macroreality, i.e. classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence:
"What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory."

Yes, I completely agree. All the different interpretations of QM easily accommodate decoherence, yet their basic differences remain, as do their very different ways of dealing with the measurement problem.
 
  • #56


yoda jedi said:
Decoherence is not enough to explain or justify macroreality, i.e. classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf



-------
Joos, a leading adherent of decoherence:
"What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory."
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.
 
  • #57


lugita15 said:
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.

Interesting paper.

And indeed there is disagreement on whether decoherence solves the measurement problem. Most people (including me) seem to think it doesn't - what it does, however, is give the appearance of wave function collapse, so for all practical purposes it resolves the issue - but in a different way than the collapse problem was formulated. IMHO it removes the central mystery of the superposition principle - how a system can be partly in one state and partly in another, so that the normal rules of logic are cock-eyed - and replaces it with a simple probability of being in one state or the other - but definitely in some state - not in this weird superposition.
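The contrast between "weird superposition" and "definitely in some state, with probabilities" is visible directly in the density matrix (a minimal illustration, not bhobba's own calculation):

```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)      # the |+> state

rho_superposition = np.outer(plus, plus)      # pure (|0>+|1>)/sqrt(2): off-diagonals present
rho_decohered = np.diag([0.5, 0.5])           # after decoherence: a plain 50/50 mixture

# Probability of finding |+> is tr(rho |+><+|): the superposition shows
# full interference (1.0); the decohered state behaves like a classical coin (0.5).
P_plus = np.outer(plus, plus)
print(np.trace(rho_superposition @ P_plus).real)   # 1.0
print(np.trace(rho_decohered @ P_plus).real)       # 0.5
```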

Thanks
Bill
 
  • #58


ThomasT said:
I think "systems" thinking is very appropriate and useful. But I think it reasonable to suppose that systems emerge from more fundamental, underlying, dynamical laws.
But that just isn't systems thinking. Systems thinking is that you can't understand systems adequately if all you use is bottom-up dynamical laws. If they thought you could, they wouldn't need systems thinking. The idea is that you cannot understand the interaction between top-down constraints and bottom-up dynamical laws if all you have is bottom-up dynamical laws, from which it follows that the universe cannot be "run" purely with bottom-up dynamical laws (even if you are inclined to imagine that the universe is "run" by any kind of mathematical structure).
ThomasT said:
Just that, since I think it reasonable to assume the existence of a fundamental dynamics (ie., fundamental dynamical laws/constraints) applicable to any behavioral scale, then I also suppose that no viable ontology or epistemology can be independent from the fundamental dynamical laws/constraints.
The problem is, there is no way to distinguish that claim from the simpler statement "ontologies used to interpret and apply physics are based on dynamical laws/constraints." This is simply a statement of what defines physics, there is no need whatsoever to graduate it to a claim on the existence of anything. Indeed, the history of physics is quite clear that we do not need things to actually exist in order to use them quite effectively in physics (a glaring example being Newton's force of gravity, which is still used constantly in physics, even though its "existence" is deeply in doubt).
ThomasT said:
There isn't anything that I can think of that can be said to be strictly deterministic on the macroscopic level of our sensory experience, in the sense of being devoid of unpredictable occurrences. But that doesn't contradict the inference of an underlying determinism.
I'm just going to let those words sit for awhile. Could there be a more clear example of pushing a preconception down nature's throat? I see this as a very common attitude in physics, but I would like to call it into question: the idea that we should regard a given attitude as true as long as we can rationalize it. This strikes me as just exactly what Popper complained about in regard to some theories of his day that were regarded as high science at the time, and which Popper felt were basically a fraud.
ThomasT said:
I think what it boils down to is the preponderance of evidence, which, imo, leads to the assumption of a fundamental determinism (ie., a universe evolving in accordance with fundamental dynamical law(s)).
The evidence is that determinism isn't strictly true, but is a useful interpretation for making functionally successful predictions within limits. That is certainly not a preponderance of evidence that determinism is actually true at some unseen yet imagined deeper level. We have a name for that unseen deeper level: fantasy. All the same, it is in the mission statement of physics to look for effective determinism at the functional level we can actually observe, without any requirement to assume there exists some unseen deeper level where it's really true.
ThomasT said:
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful).
But either of those assumptions is both unsubstantiated and unnecessary. You seem to overlook the more basic assumption: assume the universe is neither, it's just the universe. The idea that it has to be one or the other is simply mistaking the map for the territory, it's like saying we can either use a road map or a topographical map to navigate our path, so we must assume reality is fundamentally composed of either roads or mountains.
ThomasT said:
Fields are just groupings of particles endowed with certain properties.
Yet someone else can say that particles are just groupings of fields endowed with certain properties (and many do say that). There is no falsifiability in these claims, they are essentially personal philosophies. They are fine to use as devices for empowering your own approach to physics, but they are not, nor need to be, claims on what really is. This is actually a very good thing for physics, because physics would be quite impossible if it only worked if we could all agree on issues like whether particles or fields are more "fundamental." (Ask ten particle physicists to describe their own personal view of what a particle actually is, and be prepared to hear ten different answers. I know one who says "particles are a hoax".)
ThomasT said:
What's wrong with the view that reality, and the limitations of our sensory capabilities, govern what we will interpret as laws, and that, also, there are laws that govern reality?
I hear two totally different claims in the first and second part of that sentence, and an implication of an inference between them. The claim in the first part is just demonstrably how we do physics, so I have no issue with that. The claim at the end is kind of tacked on, with no necessary connection to the first part, and that is where the issue lies. There's a difference between using that second part as a philosophy behind one's own approach to the first part, versus claiming that the second part is a scientific inference from the first part. There is actually quite little evidence that the inference follows, and a host of evidence in the history of the trials and tribulations of science that it doesn't. Neither of those facts make the conclusion wrong-- they just don't make it right either. It doesn't follow.
ThomasT said:
I'm not saying that you can fashion a workable physics based on the assumption of the existence of a fundamental dynamic(s), but only that this assumption is compatible with the exercise of scientific inquiry and the preponderance of physical evidence, and that the assumption that our world, our universe, is evolving fundamentally randomly isn't.
I agree that we have no basis to say the universe is evolving fundamentally randomly, but we also have no basis to say it is evolving fundamentally deterministically. We have no basis to say it is "fundamentally" doing anything other that what we observe it to be doing. What is fundamental in physics is very much a moving target and always should be, for that is science. What is "fundamental in reality" is so impossible to define scientifically that I can't see why we even need the phrase.
ThomasT said:
I think it reasonable to suppose that there is something fundamental, and that it has nothing to do with size.
I have no problem with you finding that reasonable. People find all kinds of things reasonable, for all kinds of personal reasons, and that is part of what you own, it is a right of having a brain. My issue is with the claim that this is somehow a logical inference based on evidence, when in fact the evidence is either absent, or to the contrary, as long as one avoids the trap of imagining that whatever is untested will still work. We need a "Murphy's law of science" (if a theory can be wrong, it will) to keep our views consistent with the actual history of this discipline!
ThomasT said:
Well, I disagree. I think that modern physical science has revealed certain things about the underlying reality, and that future science, assuming advances in technology, will reveal more.
What I wonder is, why do you think that your saying that is any different from Ptolemy saying it, or Newton? The history of physics is a history of great models that helped us understand and gain mastery over our environment, but it is not a history of our great models actually being the same as some "underlying reality." Instead, our great models have been like shadows that fit some projection of reality but are later found to not be the reality. What I don't get is, why do we have to keep pretending that this is not just exactly the whole point of physics?
 
  • #59


Ken G said:
If you hold that a "cause" is an axiom in physics, please specify a theory, any theory, that requires that in its axiomatic structure. I'm not aware of any, causes are sociological constructs we add on top of our theories to help us interpret them, no laws of physics refer to causes that I've ever heard of. This is clear from the simple fact that you would need to immediately remove from consideration any laws that are time reversible, so gone are Newton's laws, the Schroedinger equation, and general relativity.
No, I hold that "cause" is undefined term (or primitive notion) in science.
And it is used in formulation of prediction: "<this> causes <that>".

Ken G said:
No, you don't need to imagine you are causing something to do a scientific experiment. That we often do that is indeed our sociology, but it's not a requirement. If I drop a mass in my experiment, I never need to imagine that I "caused the mass to fall", or that gravity did, I am just setting up an experiment and watching what happens. No causation necessary, indeed causation brings in significant philosophical difficulties (around free will and so on). But I agree that we do invoke causation concepts constantly when we do science, and that's because science is a human endeavor, and humans use causation concepts in our daily lives all the time-- it's part of our sociology.
We imagine that we are free (our ideas are the main cause of the particular design of the experimental setup) to set up the experiment as we want.

Ken G said:
Give me any phenomenon of your choosing that you feel must be described in terms of causes and effects, and I will offer a perfectly successful way to describe that same phenomenon without invoking those concepts at all.
x = vt, or "the velocity of the body causes a linear change in the position of the body".
 
  • #60


It is true that decoherence doesn't solve the measurement problem in that there's more work to do -- much in the same way that one can't claim the kinetic theory of gases explains the ideal gas law until you figure out how to actually quantify how pressure is an emergent property of particle interactions.

But most of the objections I've seen aren't on the grounds that there's more work to do, but that it's fundamentally missing the point, and this is where I have to disagree. The emergence of 'classical' probability distributions on relative states from unitary evolution suggests that 'absolute' definiteness is not a meaningful idea, in much the same way that Einstein's train thought experiment suggests that absolute simultaneity is not a meaningful idea.

In my estimation, the dissatisfaction with the decoherence solution to the measurement problem looks very much like a reluctance to give up the notion of absolute definiteness.

Instead, what we have is relative definiteness. Conditioned on the hypothesis that I toss a baseball upwards with a velocity v, the probability that it reaches a height of roughly [itex]v^2 / (2g)[/itex] is (nearly) 1.

This fact does not require the belief that when 'God' looks at the universe, he sees that I have definitely thrown the baseball upwards with velocity v as opposed to some mixture or superposition or ensemble or whatever of various different possibilities.

Nor to derive this fact am I required to use a mathematical model that includes me definitely tossing a baseball upwards with velocity v as opposed to, e.g., using a state smeared out across configuration space.

But the assumption of absolute definiteness would insist on both things. And the habit of assuming absolute definiteness can be difficult to break -- one becomes so accustomed to phrasing questions absolutely that it becomes difficult to weaken it to a relative question. And to be fair, prior to QM there wasn't much incentive to do so.
 
  • #61


Hurkyl said:
But most of the objections I've seen aren't on the grounds that there's more work to do, but that it's fundamentally missing the point, and this is where I have to disagree. The emergence of 'classical' probability distributions on relative states from unitary evolution suggests that 'absolute' definiteness is not a meaningful idea, in much the same way that Einstein's train thought experiment suggests that absolute simultaneity is not a meaningful idea.

In my estimation, the dissatisfaction with the decoherence solution to the measurement problem looks very much like a reluctance to give up the notion of absolute definiteness.
You mean reluctance to accept many worlds (realities)? Or reluctance to accept many possible interpretations of a single world (reality)?
 
  • #62


zonde said:
You mean reluctance to accept many worlds (realities)? Or reluctance to accept many possible interpretations of a single world (reality)?

I think he means a reluctance to accept that the world is basically not deterministic but rather can only be described in terms of probabilities. Decoherence does not tell us how a particular result is singled out - it only gives probabilities - but it does tell us a system is in one state only - not a weird combined state such as in Schroedinger's Cat where the cat is in a weird superposition of alive and dead - rather it is either alive or dead - but all you can predict is probabilities - no mechanism is offered on how alive or dead is determined.

Personally I have no problem with this at all and believe decoherence solves the basic problem of QM - but each to his/her own.

And indeed more work needs to be done - but to me the basic message is clear - leaking of phase to the environment stops systems in general being in a superposition of states. Of course there are exceptions such as superconductivity etc - but in the vast majority of situations here in the macro world QM weirdness is hidden by decoherence.

Thanks
Bill
 
  • #63


zonde said:
You mean reluctance to accept many worlds (realities)? Or reluctance to accept many possible interpretations of a single world (reality)?
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory have objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.

But this is the premise of the entire class of decoherence-based interpretations. Decoherence-upon-measurement even has the exact same mathematical form as collapse-upon-measurement, but without interpreting the probabilities as ignorance of the system
bhobba said:
being in one state or the other - but definitely in some state - not in this weird superposition.
More ambitious approaches hope for macroscopic decoherence to be an emergent property of unitary evolution. The relative state interpretation (i.e. many worlds) studies unitary evolution directly and its effect on subsystems. Bohmian mechanics likewise keeps the indefiniteness of the wave-function, but shows its (definitely located) particles tend towards the distribution of the wave-function.

Even interpretations that aren't decoherence-based can allow for this indefiniteness. For example, Rovelli's paper on relational quantum mechanics analyzes the Wigner's Friend thought experiment and argues to the effect that Wigner's analysis would be
My friend has opened the box and remains in an indefinite state, but one entangled with Schrödinger's cat. Their joint state collapsed to a live cat when I asked him about the results.​
and Wigner's friend's analysis would be
I opened the box and saw a live cat! I told Wigner when he asked.​
and both analyses would be equally valid. (actually, I'm not entirely sure if RQM is decoherence-based or collapse-based or agnostic about it. Really, I didn't like the paper other than this point of view on the Wigner's friend thought experiment, and don't remember the rest at all)
I liken the rejection of indefiniteness to the person who studies Newtonian mechanics but, rather than setting up an inertial reference frame, instead carefully sets up coordinates in which the observer is always at the origin and at rest, and refuses to understand the laws of mechanics presented in any other coordinate system. After all, when he looks around, he always sees things from his perspective; working with a coordinate chart centered elsewhere would be nonphysical and meaningless!
 
  • #64


Hurkyl said:
More ambitious approaches hope for macroscopic decoherence to be an emergent property of unitary evolution.

True - but they need further investigation and development. Right now I am happy with phase being leaked.

Thanks
Bill
 
  • #65


I've already said, and nobody cared (probably doing justice to my ignorance!), that to me the universe is deterministic. But not in a Bohmian way - perhaps more in a Many Worlds way, but without the split of universes.
I'm going to insist on my idea because I can't see what is wrong with it, and I really don't like the random point of view, mainly because the evolution equation (Schrödinger or whatever) is deterministic, so every experiment, idealized as the evolution equation applied to the system + the instrument, should be deterministic. So my way to unify the deterministic property of the evolution equation with the random nature of experiments is to say that one can never know the exact state of the instrument, and that adds an apparent randomness to the final state of the system.
Gleason's theorem states, in some way, that if an experiment is going to be made on a Hilbert-space-modeled system, and the result is random and depends only on the initial state of the system, then the probabilities should be calculated with the Born rule. In this case, I say it again, the only way to introduce randomness into the experiment is by not knowing the exact state of the instrument, while making sure that this ignorance does not make the system go deterministically to one state (because in that situation it would not be called an experiment, just an "interaction").
I'm sure there are a lot of imprecisions in my argument, but I can't see any flaw. However, I have never seen this point of view on Wikipedia or similar, so I don't know if it is wrong or what!

I will really be very thankful for any point of view that you can provide

PS: In this point of view, if the experiment is just letting time go by, then the appearance of randomness in the "experiment" is, I think, usually called decoherence.
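For reference, the Born-rule form that Gleason's theorem forces - probabilities of the form tr(ρP) for projectors P - in a minimal two-outcome sketch (an illustration of the standard statement, not the poster's own argument):

```python
import numpy as np

psi = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # |psi> = sqrt(0.3)|0> + sqrt(0.7)|1>
rho = np.outer(psi, psi)

P0 = np.diag([1.0, 0.0])                       # projector onto outcome 0
P1 = np.diag([0.0, 1.0])                       # projector onto outcome 1

# Gleason: any probability measure on projectors (in dimension >= 3, or
# allowing POVMs for qubits) must take this trace form.
print(np.trace(rho @ P0).real, np.trace(rho @ P1).real)   # 0.3 0.7
```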
 
  • #66


Just to add some points: my idea is that an experiment is an interaction that makes the system leave its actual state (in counterfactual definiteness language, leave its properties) and forces it to go to some state that is random from the point of view of the scientist - not from that of a god, or whatever, that knows the exact state of the instrument (in counterfactual definiteness language, it makes the system choose some properties that it didn't have before).
Another point: there are not many worlds. Just one, chosen by the experiment's "randomness".
So this point of view is not against the deterministic nature of the evolution equation (because it is indeed deterministic). It is not against our intuition that there is only one reality and not many worlds. And it is not against the probabilities of the Born rule, because the idea is that, due to Gleason's theorem, the ignorance of the scientist manifests in the experiments as the emergence of the Born rule probabilities (because, if the scientist sees probabilities - even though their nature depends on ignorance and not on "real randomness" - and if he makes the experiment in a way that its probabilities depend only on the Hilbert state representation, then the probabilities have to be calculated by the Born rule).
Sorry for my imprecisions, hope you'll be able to follow my not so clear thoughts!
 
  • #67


Ken G said:
[...]
The history of physics is a history of great models that helped us understand and gain mastery over our environment, but it is not a history of our great models actually being the same as some "underlying reality." Instead, our great models have been like shadows that fit some projection of reality but are later found to not be the reality. What I don't get is, why do we have to keep pretending that this is not just exactly the whole point of physics?
Thanks for your clearly stated posts Ken. I think I pretty much agree with your answers to the OP's problem in particular, and your approach to how best to think about physical science in general.
 
  • #68


ThomasT said:
Thanks for your clearly stated posts Ken. I think I pretty much agree with your answers to the OP's problem in particular, and your approach to how best to think about physical science in general.

Ken is a wonder all right - his clarity of thought is awe-inspiring and an excellent counterpoint to guys like me who side with Penrose and believe the math is the reality in a very literal sense.

Thanks
Bill
 
  • #69


lugita15 said:
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.

Yes, I have read previously about Zurek's attempts, but I think the final solution will come from a wider theory - a nonlinear one like trace dynamics, or an epistemic ontic model.
 
  • #70


bhobba said:
... guys like me who side with Penrose and believe the math is the reality in a very literal sense.
That view is somewhat puzzling to me. Perhaps you might post in the What's Your Philosophy of Mathematics? thread?
 

1. What is the probabilistic nature of quantum mechanics?

The probabilistic nature of quantum mechanics refers to the fact that at the quantum level, particles do not have definite properties such as position or momentum. Instead, these properties are described by a probability distribution, and the outcome of a measurement is not certain but rather determined by chance.

2. Why is quantum mechanics considered to be probabilistic?

Quantum mechanics is considered to be probabilistic because it is based on the concept of wave-particle duality, which states that particles can behave as both waves and particles. This means that the position and momentum of a particle cannot be precisely determined at the same time, leading to the probabilistic nature of the theory.

3. How does the probabilistic nature of quantum mechanics differ from classical mechanics?

In classical mechanics, the behavior of particles is deterministic, meaning that the outcome of a measurement can be predicted with certainty. However, in quantum mechanics, the behavior of particles is described by a wave function that gives the probability of finding a particle in a particular state. This fundamental difference is what makes quantum mechanics probabilistic.

4. What is the role of uncertainty in the probabilistic nature of quantum mechanics?

Uncertainty is a fundamental principle of quantum mechanics and is related to the probabilistic nature of the theory. The Heisenberg uncertainty principle states that it is impossible to know both the position and momentum of a particle with absolute certainty. This uncertainty is a result of the wave-like nature of particles at the quantum level.

5. How does the probabilistic nature of quantum mechanics impact our understanding of reality?

The probabilistic nature of quantum mechanics challenges our traditional understanding of reality, as it suggests that at the quantum level, particles do not have well-defined properties until they are measured. Together with entanglement - what Einstein famously called "spooky action at a distance" - this has led to many philosophical debates about the nature of reality and the role of observation in shaping it.
