Question on the probabilistic nature of QM

  • Thread starter: Wormaldson
  • Tags: Nature, QM
Summary:
The discussion centers on the concept of genuine randomness in quantum mechanics (QM), particularly in relation to phenomena like the double-slit experiment. Participants express difficulty reconciling the idea of true randomness with classical determinism, questioning whether QM truly exhibits randomness without identifiable causes. They reference hidden variable theories and Bell's theorem, which suggest that many deterministic interpretations cannot replicate QM's predictions. The conversation also critiques the labeling of theories as fundamentally random or deterministic, emphasizing that scientific models are tools for understanding rather than definitive truths about the universe. Ultimately, the dialogue highlights the ongoing debate about the nature of reality as interpreted through quantum mechanics.
  • #31


bhobba said:
The number of examples that can be proven non-deterministic is zero.
Fine, then describe a hypothetical example that cannot be proven deterministic.
 
  • #32


zonde said:
Fine, then describe a hypothetical example that cannot be proven deterministic.

A sequence of random numbers created by a hardware random number generator based on random noise or the photoelectric effect, which are quantum in origin. Describe to me the test that will prove, 100% for sure, that it was not created by a deterministic process that passes such tests. Although deterministic pseudo-random number generators that pass all known randomness tests are not trivial to come up with, they do exist - or so I have been told. Describe to me how you would tell the difference between the two. Exactly what test would you use?
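(For illustration only: a minimal Python sketch of the kind of comparison being described here. The choice of generator, the sample size, and the single monobit frequency test are my own illustrative assumptions, not a serious randomness suite; the point is simply that a deterministic Mersenne Twister stream and an OS entropy stream both pass the same test.)

```python
import math
import os
import random

def bits_from_bytes(data):
    """Unpack a bytes object into a flat list of 0/1 bits."""
    return [(byte >> i) & 1 for byte in data for i in range(8)]

def monobit_pvalue(bits):
    """NIST-style monobit frequency test: p-value for the hypothesis
    that ones and zeros occur equally often."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * n))

n_bytes = 125_000  # one million bits per source

# Deterministic source: Python's Mersenne Twister with a fixed seed.
mt = random.Random(12345)
mt_bytes = bytes(mt.getrandbits(8) for _ in range(n_bytes))

# "Physical" source: the operating system's entropy pool.
os_bytes = os.urandom(n_bytes)

for name, data in [("Mersenne Twister", mt_bytes), ("os.urandom", os_bytes)]:
    p = monobit_pvalue(bits_from_bytes(data))
    print(f"{name}: p = {p:.3f} ({'passes' if p > 0.01 else 'fails'})")
```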

I believe QM processes are fundamentally random and think those that want to resort to Bohmian Mechanics or whatever to regain determinism are whistling in the dark - but as a matter of principle I can't prove them wrong. Indeed another answer to your question is how would you tell the difference between standard QM and Bohmian Mechanics?

Thanks
Bill
 
  • #33


bhobba said:
... I truly and utterly believe QM is fundamentally random ...
Not sure what you mean by this. Are you saying that you believe that QM is some sort of probability theory? If so, I agree.

bhobba said:
... without any underlying deterministic process giving the appearance of randomness ...
Imo it wouldn't be best phrased as an underlying deterministic process giving the appearance of randomness, but rather our inability to track underlying processes. But the fact that many macroscopic processes on many scales are trackable and in accordance with deterministic laws would seem to indicate that the underlying processes are also deterministic (ie., lawful). Unless there's some reason to believe that the reality underlying instrumental behavior is essentially different from the macroscopic reality of our senses, and that our ignorance thereof is not just a matter of the limitations of our sensory capabilities.
 
  • #34


ThomasT said:
Not sure what you mean by this. Are you saying that you believe that QM is some sort of probability theory? If so, I agree.

I mean it's fundamentally a probabilistic theory and not some kind of deterministic process masquerading as such, as say Bohmian Mechanics does. I firmly believe it is a fundamental probabilistic theory - I reject completely Einstein's idea it was incomplete - I have zero problem with God playing dice (although that of course was not Einstein's main objection - he was more concerned with an objective reality independent of observation - but as always his reasoning was subtle). I am simply pointing out that, as a matter of principle, QM may be the limit or approximation or whatever of some deterministic process and there is no way it can be ruled out. I find it slightly puzzling why anyone would doubt it.

Thanks
Bill
 
  • #35


bhobba said:
I firmly believe it is a fundamental probabilistic theory...
I learned (what I remember of) QM in the probability interpretation.

bhobba said:
... I reject completely Einstein's idea it was incomplete ...
I think Einstein was correct. QM is an incomplete theory of physical reality.

bhobba said:
I am simply pointing out as a matter of principle QM may be the limit or approximation or whatever of some deterministic process ...
I agree. In which case QM is an incomplete theory (in a certain sense), and, in any case, there's not a whole lot that anybody can say about the reality underlying instrumental behavior.
 
  • #36


bhobba said:
I mean it's fundamentally a probabilistic theory and not some kind of deterministic process masquerading as such, as say Bohmian Mechanics does. I firmly believe it is a fundamental probabilistic theory - I reject completely Einstein's idea it was incomplete - I have zero problem with God playing dice (although that of course was not Einstein's main objection - he was more concerned with an objective reality independent of observation - but as always his reasoning was subtle). I am simply pointing out that, as a matter of principle, QM may be the limit or approximation or whatever of some deterministic process and there is no way it can be ruled out. I find it slightly puzzling why anyone would doubt it.

Thanks
Bill

It is only your last statement I am commenting on, as the rest I agree with pretty well.

You might acknowledge that after 80+ years, there has not been the slightest bit of evidence - nor any plausible hypothesis other than perhaps Bohmian class theories - that any underlying deterministic mechanism exists in nature. In that light, I wouldn't find it surprising to doubt it exists. I doubt it, for instance.

So yes, certainly it is possible, no issue there. On the other hand, newer ideas such as the PBR theorem cast significant doubt that there can be a deterministic solution. If the quantum state is fundamental, then there is no determining factor to uncover.
 
  • #37


bhobba said:
Describe to me the test that will prove, 100% for sure, that it was not created by a deterministic process that passes such tests.
I would say you misunderstood what I asked.
In simple words: you provide a hypothetical example that demonstrates non-deterministic randomness, and I try to provide a test that should demonstrate that it is deterministic (according to our view of physical laws).

And please take into account that this example should supposedly work as an explanation for the genuine randomness of QM, i.e., I ask this question in the context of the OP:
Wormaldson said:
Problem is, I can't think of any classical situations in which this notion of genuine randomness actually applies.
 
  • #38


Wormaldson said:
So, finally, the question(s): a good place to start would certainly be, am I just interpreting the information wrong? Do we know for sure that quantum mechanics obeys this genuine-randomness-dependent behaviour? If not, then what do we suppose determines the behaviour of quantum mechanical phenomena? If so, then how is it that the behaviour is determined without a cause?

As always, any insight would be much appreciated. This has me quite puzzled.

Quantum mechanics--at least the most common type students learn--is statistically determinate, not random.

Do we have deterministic objects following statistical laws or statistical objects following deterministic laws? Does atomic structure quantize energy, or does energy quantize atomic structure? It doesn't matter which is which; any event would involve both aspects, so there is no difference.
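(To make "statistically determinate" concrete, here is the standard textbook pairing: the state, and therefore the outcome distribution, evolves deterministically, while individual outcomes are random.)

```latex
% Deterministic evolution of the state (Schroedinger equation):
i\hbar\,\frac{\partial}{\partial t}\,|\psi(t)\rangle = \hat{H}\,|\psi(t)\rangle
\qquad\Longrightarrow\qquad
|\psi(t)\rangle = e^{-i\hat{H}t/\hbar}\,|\psi(0)\rangle

% Individual outcomes are random, but their distribution is fully determined (Born rule):
P(a_k) = \bigl|\langle a_k|\psi(t)\rangle\bigr|^{2}
```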
Wormaldson said:
Problem is, I can't think of any classical situations in which this notion of genuine randomness actually applies.

Randomness is relative. In "classical situations", it's usually so small or uniform that we don't care about it, even though it's there. If the system is sensitive enough, however, we would notice. Of course, it doesn't have to be random in a uniform way, in which case again we might find patterns that appear "deterministic", along with some degree of accompanying "randomness".
 
  • #39


DrChinese said:
It is only your last statement I am commenting on, as the rest I agree with pretty well.

You might acknowledge that after 80+ years, there has not been the slightest bit of evidence - nor any plausible hypothesis other than perhaps Bohmian class theories - that any underlying deterministic mechanism exists in nature. In that light, I wouldn't find it surprising to doubt it exists. I doubt it, for instance.

So yes, certainly it is possible, no issue there. On the other hand, newer ideas such as the PBR theorem cast significant doubt that there can be a deterministic solution. If the quantum state is fundamental, then there is no determining factor to uncover.

I acknowledge and agree with everything you say. I am speaking of a matter of principle - not what I believe. IMHO standard QM is correct - BM etc and other outs are a crock.

Thanks
Bill
 
  • #40


zonde said:
I would say you misunderstood what I asked.
In simple words: you provide a hypothetical example that demonstrates non-deterministic randomness, and I try to provide a test that should demonstrate that it is deterministic (according to our view of physical laws).

And please take into account that this example should supposedly work as an explanation for the genuine randomness of QM, i.e., I ask this question in the context of the OP:

Please be 100% clear about what I am saying. I will repeat it again. I am saying there is no way, by any test currently available, you can tell a random sequence from one created by a well-designed deterministic algorithm. If you want specifics, let's say it was created by the Mersenne Twister algorithm. I give you such a sequence and you are required to tell me how you would determine if it is genuinely random or made by the Twister.

The last part of your requirement - namely, that it 'should supposedly work as an explanation for the genuine randomness of QM' - is trivial, because you can simply postulate that some unknown process at the sub-quantum level mimics that algorithm. Is such a thing likely? Hell no - it would be a totally silly and laughable hypothesis - but again, this is a matter of principle, not of reasonableness. Reason, Occam's Razor, all sorts of stuff tells me QM is genuinely random.

If you think the above is outlandish and physically unreasonable, you are correct. If that is your concern about what I am saying, then let's pin it down to something more physically reasonable. I give you the results of a double-slit experiment - namely, the positions of the detected particles. Tell me how you would tell the difference between those positions being genuinely random and their being what is predicted by BM, which is deterministic but where the randomness is the result of factors not under the control of the experimenter, factors that are presumably, in principle, knowable.
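(For reference, a sketch of the Bohmian alternative being contrasted here, in the standard guidance-equation form; stated from memory, so treat it as a sketch rather than a derivation: the dynamics is fully deterministic, and the double-slit statistics match standard QM provided the initial positions follow the Born-rule distribution.)

```latex
% Guidance equation: the particle position Q(t) moves deterministically
\frac{dQ}{dt} \;=\; \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi(Q,t)}{\psi(Q,t)}\right)

% Quantum equilibrium: if the initial positions are distributed as
\rho(q,0) = |\psi(q,0)|^{2}
% then \rho(q,t) = |\psi(q,t)|^{2} at all later times, so the detected
% positions reproduce the standard QM predictions exactly.
```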

Thanks
Bill
 
  • #41


Quantum randomness is tied strongly to wavefunction collapse. It is in the same can of worms. "No-collapse" interpretations such as MWI or BM are automatically deterministic. The "appearance of collapse FAPP" naturally translates into the "appearance of randomness FAPP" (where the "apparent randomness FAPP" is indistinguishable from "genuine randomness" by any experimental test, a notion I can comfortably live with). True 'genuine randomness' is equivalent to objective collapse. "Consciousness causes collapse" is translated into "consciousness is the source of randomness", etc. So by making a statement about the nature of randomness, one implicitly adopts or rejects a particular interpretation. Choose your poison.

Personally, I don't see what the fuss is about. We know that quantum randomness only appears during the measurement process. We also know that this process necessarily involves the interaction of the one microscopic system being measured with a huge number of interacting microscopic systems making up the measuring apparatus and its environment. It is only natural to expect that the initial state of the apparatus and/or the environment influences the outcome. Since we do not know the initial state (and cannot possibly know it all even if we tried, due to the no-cloning theorem), it should be no surprise that the outcome appears random.
 
  • #42


bhobba said:
Please be 100% clear about what I am saying. I will repeat it again. I am saying there is no way, by any test currently available, you can tell a random sequence from one created by a well-designed deterministic algorithm. If you want specifics, let's say it was created by the Mersenne Twister algorithm. I give you such a sequence and you are required to tell me how you would determine if it is genuinely random or made by the Twister.
Well, that's trivial - take the algorithm, use the same seed, and you get the same result.
This, of course, is not genuine randomness, as we can clearly identify the cause: it's the seed. And with the same seed (the same cause) the algorithm is always going to give the same result, i.e., no randomness.
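(zonde's point in a few illustrative lines of Python, assuming the standard library's Mersenne Twister: same algorithm plus same seed gives the same "random" sequence every time.)

```python
import random

a = random.Random(42)  # Mersenne Twister, seed 42
b = random.Random(42)  # same algorithm, same seed

print([a.randint(0, 9) for _ in range(10)])
print([b.randint(0, 9) for _ in range(10)])
# Both lines are identical: same seed (same "cause"), same output.
```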
 
  • #43


Delta Kilo said:
It is only natural to expect that the initial state of the apparatus and/or the environment influences the outcome. Since we do not know the initial state (and cannot possibly know it all even if we tried, due to the no-cloning theorem), it should be no surprise that the outcome appears random.

Indeed, that's very natural, and I guess nobody has a problem with this. However, the resulting randomness breaks the linearity of the evolution. There is no way a linear evolution can create outcomes that depend on the magnitude of components.

So the problem is quite a bit deeper than just identifying a source of randomness. You have to explain the nonlinearity of the observation and the exact distribution of the random outcomes.
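(Spelled out: unitary evolution is linear in the state, so every branch is carried along in the same way regardless of its amplitude, yet the observed frequencies depend on those amplitudes quadratically through the Born rule.)

```latex
% Linearity: amplitudes are just carried along by the evolution
\hat{U}\bigl(a\,|\psi_1\rangle + b\,|\psi_2\rangle\bigr)
  = a\,\hat{U}|\psi_1\rangle + b\,\hat{U}|\psi_2\rangle

% ...but the outcome statistics depend nonlinearly on the amplitudes
P(1) = |a|^{2}, \qquad P(2) = |b|^{2}
```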
 
  • #44


zonde said:
The scientific method (testing in particular) is based on the concept of causation. As a result, anything that can't be interpreted from the perspective of causation is non-scientific.
This is such an interesting and important issue that it probably calls for its own thread, but I'll just answer briefly that it is highly debatable that the concept of cause is at all important in physics. I would go so far as to argue that the concept of a cause is not even definable in physics; the definition appears more at the level of human interaction with our environment, which is well separated from the laws themselves.

One simple reason for this is the tendency for the laws of physics to be time-reversible. One key ramification of this is that "what causes what" is very much a kind of sociological construct, one that has a lot more to do with what we use science for than it has to do with the laws of physics. So I would agree that "causation is important in science", but that's because human interaction with, and involvement in, our environment is indeed important in science. Science is a human endeavor. But the laws can still be expressed in language that is completely devoid of "causes", and the laws are still the same laws-- it is just a popular way of interpreting the laws because it gibes well with what we use science to do.
 
  • #45


ThomasT said:
But the fact that many macroscopic processes on many scales are trackable and in accordance with deterministic laws would seem to indicate that the underlying processes are also deterministic (ie., lawful). Unless there's some reason to believe that the reality underlying instrumental behavior is essentially different from the macroscopic reality of our senses, and that our ignorance thereof is not just a matter of the limitations of our sensory capabilities.
But there is a very good reason to believe that-- it is almost inevitably true! Why on Earth would our senses, which are presumably derived from a huge amalgamation of microscopic processes that we are trying to understand, not be essentially different from those processes? Are not the actions of an ant colony essentially different from what an individual ant is doing? Is not what a violinist is doing essentially different from what the particles in a violin are doing? I disagree with the implication that the default assumption is that our way of thinking about and interacting with reality should be the same as what reality is "actually doing"; it seems clear to me that the default assumption should be that we are filtering reality to get it to serve our needs, needs that are extremely dependent on what humans are and what we want to do.

So when our filters give us results that allow deterministic interpretations of macro phenomena, we should always expect that to be emergent behavior, just as we expect the way a fluid flows through a nozzle to be emergent from what the atoms are actually doing, and what atoms are actually doing to be emergent from what quarks and fields are doing, and so on ad infinitum (and I say this without necessarily committing to the idea that the universe is built entirely bottom-up). We don't get to know what it is "emergent" from, because even that could also be emergent. We just have to recast what it is we are trying to know about reality.

The key point is that we have no difficulty interpreting seemingly deterministic behavior as emergent from random behavior; that's pretty much the field of statistical mechanics. Also, we have no difficulty interpreting seemingly random behavior as emergent from deterministic behavior; that is what Delta Kilo described so succinctly. These are all just interpretations, but we can't "reason by interpretation." Reality is just not going to give up these secrets; all we can do is make good models and interpret them however it works for us. Sometimes that leads to a consensus interpretation, sometimes it doesn't, but reality is not beholden to our interpretations, any more than you are limited to being what your dog thinks you are.
 
  • #46


Ken G said:
But there is a very good reason to believe that-- it is almost inevitably true! Why on Earth would our senses, which are presumably derived from a huge amalgamation of microscopic processes that we are trying to understand, not be essentially different from those processes? Are not the actions of an ant colony essentially different from what an individual ant is doing? Is not what a violinist is doing essentially different from what the particles in a violin are doing?
It depends on what one is referring to by "essentially". In the context of this thread, I'm supposing that "essentially different" refers to lawful vs nonlawful (ie., deterministic vs nondeterministic) processes or evolutions. Ants, ant colonies, violins, violinists, orchestras, and everything else I can think of, all seem to evolve deterministically.

Beyond that, quantum experimental phenomena, and the theories and models associated with them, seem to me to indicate that the underlying physical world is composed of a vast hierarchy of particulate media. Since I can characterize the macroscopic world of my sensory experience in that way also, and since our sensory machinery is, afaik, vibratory ( that is, we detect frequencies wrt various media), and since there are so many examples of strikingly similar phenomena on so many different scales, then it seems logical to me to suppose that any and all behavior at any and all scales has a common ancestor or fundamental dynamical law(s) governing everything.

Ken G said:
... when our filters give us results that allow deterministic interpretations of macro phenomena, we should always expect that to be emergent behavior ...
I agree, and the notion of encompassing fundamental laws (ie., a fundamentally deterministic universe) is compatible with emergence.

Ken G said:
... I say this without necessarily committing to the idea that the universe is built entirely bottom-up ...
If you mean from small to large, then I agree. But the bottom, ie., the most fundamental, might also refer to behavioral principles or dynamical laws.

Ken G said:
The key point is that we have no difficulty interpreting seemingly deterministic behavior as emergent from random behavior; that's pretty much the field of statistical mechanics. Also, we have no difficulty interpreting seemingly random behavior as emergent from deterministic behavior; that is what Delta Kilo described so succinctly.
Yes, that seems to be the case.

Ken G said:
These are all just interpretations, but we can't "reason by interpretation." Reality is just not going to give up these secrets, all we can do is make good models and interpret them however it works for us.
So, aren't we reasoning, regarding the nature of reality, via interpretation?
 
  • #47


Ken G said:
This is sucn an interesting and important issue that it probably calls for its own thread, but I'll just answer briefly that it is highly debatable that the concept of cause is at all important in physics. I would go so far as to argue that the concept of a cause is not even definable in physics, the definition appears more at the level of human interaction with our environment, which is well separated from the laws themselves.
Basic concepts are not definable. Are you familiar with axiomatic systems, and with what undefined terms are in them?

Ken G said:
One simple reason for this is the tendency for the laws of physics to be time-reversible.
This is because we use math a lot in the formulation of laws. Math works when quantities are conserved. When quantities are not conserved, we combine different quantities so that the combination is conserved. This is the bias introduced by the extensive use of math.

Ken G said:
One key ramification of this is that "what causes what" is very much a kind of sociological construct, that has a lot more to do with what we use science for that it has to do with the laws of physics. So I would agree that "causation is important in science", but that's because human interaction with, and involvement in, our environment is indeed important in science. Science is a human endeavor.
Any experimental test starts with things that we can do (cause); from this point we can go further. So it's not just important, it's the basis of science.

Ken G said:
But the laws can still be expressed in language that is completely devoid of "causes", and the laws are still the same laws-- it is just a popular way of interpreting the laws because it gibes well with what we use science to do.
Some simple example, please.
 
  • #48


Jazzdude said:
Indeed, that's very natural, and I guess nobody has a problem with this. However, the resulting randomness breaks the linearity of the evolution. There is no way a linear evolution can create outcomes that depend on the magnitude of components.

So the problem is quite a bit deeper than just identifying a source of randomness. You have to explain the nonlinearity of the observation and the exact distribution of the random outcomes.
I would agree that the problem is a bit deeper.
I would say that it's a certain lack of randomness that is puzzling when we speak about interference, rather than excess randomness. And it's similar with entanglement.
 
  • #49


ThomasT said:
It depends on what one is referring to by "essentially". In the context of this thread, I'm supposing that "essentially different" refers to lawful vs nonlawful (ie., deterministic vs nondeterministic) processes or evolutions. Ants, ant colonies, violins, violinists, orchestras, and everything else I can think of, all seem to evolve deterministically.
I see you are not a fan of "systems" thinking, but rather are a strict reductionist? For myself, I see a lot of value in the "systems" viewpoint (that the action of complex systems is better understood as an interplay between top-down coupling constraints and bottom-up independent processes than it is with a purely reductionist approach in which the whole is understood purely by considering the elementary parts). But more to the point, I would certainly not say that what an orchestra is doing is strictly deterministic! It certainly cannot be demonstrated in detail to be deterministic, nor precisely predicted as a deterministic process, so the issue must boil down to whichever one views as the "default" assumption. I think many physicists are way too quick to picture determinism as the default; there really aren't any solid reasons to adopt that stance-- it's simple overinterpretation, in my view.
ThomasT said:
Beyond that, quantum experimental phenomena, and the theories and models associated with them, seem to me to indicate that the underlying physical world is composed of a vast hierarchy of particulate media.
But what do we mean "composed of"? Strictly composed of that? There's no question the particulate model is vastly important and successful, but so is the fields model, so at the very least we might wish to say the physical world is composed of particles and fields. But I wouldn't even say that-- I would just say our models invoke particles and fields, and what the "underlying physical world" is composed of is simply not a concept that physics needs, and we never get to know that, not even using physics.

ThomasT said:
Since I can characterize the macroscopic world of my sensory experience in that way also, and since our sensory machinery is, afaik, vibratory (that is, we detect frequencies wrt various media), and since there are so many examples of strikingly similar phenomena on so many different scales, then it seems logical to me to suppose that any and all behavior at any and all scales has a common ancestor or fundamental dynamical law(s) governing everything.
Yes, the rationalistic view that laws "govern" reality, rather than reality "governs" what we will interpret as laws. That debate has raged as long as there has been thought about our environment; let me just say that it is an extremely unlikely proposition, and it has never stood the test of time, a fact we all too easily overlook.
ThomasT said:
I agree, and the notion of encompassing fundamental laws (ie., a fundamentally deterministic universe) is compatible with emergence.
Not really-- not unless you think that some phenomena emerge and other, more fundamental ones, don't. But if you hold, as I do, that all phenomena are emergent, and that there is never going to be any such thing as a fundamental process (nor does there need to be to do physics exactly as we do it), then the notion of encompassing fundamental laws is not compatible with emergence, because even the laws must emerge from something else (given that no law deals in the currency of something fundamental, but rather only in emergent phenomena). It seems a more natural "default" assumption, being the only one that actually has stood the test of time!

ThomasT said:
If you mean from small to large, then I agree.
I do, the common idea is that large phenomena emerge from small phenomena. But I'm not claiming that to be true, I think emergence can also cascade from large to small (as in the case of a violinist manipulating the instrument in a way that ultimately affects its atoms). But it is no longer important to specify what emerges from what if there is nothing fundamental that is "at the bottom" anyway.
ThomasT said:
So, aren't we reasoning, regarding the nature of reality, via interpretation?
I would argue no-- not if we are being precise about what we are doing. When we get a little casual about expressing what physics does, we often frame it as reasoning about the nature of reality, but Bohr had it right-- physics is what we can say about nature. I believe he meant that this means physics is not about nature herself, it is about our interaction with nature. We can interpret what we are doing around our interaction with nature, because we need to interpret our goals and objectives, but we are not interpreting the "nature of reality"-- as soon as you interpret that, it ain't the nature of reality any more.
 
  • #50


zonde said:
Basic concepts are not definable. Are you familiar with axiomatic systems and what are undefined terms in them?
If you hold that a "cause" is an axiom in physics, please specify a theory, any theory, that requires that in its axiomatic structure. I'm not aware of any, causes are sociological constructs we add on top of our theories to help us interpret them, no laws of physics refer to causes that I've ever heard of. This is clear from the simple fact that you would need to immediately remove from consideration any laws that are time reversible, so gone are Newton's laws, the Schroedinger equation, and general relativity.
zonde said:
Any experimental test starts with things that we can do (cause); from this point we can go further. So it's not just important, it's the basis of science.
No, you don't need to imagine you are causing something to do a scientific experiment. That we often do that is indeed our sociology, but it's not a requirement. If I drop a mass in my experiment, I never need to imagine that I "caused the mass to fall", or that gravity did, I am just setting up an experiment and watching what happens. No causation necessary, indeed causation brings in significant philosophical difficulties (around free will and so on). But I agree that we do invoke causation concepts constantly when we do science, and that's because science is a human endeavor, and humans use causation concepts in our daily lives all the time-- it's part of our sociology.
zonde said:
Some simple example, please.
Give me any phenomenon of your choosing that you feel must be described in terms of causes and effects, and I will offer a perfectly successful way to describe that same phenomenon without invoking those concepts at all.
 
  • #51


Ken G said:
I see you are not fan of "systems" thinking ...
I think "systems" thinking is very appropriate and useful. But I think it reasonable to suppose that systems emerge from more fundamental, underlying, dynamical laws.

Ken G said:
... but rather are a strict reductionist?
Only in the behavioral (ie., wrt dynamical law) sense. Not wrt scales of size.

Ken G said:
For myself, I see a lot of value in the "systems" viewpoint (that the action of complex systems is better understood as an interplay between top-down coupling constraints and bottom-up independent processes than it is with a purely reductionist approach in which the whole is understood purely by considering the elementary parts).
I agree. Just that, since I think it reasonable to assume the existence of a fundamental dynamics (ie., fundamental dynamical laws/constraints) applicable to any behavioral scale, then I also suppose that no viable ontology or epistemology can be independent of the fundamental dynamical laws/constraints.

Ken G said:
But more to the point, I would certainly not say that what an orchestra is doing is strictly deterministic!
There isn't anything that I can think of that can be said to be strictly deterministic on the macroscopic level of our sensory experience, in the sense of being devoid of unpredictable occurrences. But that doesn't contradict the inference of an underlying determinism.

Ken G said:
It certainly cannot be demonstrated in detail to be deterministic, nor precisely predicted as a deterministic process, so the issue must boil down to whichever one views as the "default" assumption.
I think what it boils down to is the preponderance of evidence, which, imo, leads to the assumption of a fundamental determinism (ie., a universe evolving in accordance with fundamental dynamical law(s)).

Ken G said:
I think many physicists are way too quick to picture determinism as the default; there really aren't any solid reasons to adopt that stance-- it's simple overinterpretation, in my view.
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful). If the latter, then how are we to understand the emergence of physical laws at the level of our sensory apprehension?

Ken G said:
But what do we mean "composed of"? Strictly composed of that?
Yes. Media, at any scale, which can be analysed in terms of their particular particulate constituents, but disturbances in which seem to be governed by fundamental dynamical law(s).

Ken G said:
There's no question the particulate model is vastly important and successful, but so is the fields model, so at the very least we might wish to say the physical world is composed of particles and fields.
Fields are just groupings of particles endowed with certain properties. Physical science hasn't yet gotten to explaining things in terms of, or positing, fundamental dynamical law(s).

Ken G said:
... I would just say our models invoke particles and fields, and what the "underlying physical world" is composed of is simply not a concept that physics needs, and we never get to know that, not even using physics.
I think that certain things can be inferred from the extant physics, and that as the field of instrumentation and detection advances, then even more will be able to be inferred about the reality underlying instrumental behavior.

Ken G said:
Yes, the rationalistic view that laws "govern" reality, rather than reality "governs" what we will interpret as laws. That debate has raged as long as there has been thought about our environment; let me just say that it is an extremely unlikely proposition, and it has never stood the test of time, a fact we all too easily overlook.
What's wrong with the view that reality, and the limitations of our sensory capabilities, govern what we will interpret as laws, and that, also, there are laws that govern reality?

Ken G said:
Not really-- not unless you think that some phenomena emerge and other, more fundamental ones, don't. But if you hold, as I do, that all phenomena are emergent, and that there is never going to be any such thing as a fundamental process (nor does there need to be to do physics exactly as we do it), then the notion of encompassing fundamental laws is not compatible with emergence, because even the laws must emerge from something else (given that no law deals in the currency of something fundamental, but rather only in emergent phenomena). It seems a more natural "default" assumption, being the only one that actually has stood the test of time!
This doesn't make any sense to me. I'm not saying that you can fashion a workable physics based on the assumption of the existence of a fundamental dynamic(s), but only that this assumption is compatible with the exercise of scientific inquiry and the preponderance of physical evidence, and that the assumption that our world, our universe, is evolving fundamentally randomly isn't.

Ken G said:
... the common idea is that large phenomena emerge from small phenomena. But I'm not claiming that to be true, I think emergence can also cascade from large to small (as in the case of a violinist manipulating the instrument in a way that ultimately affects its atoms). But it is no longer important to specify what emerges from what if there is nothing fundamental that is "at the bottom" anyway.
I think it reasonable to suppose that there is something fundamental, and that it has nothing to do with size.

Ken G said:
I would argue no-- not if we are being precise about what we are doing. When we get a little casual about expressing what physics does, we often frame it as reasoning about the nature of reality, but Bohr had it right-- physics is what we can say about nature. I believe he meant that this means physics is not about nature herself, it is about our interaction with nature. We can interpret what we are doing around our interaction with nature, because we need to interpret our goals and objectives, but we are not interpreting the "nature of reality"-- as soon as you interpret that, it ain't the nature of reality any more.
Well, I disagree. I think that modern physical science has revealed certain things about the underlying reality, and that future science, assuming advances in technology, will reveal more. And of course, it's all subject to interpretation.
 
  • #52


zonde said:
Yes, cause is part of interpretation.


Let's say I do not believe you that it is possible, namely that a physical phenomenon can be accurately predicted without the concept of causation.

The scientific method (testing in particular) is based on the concept of causation. As a result, anything that can't be interpreted from the perspective of causation is non-scientific.

I agree.
 
  • #53


ThomasT said:
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful). If the latter, then how are we to understand the emergence of physical laws at the level of our sensory apprehension?
I assume you're asking how underlying nondeterministic laws of physics lead to us experiencing a world that seems to conform quite well to deterministic laws. Well, the answer to that is well-known. Decoherence explains how the randomness of quantum mechanics gives rise to the appearance that the macroscopic world conforms to classical physics.
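(A compressed sketch of the mechanism being referred to, in the standard two-state decoherence form: entanglement with the environment suppresses the interference terms of the system's reduced density matrix, leaving what looks like an ordinary classical probability mixture.)

```latex
% System entangles with its environment:
\bigl(a|0\rangle + b|1\rangle\bigr)\,|E\rangle
  \;\longrightarrow\; a\,|0\rangle|E_0\rangle + b\,|1\rangle|E_1\rangle

% Reduced density matrix of the system (environment traced out):
\rho_S = |a|^2\,|0\rangle\langle 0| + |b|^2\,|1\rangle\langle 1|
       + a b^{*}\langle E_1|E_0\rangle\,|0\rangle\langle 1|
       + a^{*} b\,\langle E_0|E_1\rangle\,|1\rangle\langle 0|

% For macroscopically distinct environment states, \langle E_0|E_1\rangle \to 0,
% so the interference terms vanish and \rho_S looks like a classical mixture.
```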
 
  • #54


lugita15 said:
I assume you're asking how underlying nondeterministic laws of physics lead to us experiencing a world that seems to conform quite well to deterministic laws. Well, the answer to that is well-known. Decoherence explains how the randomness of quantum mechanics gives rise to the appearance that the macroscopic world conforms to classical physics.
Decoherence is not enough to explain or justify macroreality, classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence: "What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory."
 
  • #55


yoda jedi said:
Decoherence is not enough to explain or justify macroreality, classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence: "What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory."

Yes, I completely agree. All the different interpretations of QM easily accommodate decoherence, yet their basic differences remain, as does their very different ways of dealing with the measurement problem.
 
  • #56


yoda jedi said:
Decoherence is not enough to explain or justify macroreality, classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf



-------
Joos, a leading adherent of decoherence: "What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory."
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.
 
  • #57


lugita15 said:
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.

Interesting paper.

And indeed there is disagreement about whether decoherence solves the measurement problem. Most people (including me) seem to think it doesn't - what it does, however, is give the appearance of wave-function collapse, so for all practical purposes it resolves the issue - but in a different way than the collapse problem was originally formulated. IMHO it removes the central mystery of the superposition principle - how a system can be partly in one state and partly in another, so that the normal rules of logic are cock-eyed - and replaces it with a simple probability of being in one state or the other - but definitely in some state, not in this weird superposition.

Thanks
Bill
 
  • #58


ThomasT said:
I think "systems" thinking is very appropriate and useful. But I think it reasonable to suppose that systems emerge from more fundamental, underlying, dynamical laws.
But that just isn't systems thinking. Systems thinking is that you can't understand systems adequately if all you use is bottom-up dynamical laws. If they thought you could, they wouldn't need systems thinking. The idea is that you cannot understand the interaction between top-down constraints and bottom-up dynamical laws if all you have is bottom-up dynamical laws, from which it follows that the universe cannot be "run" purely with bottom-up dynamical laws (even if you are inclined to imagine that the universe is "run" by any kind of mathematical structure).
ThomasT said:
Just that, since I think it reasonable to assume the existence of a fundamental dynamics (ie., fundamental dynamical laws/constraints) applicable to any behavioral scale, then I also suppose that no viable ontology or epistemology can be independent of the fundamental dynamical laws/constraints.
The problem is, there is no way to parse that claim from the more simple statement "ontologies used to interpret and apply physics are based on dynamical laws/constraints." This is simply a statement of what defines physics, there is no need whatsoever to graduate it to a claim on the existence of anything. Indeed, the history of physics is quite clear that we do not need things to actually exist in order to use them quite effectively in physics (a glaring example being Newton's force of gravity, which is still used constantly in physics, even though its "existence" is deeply in doubt).
ThomasT said:
There isn't anything that I can think of that can be said to be strictly deterministic on the macroscopic level of our sensory experience, in the sense of being devoid of unpredictable occurrences. But that doesn't contradict the inference of an underlying determinism.
I'm just going to let those words sit for awhile. Could there be a more clear example of pushing a preconception down nature's throat? I see this as a very common attitude in physics, but I would like to call it into question: the idea that we should regard a given attitude as true as long as we can rationalize it. This strikes me as just exactly what Popper complained about in regard to some theories of his day that were regarded as high science at the time, and which Popper felt were basically a fraud.
ThomasT said:
I think what it boils down to is the preponderance of evidence, which, imo, leads to the assumption of a fundamental determinism (ie., a universe evolving in accordance with fundamental dynamical law(s)).
The evidence is that determinism isn't strictly true, but is a useful interpretation for making functionally successful predictions within limits. That is certainly not a preponderance of evidence that determinism is actually true at some unseen yet imagined deeper level. We have a name for that unseen deeper level: fantasy. All the same, it is in the mission statement of physics to look for effective determinism at the functional level we can actually observe, without any requirement to assume there exists some unseen deeper level where it's really true.
ThomasT said:
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful).
But either of those assumptions is both unsubstantiated and unnecessary. You seem to overlook the more basic assumption: assume the universe is neither, it's just the universe. The idea that it has to be one or the other is simply mistaking the map for the territory; it's like saying that because we can use either a road map or a topographical map to navigate our path, we must assume reality is composed fundamentally of either roads or mountains.
ThomasT said:
Fields are just groupings of particles endowed with certain properties.
Yet someone else can say that particles are just groupings of fields endowed with certain properties (and many do say that). There is no falsifiability in these claims, they are essentially personal philosophies. They are fine to use as devices for empowering your own approach to physics, but they are not, nor need to be, claims on what really is. This is actually a very good thing for physics, because physics would be quite impossible if it only worked if we could all agree on issues like whether particles or fields are more "fundamental." (Ask ten particle physicists to describe their own personal view of what a particle actually is, and be prepared to hear ten different answers. I know one who says "particles are a hoax".)
ThomasT said:
What's wrong with the view that reality, and the limitations of our sensory capabilities, govern what we will interpret as laws, and that, also, there are laws that govern reality?
I hear two totally different claims in the first and second part of that sentence, and an implication of an inference between them. The claim in the first part is just demonstrably how we do physics, so I have no issue with that. The claim at the end is kind of tacked on, with no necessary connection to the first part, and that is where the issue lies. There's a difference between using that second part as a philosophy behind one's own approach to the first part, versus claiming that the second part is a scientific inference from the first part. There is actually quite little evidence that the inference follows, and a host of evidence in the history of the trials and tribulations of science that it doesn't. Neither of those facts make the conclusion wrong-- they just don't make it right either. It doesn't follow.
ThomasT said:
I'm not saying that you can fashion a workable physics based on the assumption of the existence of a fundamental dynamic(s), but only that this assumption is compatible with the exercise of scientific inquiry and the preponderance of physical evidence, and that the assumption that our world, our universe, is evolving fundamentally randomly isn't.
I agree that we have no basis to say the universe is evolving fundamentally randomly, but we also have no basis to say it is evolving fundamentally deterministically. We have no basis to say it is "fundamentally" doing anything other that what we observe it to be doing. What is fundamental in physics is very much a moving target and always should be, for that is science. What is "fundamental in reality" is so impossible to define scientifically that I can't see why we even need the phrase.
ThomasT said:
I think it reasonable to suppose that there is something fundamental, and that it has nothing to do with size.
I have no problem with you finding that reasonable. People find all kinds of things reasonable, for all kinds of personal reasons, and that is part of what you own, it is a right of having a brain. My issue is with the claim that this is somehow a logical inference based on evidence, when in fact the evidence is either absent, or to the contrary, as long as one avoids the trap of imagining that whatever is untested will still work. We need a "Murphy's law of science" (if a theory can be wrong, it will) to keep our views consistent with the actual history of this discipline!
ThomasT said:
Well, I disagree. I think that modern physical science has revealed certain things about the underlying reality, and that future science, assuming advances in technology, will reveal more.
What I wonder is, why do you think that your saying that is any different from Ptolemy saying it, or Newton? The history of physics is a history of great models that helped us understand and gain mastery over our environment, but it is not a history of our great models actually being the same as some "underlying reality." Instead, our great models have been like shadows that fit some projection of reality but are later found not to be the reality. What I don't get is, why do we have to keep pretending that this is not just exactly the whole point of physics?
 
  • #59


Ken G said:
If you hold that a "cause" is an axiom in physics, please specify a theory, any theory, that requires that in its axiomatic structure. I'm not aware of any, causes are sociological constructs we add on top of our theories to help us interpret them, no laws of physics refer to causes that I've ever heard of. This is clear from the simple fact that you would need to immediately remove from consideration any laws that are time reversible, so gone are Newton's laws, the Schroedinger equation, and general relativity.
No, I hold that "cause" is undefined term (or primitive notion) in science.
And it is used in formulation of prediction: "<this> causes <that>".

Ken G said:
No, you don't need to imagine you are causing something to do a scientific experiment. That we often do that is indeed our sociology, but it's not a requirement. If I drop a mass in my experiment, I never need to imagine that I "caused the mass to fall", or that gravity did, I am just setting up an experiment and watching what happens. No causation necessary, indeed causation brings in significant philosophical difficulties (around free will and so on). But I agree that we do invoke causation concepts constantly when we do science, and that's because science is a human endeavor, and humans use causation concepts in our daily lives all the time-- it's part of our sociology.
We imagine that we are free (our ideas are the main cause of the particular design of the experimental setup) to set up an experiment as we want.

Ken G said:
Give me any phenomenon of your choosing that you feel must be described in terms of causes and effects, and I will offer a perfectly successful way to describe that same phenomenon without invoking those concepts at all.
x = vt, or "the velocity of the body causes a linear change in the position of the body".
 
  • #60


It is true that decoherence doesn't solve the measurement problem in the sense that there's more work to do -- much in the same way that one can't claim the kinetic theory of gases explains the ideal gas law until you figure out how to actually quantify how pressure is an emergent property of particle interactions. But most of the objections I've seen aren't on the grounds that there's more work to do, but that it's fundamentally missing the point, and this is where I have to disagree. The emergence of 'classical' probability distributions on relative states from unitary evolution suggests that 'absolute' definiteness is not a meaningful idea, in much the same way that Einstein's train thought experiment suggests that absolute simultaneity is not a meaningful idea.

In my estimation, the dissatisfaction with the decoherence solution to the measurement problem looks very much like a reluctance to give up the notion of absolute definiteness.

Instead, what we have is relative definiteness. Conditioned on the hypothesis that I toss a baseball upwards with a velocity v, the probability that it reaches a height of roughly v^2 / (2g) is (nearly) 1.
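(Plugging in round numbers as a quick check of the formula above: for v = 10 m/s and g ≈ 9.8 m/s²,)

```latex
h \approx \frac{v^{2}}{2g} = \frac{(10\ \mathrm{m/s})^{2}}{2 \times 9.8\ \mathrm{m/s^{2}}} \approx 5.1\ \mathrm{m}
```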

This fact does not require the belief that when 'God' looks at the universe, he sees that I have definitely thrown the baseball upwards with velocity v as opposed to some mixture or superposition or ensemble or whatever of various different possibilities.

Nor to derive this fact am I required to use a mathematical model that includes me definitely tossing a baseball upwards with velocity v as opposed to, e.g., using a state smeared out across configuration space.

But the assumption of absolute definiteness would insist on both things. And the habit of assuming absolute definiteness can be difficult to break -- one becomes so accustomed to phrasing questions absolutely that it becomes difficult to weaken it to a relative question. And to be fair, prior to QM there wasn't much incentive to do so.
 
