Question on the probabilistic nature of QM

  • Thread starter: Wormaldson
  • Tags: Nature, QM
  • #51


Ken G said:
I see you are not a fan of "systems" thinking ...
I think "systems" thinking is very appropriate and useful. But I think it reasonable to suppose that systems emerge from more fundamental, underlying, dynamical laws.

Ken G said:
... but rather are a strict reductionist?
Only in the behavioral (ie., wrt dynamical law) sense. Not wrt scales of size.

Ken G said:
For myself, I see a lot of value in the "systems" viewpoint (that the action of complex systems is best understood as an interplay between top-down coupling constraints and bottom-up independent processes, than it is with a purely reductionist approach that the whole is understood purely by considering the elementary parts).
I agree. Just that, since I think it reasonable to assume the existence of a fundamental dynamics (ie., fundamental dynamical laws/constraints) applicable to any behavioral scale, I also suppose that no viable ontology or epistemology can be independent of the fundamental dynamical laws/constraints.

Ken G said:
But more to the point, I would certainly not say that what an orchestra is doing is strictly deterministic!
There isn't anything that I can think of that can be said to be strictly deterministic on the macroscopic level of our sensory experience, in the sense of being devoid of unpredictable occurrences. But that doesn't contradict the inference of an underlying determinism.

Ken G said:
It certainly cannot be demonstrated in detail to be deterministic, nor precisely predicted as a deterministic process, so the issue must boil down to whichever one views as the "default" assumption.
I think what it boils down to is the preponderance of evidence, which, imo, leads to the assumption of a fundamental determinism (ie., a universe evolving in accordance with fundamental dynamical law(s)).

Ken G said:
I think many physicists are way too quick to picture determinism as the default, there really aren't any solid reasons to adopt that stance-- it's simple overinterpretation, in my view.
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful). If the latter, then how are we to understand the emergence of physical laws at the level of our sensory apprehension?

Ken G said:
But what do we mean "composed of"? Strictly composed of that?
Yes. Media, at any scale, which can be analysed in terms of their particular particulate constituents, but disturbances in which seem to be governed by fundamental dynamical law(s).

Ken G said:
There's no question the particulate model is vastly important and successful, but so is the fields model, so at the very least we might wish to say the physical world is composed of particles and fields.
Fields are just groupings of particles endowed with certain properties. Physical science hasn't yet gotten to explaining things in terms of, or positing, fundamental dynamical law(s).

Ken G said:
... I would just say our models invoke particles and fields, and what the "underlying physical world" is composed of is simply not a concept that physics needs, and we never get to know that, not even using physics.
I think that certain things can be inferred from the extant physics, and that as the field of instrumentation and detection advances, even more will be able to be inferred about the reality underlying instrumental behavior.

Ken G said:
Yes, the rationalistic view that laws "govern" reality, rather than reality "governs" what we will interpret as laws. That debate has raged as long as there has been thought about our environment; let me just say that it is an extremely unlikely proposition, and it has never stood the test of time, a fact we all too easily overlook.
What's wrong with the view that reality, and the limitations of our sensory capabilities, govern what we will interpret as laws, and that, also, there are laws that govern reality?

Ken G said:
Not really-- not unless you think that some phenomena emerge and other, more fundamental ones, don't. But if you hold, as I do, that all phenomena are emergent, and that there is never going to be any such thing as a fundamental process (nor does there need to be to do physics exactly as we do it), then the notion of encompassing fundamental laws is not compatible with emergence, because even the laws must emerge from something else (given that no law deals in the currency of something fundamental, but rather only in emergent phenomena). It seems a more natural "default" assumption, being the only one that actually has stood the test of time!
This doesn't make any sense to me. I'm not saying that you can fashion a workable physics based on the assumption of the existence of a fundamental dynamic(s), but only that this assumption is compatible with the exercise of scientific inquiry and the preponderance of physical evidence, and that the assumption that our world, our universe, is evolving fundamentally randomly isn't.

Ken G said:
... the common idea is that large phenomena emerge from small phenomena. But I'm not claiming that to be true, I think emergence can also cascade from large to small (as in the case of a violinist manipulating the instrument in a way that ultimately affects its atoms). But it is no longer important to specify what emerges from what if there is nothing fundamental that is "at the bottom" anyway.
I think it reasonable to suppose that there is something fundamental, and that it has nothing to do with size.

Ken G said:
I would argue no-- not if we are being precise about what we are doing. When we get a little casual about expressing what physics does, we often frame it as reasoning about the nature of reality, but Bohr had it right-- physics is what we can say about nature. I believe he meant that this means physics is not about nature herself, it is about our interaction with nature. We can interpret what we are doing around our interaction with nature, because we need to interpret our goals and objectives, but we are not interpreting the "nature of reality"-- as soon as you interpret that, it ain't the nature of reality any more.
Well, I disagree. I think that modern physical science has revealed certain things about the underlying reality, and that future science, assuming advances in technology, will reveal more. And of course, it's all subject to interpretation.
 
  • #52


zonde said:
Yes, cause is part of interpretation.


Let's say I do not believe you that it is possible, namely that a physical phenomenon can be accurately predicted without the concept of causation.

The scientific method (testing in particular) is based on the concept of causation. As a result, anything that can't be interpreted from the perspective of causation is non-scientific.

I agree.
 
  • #53


ThomasT said:
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful). If the latter, then how are we to understand the emergence of physical laws at the level of our sensory apprehension?
I assume you're asking how underlying nondeterministic laws of physics lead to us experiencing a world that seems to conform quite well to deterministic laws. Well, the answer to that is well-known. Decoherence explains how the randomness of quantum mechanics gives rise to the appearance that the macroscopic world conforms to classical physics.
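(As a rough illustration of the mechanism lugita15 is invoking -- not a claim that it settles the debate in the posts below -- here is a toy sketch, assuming Python/NumPy, of a qubit prepared in (|0> + |1>)/√2 that becomes entangled with N environment spins. The diagonal entries of its reduced density matrix, the "classical" probabilities, stay at 1/2, while the off-diagonal coherence is suppressed as N grows.)

```python
import numpy as np

# Toy decoherence sketch: for the |0> branch each environment spin stays |0>;
# for the |1> branch it is rotated by a small random angle.  The qubit's
# off-diagonal coherence is the product of overlaps <e0|e1>, which shrinks
# as more environment spins become entangled with it.
rng = np.random.default_rng(1)

def reduced_qubit_state(num_env_spins):
    angles = rng.uniform(0.0, 1.0, size=num_env_spins)   # random weak couplings
    overlap = np.prod(np.cos(angles / 2.0))              # <E0|E1>
    return 0.5 * np.array([[1.0, overlap],
                           [overlap, 1.0]])              # reduced density matrix

for n in (0, 10, 100, 1000):
    rho = reduced_qubit_state(n)
    print(n, rho[0, 0], abs(rho[0, 1]))   # diagonal stays 0.5, coherence -> 0
```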
 
  • #54


lugita15 said:
I assume you're asking how underlying nondeterministic laws of physics lead to us experiencing a world that seems to conform quite well to deterministic laws. Well, the answer to that is well-known. Decoherence explains how the randomness of quantum mechanics gives rise to the appearance that the macroscopic world conforms to classical physics.
Decoherence is not enough to explain or justify macroreality, i.e., classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence:
"What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory"
 
  • #55


yoda jedi said:
Decoherence is not enough to explain or justify macroreality, i.e., classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence:
"What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory"

Yes, I completely agree. All the different interpretations of QM easily accommodate decoherence, yet their basic differences remain, as do their very different ways of dealing with the measurement problem.
 
  • #56


yoda jedi said:
Decoherence is not enough to explain or justify macroreality, i.e., classicality.

http://arxiv.org/pdf/quant-ph/0112095v3.pdf
-------
Joos, a leading adherent of decoherence:
"What decoherence tells us, is that certain objects appear classical when they are observed. But what is an observation? At some stage, we still have to apply the usual probability rules of quantum theory"
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.
 
  • #57


lugita15 said:
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.

Interesting paper.

And indeed there is disagreement on whether decoherence solves the measurement problem. Most people (including me) seem to think it doesn't - what it does, however, is give the appearance of wave function collapse, so for all practical purposes it resolves the issue - but in a different way than the collapse problem was formulated. IMHO it removes the central mystery of the superposition principle, in how a system can be partly in one state and partly in another so the normal rules of logic are cock-eyed, and replaces it with a simple probability of being in one state or the other - but definitely in some state - not in this weird superposition.

Thanks
Bill
 
  • #58


ThomasT said:
I think "systems" thinking is very appropriate and useful. But I think it reasonable to suppose that systems emerge from more fundamental, underlying, dynamical laws.
But that just isn't systems thinking. Systems thinking is that you can't understand systems adequately if all you use is bottom-up dynamical laws. If they thought you could, they wouldn't need systems thinking. The idea is that you cannot understand the interaction between top-down constraints and bottom-up dynamical laws if all you have is bottom-up dynamical laws, from which it follows that the universe cannot be "run" purely with bottom-up dynamical laws (even if you are inclined to imagine that the universe is "run" by any kind of mathematical structure).
Just that, since I think it reasonable to assume the existence of a fundamental dynamics (ie., fundamental dynamical laws/constraints) applicable to any behavioral scale, I also suppose that no viable ontology or epistemology can be independent of the fundamental dynamical laws/constraints.
The problem is, there is no way to parse that claim from the more simple statement "ontologies used to interpret and apply physics are based on dynamical laws/constraints." This is simply a statement of what defines physics, there is no need whatsoever to graduate it to a claim on the existence of anything. Indeed, the history of physics is quite clear that we do not need things to actually exist in order to use them quite effectively in physics (a glaring example being Newton's force of gravity, which is still used constantly in physics, even though its "existence" is deeply in doubt).
There isn't anything that I can think of that can be said to be strictly deterministic on the macroscopic level of our sensory experience, in the sense of being devoid of unpredictable occurrences. But that doesn't contradict the inference of an underlying determinism.
I'm just going to let those words sit for awhile. Could there be a more clear example of pushing a preconception down nature's throat? I see this as a very common attitude in physics, but I would like to call it into question: the idea that we should regard a given attitude as true as long as we can rationalize it. This strikes me as just exactly what Popper complained about in regard to some theories of his day that were regarded as high science at the time, and which Popper felt were basically a fraud.
I think what it boils down to is the preponderance of evidence, which, imo, leads to the assumption of a fundamental determinism (ie., a universe evolving in accordance with fundamental dynamical law(s)).
The evidence is that determinism isn't strictly true, but is a useful interpretation for making functionally successful predictions within limits. That is certainly not a preponderance of evidence that determinism is actually true at some unseen yet imagined deeper level. We have a name for that unseen deeper level: fantasy. All the same, it is in the mission statement of physics to look for effective determinism at the functional level we can actually observe, without any requirement to assume there exists some unseen deeper level where it's really true.
There are only two alternatives, afaik. Either one chooses to assume that the universe is fundamentally deterministic (ie., lawful), or one chooses to assume that the universe is fundamentally indeterministic or nondeterministic (ie., nonlawful).
But either of those assumptions is both unsubstantiated and unnecessary. You seem to overlook the more basic assumption: assume the universe is neither, it's just the universe. The idea that it has to be one or the other is simply mistaking the map for the territory; it's like saying we can either use a road map or a topographical map to navigate our path, so we must assume reality is fundamentally composed of either roads or mountains.
Fields are just groupings of particles endowed with certain properties.
Yet someone else can say that particles are just groupings of fields endowed with certain properties (and many do say that). There is no falsifiability in these claims, they are essentially personal philosophies. They are fine to use as devices for empowering your own approach to physics, but they are not, nor need to be, claims on what really is. This is actually a very good thing for physics, because physics would be quite impossible if it only worked if we could all agree on issues like whether particles or fields are more "fundamental." (Ask ten particle physicists to describe their own personal view of what a particle actually is, and be prepared to hear ten different answers. I know one who says "particles are a hoax".)
What's wrong with the view that reality, and the limitations of our sensory capabilities, govern what we will interpret as laws, and that, also, there are laws that govern reality?
I hear two totally different claims in the first and second part of that sentence, and an implication of an inference between them. The claim in the first part is just demonstrably how we do physics, so I have no issue with that. The claim at the end is kind of tacked on, with no necessary connection to the first part, and that is where the issue lies. There's a difference between using that second part as a philosophy behind one's own approach to the first part, versus claiming that the second part is a scientific inference from the first part. There is actually quite little evidence that the inference follows, and a host of evidence in the history of the trials and tribulations of science that it doesn't. Neither of those facts make the conclusion wrong-- they just don't make it right either. It doesn't follow.
I'm not saying that you can fashion a workable physics based on the assumption of the existence of a fundamental dynamic(s), but only that this assumption is compatible with the exercise of scientific inquiry and the preponderance of physical evidence, and that the assumption that our world, our universe, is evolving fundamentally randomly isn't.
I agree that we have no basis to say the universe is evolving fundamentally randomly, but we also have no basis to say it is evolving fundamentally deterministically. We have no basis to say it is "fundamentally" doing anything other than what we observe it to be doing. What is fundamental in physics is very much a moving target and always should be, for that is science. What is "fundamental in reality" is so impossible to define scientifically that I can't see why we even need the phrase.
I think it reasonable to suppose that there is something fundamental, and that it has nothing to do with size.
I have no problem with you finding that reasonable. People find all kinds of things reasonable, for all kinds of personal reasons, and that is part of what you own, it is a right of having a brain. My issue is with the claim that this is somehow a logical inference based on evidence, when in fact the evidence is either absent, or to the contrary, as long as one avoids the trap of imagining that whatever is untested will still work. We need a "Murphy's law of science" (if a theory can be wrong, it will) to keep our views consistent with the actual history of this discipline!
Well, I disagree. I think that modern physical science has revealed certain things about the underlying reality, and that future science, assuming advances in technology, will reveal more.
What I wonder is, why do you think that your saying that is any different from Ptolemy saying it, or Newton? The history of physics is a history of great models that helped us understand and gain mastery over our environment, but it is not a history of our great models actually being the same as some "underlying reality." Instead, our great models have been like shadows, that fit some projection of reality but are later found to not be the reality. What I don't get is, why do we have to keep pretending that this is not just exactly the whole point of physics?
 
  • #59


Ken G said:
If you hold that a "cause" is an axiom in physics, please specify a theory, any theory, that requires that in its axiomatic structure. I'm not aware of any, causes are sociological constructs we add on top of our theories to help us interpret them, no laws of physics refer to causes that I've ever heard of. This is clear from the simple fact that you would need to immediately remove from consideration any laws that are time reversible, so gone are Newton's laws, the Schroedinger equation, and general relativity.
No, I hold that "cause" is an undefined term (or primitive notion) in science.
And it is used in the formulation of predictions: "<this> causes <that>".

Ken G said:
No, you don't need to imagine you are causing something to do a scientific experiment. That we often do that is indeed our sociology, but it's not a requirement. If I drop a mass in my experiment, I never need to imagine that I "caused the mass to fall", or that gravity did, I am just setting up an experiment and watching what happens. No causation necessary, indeed causation brings in significant philosophical difficulties (around free will and so on). But I agree that we do invoke causation concepts constantly when we do science, and that's because science is a human endeavor, and humans use causation concepts in our daily lives all the time-- it's part of our sociology.
We imagine that we are free (our ideas are the main cause of the particular design of the experimental setup) to set up the experiment as we want.

Ken G said:
Give me any phenomenon of your choosing that you feel must be described in terms of causes and effects, and I will offer a perfectly successful way to describe that same phenomenon without invoking those concepts at all.
x = vt, or "the velocity of the body causes a linear change in the position of the body".
 
  • #60


It is true that decoherence doesn't solve the measurement problem in that there's more work to do -- much in the same way that one can't claim the kinetic theory of gases explains the ideal gas law until you figure out how to actually quantify how pressure is an emergent property of particle interactions.

But most of the objections I've seen aren't on the grounds that there's more work to do, but in that it's fundamentally missing the point, and this is where I have to disagree. The emergence of 'classical' probability distributions on relative states from unitary evolution suggests that 'absolute' definiteness is not a meaningful idea, in much the same way that Einstein's train thought experiment suggests that absolute simultaneity is not a meaningful idea.

In my estimation, the dissatisfaction with the decoherence solution to the measurement problem looks very much like a reluctance to give up the notion of absolute definiteness.

Instead, what we have is relative definiteness. Conditioned on the hypothesis that I toss a baseball upwards with a velocity v, the probability that it reaches a height of roughly v^2 / (2g) is (nearly) 1.
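(For reference, the height estimate quoted above is just elementary energy conservation, neglecting air resistance:)

```latex
\tfrac{1}{2} m v^{2} = m g h_{\max}
\quad\Longrightarrow\quad
h_{\max} = \frac{v^{2}}{2g}
```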

This fact does not require the belief that when 'God' looks at the universe, he sees that I have definitely thrown the baseball upwards with velocity v as opposed to some mixture or superposition or ensemble or whatever of various different possibilities.

Nor to derive this fact am I required to use a mathematical model that includes me definitely tossing a baseball upwards with velocity v as opposed to, e.g., using a state smeared out across configuration space.

But the assumption of absolute definiteness would insist on both things. And the habit of assuming absolute definiteness can be difficult to break -- one becomes so accustomed to phrasing questions absolutely that it becomes difficult to weaken it to a relative question. And to be fair, prior to QM there wasn't much incentive to do so.
 
  • #61


Hurkyl said:
But most of the objections I've seen aren't on the grounds that there's more work to do, but in that it's fundamentally missing the point, and this is where I have to disagree. The emergence of 'classical' probability distributions on relative states from unitary evolution suggests that 'absolute' definiteness is not a meaningful idea, in much the same way that Einstein's train thought experiment suggests that absolute simultaneity is not a meaningful idea.

In my estimation, the dissatisfaction with the decoherence solution to the measurement problem looks very much like a reluctance to give up the notion of absolute definiteness.
You mean reluctance to accept many worlds (realities)? Or reluctance to accept many possible interpretations of a single world (reality)?
 
  • #62


zonde said:
You mean reluctance to accept many worlds (realities)? Or reluctance to accept many possible interpretations of a single world (reality)?

I think he means a reluctance to accept that the world is basically not deterministic but rather can only be described in terms of probabilities. Decoherence does not tell us how a particular result is singled out - it only gives probabilities - but it does tell us a system is in one state only - not a weird combined state such as in Schroedinger's Cat where the cat is in a weird superposition of alive and dead - rather it is either alive or dead - but all you can predict is probabilities - no mechanism is offered on how alive or dead is determined.

Personally I have no problem with this at all and believe decoherence solves the basic problem of QM - but each to his/her own.

And indeed more work needs to be done - but to me the basic message is clear - leaking of phase to the environment stops systems in general from being in a superposition of states. Of course there are exceptions such as superconductivity etc - but in the vast majority of situations here in the macro world QM weirdness is hidden by decoherence.

Thanks
Bill
 
  • #63


zonde said:
You mean reluctance to accept many worlds (realities)? Or reluctance to accept many possible interpretations of a single world (reality)?
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory are objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.

But this is the premise of the entire class of decoherence-based interpretations. Decoherence-upon-measurement even has the exact same mathematical form as collapse-upon-measurement, but without interpreting the probabilities as ignorance of the system
bhobba said:
being in one state or the other - but definitely in some state - not in this weird superposition.
More ambitious approaches hope for macroscopic decoherence to be an emergent property of unitary evolution. The relative state interpretation (i.e. many worlds) studies unitary evolution directly and its effect on subsystems. Bohmian mechanics likewise keeps the indefiniteness of the wave-function, but shows its (definitely located) particles tend towards the distribution of the wave-function.

Even interpretations that aren't decoherence-based can allow for this indefiniteness. For example, Rovelli's paper on relational quantum mechanics analyzes the Wigner's Friend thought experiment and argues to the effect that Wigner's analysis would be
My friend has opened the box and remains in an indefinite state, but one entangled with Schrödinger's cat. Their joint state collapsed to a live cat when I asked him about the results.​
and Wigner's friend's analysis would be
I opened the box and saw a live cat! I told Wigner when he asked.​
and both analyses would be equally valid. (actually, I'm not entirely sure if RQM is decoherence-based or collapse-based or agnostic about it. Really, I didn't like the paper other than this point of view on the Wigner's friend thought experiment, and don't remember the rest at all)
I liken the rejection of indefiniteness to the person who studies Newtonian mechanics but, rather than setting up an inertial reference frame, instead carefully sets up coordinates in which the observer is always at the origin and at rest, and refuses to understand the laws of mechanics presented in any other coordinate system. After all, when he looks around, he always sees things from his perspective; working with a coordinate chart centered elsewhere would be nonphysical and meaningless!
 
  • #64


Hurkyl said:
More ambitious approaches hope for macroscopic decoherence to be an emergent property of unitary evolution.

True - but they need further investigation and development. Right now I am happy with phase being leaked.

Thanks
Bill
 
  • #65


I've already said, and nobody cared (though that probably does justice to my ignorance!), that to me the universe is deterministic. But not in a Bohmian way; perhaps more in a Many Worlds way, but without the split of universes.
I'm going to insist on my idea because I can't see what is wrong with it, and I really don't like the random point of view, mainly because the evolution equation (Schrodinger or whatever) is deterministic, so every experiment, idealized as the evolution equation applied to the system + the instrument, should be deterministic. So my way to unify the deterministic property of the evolution equation with the random nature of experiments is to say that one can never know the exact state of the instrument, and that adds an apparent randomness to the final state of the system.
Gleason's theorem states, in some way, that if an experiment is going to be made on a system modeled in a Hilbert space, and the result is random and depends only on the initial state of the system, then the probabilities should be calculated with the Born rule. In this case, I say it again, the only way to introduce randomness into the experiment is by not knowing the exact state of the instrument, while making sure that this ignorance does not make the system go deterministically to one state (because in that situation it would not be called an experiment, just an "interaction").
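(For readers who haven't met it, a hedged statement of the result being appealed to: Gleason's theorem says that on a Hilbert space of dimension at least 3, any suitably additive assignment of probabilities to projections must take the trace form, which for a pure state is just the Born rule.)

```latex
% Gleason (\dim\mathcal{H} \ge 3): if \mu assigns a probability to every
% projection P, with \mu(I)=1 and \mu\big(\textstyle\sum_i P_i\big)=\sum_i \mu(P_i)
% for mutually orthogonal P_i, then there is a density operator \rho with
\mu(P) = \operatorname{Tr}(\rho P),
% which for a pure state \rho = |\psi\rangle\langle\psi| reduces to the
% Born rule  p(a) = |\langle a|\psi\rangle|^{2}.
```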
I'm sure there are a lot of imprecisions in my argument, but I can't see any flaw. However, I have never seen this point of view in Wikipedia or similar, so I don't know if it is wrong or what!

I will really be very thankful for any point of view that you can provide

Ps: In this point of view, if the experiment is just letting time go by, then the appearance of randomness in the "experiment" is, I think, usually called decoherence.
 
  • #66


Just to add some points: my idea is that an experiment is an interaction that makes the system leave its actual state (in counterfactual definiteness language, leave its properties) and forces it to go to some random (from the point of view of the scientist, not from that of a god -or whatever- that knows the exact state of the instrument) state (in counterfactual definiteness language, it makes the system choose some properties that it didn't have before).
Another point: There are not many worlds. Just one, chosen by the experiment "randomness".
So this point of view is not against the deterministic nature of the evolution equation (because it is indeed deterministic). It is not against our intuition that there is only one reality and not many worlds. And it is not against the probabilities of the Born rule, because the idea is that, due to Gleason's theorem, the ignorance of the scientist manifests in the experiments by the emergence of the Born Rule probabilities (because, if the scientist sees probabilities -even though their nature depends on ignorance and not on "real randomness"- and if he makes the experiment in a way that its probabilities depend only on the Hilbert state representation, then the probabilities have to be calculated by the Born Rule).
Sorry for my imprecisions, hope you'll be able to follow my not so clear thoughts!
 
  • #67


Ken G said:
[...]
The history of physics is a history of great models that helped us understand and gain mastery over our environment, but it is not a history of our great models actually being the same as some "underlying reality." Instead, our great models have been like shadows, that fit some projection of reality but are later found to not be the reality. What I don't get is, why do we have to keep pretending that this is not just exactly the whole point of physics?
Thanks for your clearly stated posts Ken. I think I pretty much agree with your answers to the OP's problem in particular, and your approach to how best to think about physical science in general.
 
  • #68


ThomasT said:
Thanks for your clearly stated posts Ken. I think I pretty much agree with your answers to the OP's problem in particular, and your approach to how best to think about physical science in general.

Ken is a wonder all right - his clarity of thought is awe inspiring and an excellent counterpoint to guys like me that side with Penrose and believe the math is the reality in a very literal sense.

Thanks
Bill
 
  • #69


lugita15 said:
There is some disagreement on the subject, but you may find this paper interesting. It's an attempt by Zurek, one of the developers of decoherence, to derive the Born rule via decoherence.

Yes, I have read previously about Zurek's attempts, but I think the final solution will come from a wider theory, a nonlinear one like trace dynamics, or an epistemic ontic model.
 
  • #70


bhobba said:
... guys like me that side with Penrose and believe the math is the reality in a very literal sense.
That view is somewhat puzzling to me. Perhaps you might post in the What's Your Philosophy of Mathematics? thread?
 
  • #71


Hurkyl said:
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory are objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.

But this is the premise of the entire class of decoherence-based interpretations. Decoherence-upon-measurement even has the exact same mathematical form as collapse-upon-measurement, but without interpreting the probabilities as ignorance of the system

More ambitious approaches hope for macroscopic decoherence to be an emergent property of unitary evolution. The relative state interpretation (i.e. many worlds) studies unitary evolution directly and its effect on subsystems. Bohmian mechanics likewise keeps the indefiniteness of the wave-function, but shows its (definitely located) particles tend towards the distribution of the wave-function.

Even interpretations that aren't decoherence-based can allow for this indefiniteness. For example, Rovelli's paper on relational quantum mechanics analyzes the Wigner's Friend thought experiment and argues to the effect that Wigner's analysis would be
My friend has opened the box and remains in an indefinite state, but one entangled with Schrödinger's cat. Their joint state collapsed to a live cat when I asked him about the results.​
and Wigner's friend's analysis would be
I opened the box and saw a live cat! I told Wigner when he asked.​
and both analyses would be equally valid. (actually, I'm not entirely sure if RQM is decoherence-based or collapse-based or agnostic about it. Really, I didn't like the paper other than this point of view on the Wigner's friend thought experiment, and don't remember the rest at all)
I liken the rejection of indefiniteness to the person who studies Newtonian mechanics but, rather than setting up an inertial reference frame, instead carefully sets up coordinates in which the observer is always at the origin and at rest, and refuses to understand the laws of mechanics presented in any other coordinate system. After all, when he looks around, he always sees things from his perspective; working with a coordinate chart centered elsewhere would be nonphysical and meaningless!
Yes, but macroreality is not indefinite.
Maybe modal quantum theory with definite values is the answer,
or a nonlinear quantum mechanics destroying the superposition.
 
  • #72


Hurkyl said:
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory are objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.
Hmm, I do not see why there should be any reluctance to accept such indefiniteness.

Hurkyl said:
But this is the premise of the entire class of decoherence-based interpretations. Decoherence-upon-measurement even has the exact same mathematical form as collapse-upon-measurement, but without interpreting the probabilities as ignorance of the system
Well, this part is rather unclear. First, are you redefining "probability" without giving a new definition, or what?
And second, how does it help to resolve the mystery if you don't interpret the probabilities as ignorance of the system?

Hmm, maybe disagreement is actually about the mystery to be solved.
For example, as I see it, the mystery is not in indefiniteness, but in the fact that this indefiniteness carries some amount of amorphous definiteness, and in the particular properties of this amorphous definiteness. And I don't see how your arguments are getting closer to a resolution of this mystery.
 
  • #73


Hurkyl said:
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory are objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.
It's not clear to me what you mean by indefiniteness. If you just mean that exactly what one perceives is relative to one's viewpoint, then ok. But physical science has to do with publicly discernible/countable phenomena. In what way might these phenomena of public historical record be considered indefinite?

Of course, in any probabilistic formulation prior to observation there are encoded multiple possibilities given specific probability values. But then an experiment is done and a qualitative result recorded. Is there any sense in which a qualitative result should be considered indefinite?
 
  • #74


mainly because the evolution equation (Schrodinger or whatever) is deterministic, so every experiment, idealized as the evolution equation applied to the system + the instrument, should be deterministic.

one can never know the exact state of the instrument, and that adds an apparent randomness to the final state of the system.

an experiment is an interaction that makes the system leave its actual state (in counterfactual definiteness language, leave its properties) and forces it to go to some random (from the point of view of the scientist, not from that of a god -or whatever- that knows the exact state of the instrument) state

These are very good points, and I almost agree with you. The only difficulty is that a deterministic equation of evolution is not enough to call the theory deterministic. One also needs a quantity that fully describes the physical state of the system, with no reference to probability.

However, the function Psi describes the state in a probabilistic way. I do not think there is a way to understand Psi as a specification of a physical state. The only thing we know about it is that it gives probabilities.

It seems that if one wants to have a deterministic theory, one also has to introduce additional quantities.
 
  • #75


ThomasT said:
It's not clear to me what you mean by indefiniteness.
Consider first ordinary classical mechanics. We have a phase space X representing all possible states of some system under study, and an observable like the "x-coordinate of the third particle" is a function on X: to each element of the phase space it assigns a real number. An ordinary interpretation of ordinary classical mechanics would imply that the 'state of reality' corresponds to some point in X, and questions like "What is the x coordinate of the third particle?" make sense as questions about reality and have precise, definite answers.

Unfortunately, due to engineering concerns, we don't have sufficient knowledge and precision to actually answer such questions. So we layer a new theory on top of classical mechanics to try and model our ignorance. And if we look over all of the questions for which we say "Z happens with 75% probability", and it turns out we said that 100,000 times and roughly 75,000 of those were correct, we're content with our model -- both of reality and of our ignorance.

Now, consider a variation on classical mechanics where phase space is not X, but instead the class of open subsets of X. In our interpretation, we do not say that the 'state of reality' corresponds to a point of X, but instead to an open subset of X. Questions like "What is the x coordinate of the third particle" no longer have definite answers, because the state of reality is some open subset U of X, and the value of our observable varies on the domain U.

So now assertions like "the x coordinate of the third particle is 3" still make sense as assertions about reality, but they do not necessarily have definite true/false values. Instead, they can also take on some 'intermediate' values between true and false. It might make more sense to think of it as being partially true and partially false. In fact, while the question above can be definitely false, it can never be definitely true. A question like "the x coordinate of the third particle is between 3 and 4" could be definitely true, though.

Another variation: rather than phase space being the open subsets of X, it is the probability distributions on X, in the sense of Kolmogorov. Now, the physical quantity "What is the x coordinate of the third particle?" again makes sense. But instead of being a (definite) real number, the answer to this question is a random variable (again, in the sense of Kolmogorov). Again, let me emphasize that, in the interpretation of this variant, the physical state of the system is a probability distribution, and physical quantities are random variables.

These last two variations are what I mean by indefiniteness.

We can, of course, still layer ignorance probabilities on top of this. So we could be expressing our ignorance about the state of reality as being a probability distribution across the points of phase space -- that is, a probability distribution of probability distributions.

Mathematically, of course, we can simplify and collapse it all down into one giant probability distribution, and it looks the same as if we just had the original, definite classical mechanics with ignorance probabilities on top.

And "forgetting" about the physical distribution works out well because the dynamics of classical mechanics works "pointwise" on X, without influence from the physical probability distribution, so if we pretend the physical distribution is just ignorance probabilities, we never run into a paradox where the dynamics of the system appear to be influenced by our 'information' about it.
Before continuing further, the reader needs to understand that in the third variant of classical mechanics I mentioned above, the physical probability distribution really is part of the physical state space of the theory. It is not a combined "classical mechanics + ignorance" amalgamation: it is a theory that posits that the state of reality really is a probability distribution across X.
Now, turn to quantum mechanics, and decoherence in particular. The promising lead of decoherence is that if we apply unitary evolution to a large system and restrict our attention to the behavior of a subsystem, the state of the subsystem decoheres into something that can FAPP be described as a probability distribution across outcomes.

But the important thing to notice is that this probability distribution is an aspect of the physical state space. It is not ignorance probability, it is part of the physical state of the system as posited by "Hilbert space + unitary evolution" (or a similar notion).

But unlike the classical case, the dynamics do depend on the full state of the system. And we really do observe this in physical experiments.

The classic "Alice and Bob have an entangled pair of qubits" thought experiment, for example. Because of the entanglement, Alice's particle has decohered into a fully mixed state: mixture of 50% spin up and 50% spin down around whichever axis she chooses to measure. Any experiment performed entirely in her laboratory will respect this mixture. But when Alice and Bob compare their measurements, the full state of the system reasserts itself in showing a correlation between their measurements.In the third version of classical mechanics I described above, we can layer ignorance probabilities on top, then forget the difference between physical and ignorance probability -- in other words, replacing the decohered state with a collapsed state + ignorance about which state it collapsed to.

But forgetting the difference fails badly for quantum mechanics, because the dynamics do depend on the full state, and so if we're being forgetful, we do run into all sorts of issues where the physical evolution of a system appears to depend on our knowledge of the system.
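(A minimal numerical companion to the Alice/Bob example above, assuming Python/NumPy: take the Bell state (|00> + |11>)/√2, trace out Bob's qubit, and check that Alice's reduced state is the fully mixed state even though the joint state is pure.)

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) as a vector in the 4-dimensional
# two-qubit Hilbert space, then its (pure) density matrix.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho = np.outer(bell, bell.conj())

# Partial trace over Bob's qubit: rho_A[i, j] = sum_k rho[(i,k), (j,k)].
rho4 = rho.reshape(2, 2, 2, 2)            # indices (i_A, i_B, j_A, j_B)
rho_A = np.einsum('ikjk->ij', rho4)

print(rho_A)                              # [[0.5, 0], [0, 0.5]]: fully mixed
print(np.trace(rho @ rho).real)           # 1.0  -> joint state is pure
print(np.trace(rho_A @ rho_A).real)       # 0.5  -> Alice's reduced state is mixed
```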
 
  • #76


These are very good points, and I almost agree with you. The only difficulty is that a deterministic equation of evolution is not enough to call the theory deterministic. One also needs a quantity that fully describes the physical state of the system, with no reference to probability.

However, the function Psi describes the state in a probabilistic way. I do not think there is a way to understand Psi as a specification of a physical state. The only thing we know about it is that it gives probabilities.

It seems that if one wants to have a deterministic theory, one also has to introduce additional quantities.

Thanks for replying! I was really trying to find out whether I was totally wrong or whether you see some light in it. Let me tell you how I see it. For me, Psi is something that is useful to label states. Then, by knowing the symmetries that the states should follow under space-time rotations, you have the Hilbert operator representations of P, X, H and so on (and, as a consequence, their eigenstates).
Up to here there are no probabilities, no Born Rule, nothing. Then we can interact with a system by the use of an instrument in a way that perhaps we can deterministically make it leave its, for example, P eigenstate and go to a predefined, for example, X eigenstate. Now again, there are no probabilities.
Finally we can play a game. We are going to arrange a complex interaction with an instrument that makes the system go to an eigenstate of X, that has zillions of particles, and whose initial state we don't know. There is randomness, and there are probabilities, from the point of view of the scientist (but if there were a god, he would know the exact state of the instrument and he would know where the system will finish). This is called an "experiment" because it looks as if the scientist is discovering a property of the system (while he is not; the system does not have this "X" property, he is the one that makes it go to an eigenstate of it). Gleason's theorem (or Saunders') assures us that if there are going to be probabilities, and these probabilities depend only on the state of the system (here it is like this from the point of view of the scientist), then they should be calculated by the Born Rule.
So, in this interpretation, the Born Rule is not an axiom, just a theorem.

Finally I just want to say that I am obviously not sure of what I am saying. It is just the way I like things to be in order to make sense to me. I've been reading this stuff for the last 4 years and only now have I found an interpretation that suits my sense of things! Again, I have not found it in Wikipedia or anything, so I really appreciate your answers, views etc.

So... What do you think?

Thanks!
 
  • #77


I liked post #67 regarding the OP's question...

"... our great models have been like shadows, that fit some projection of reality but are later found to not be the reality..."

we ARE dealing with a mathematical model. It's as far as we have gotten to date.

The original question involved:

...that the phenomenon in question, whatever it may be, is genuinely random. That is to say, the exact, actual result has no identifiable cause...

I just posted these explanations on uncertainty in another thread and they may afford the OP a perspective regarding the representation [model] of what 'is actually happening' regarding 'probability'. These are a bit more basic than the Hilbert spaces and associated mathematical representations of the last half dozen or so posts.

Here are some explanations I saved [and edited] from a very, very long discussion in these forums on Heisenberg uncertainty:

Course Lecture Notes, Dr. Donald Luttermoser, East Tennessee State University:
The HUP strikes at the heart of classical physics: the trajectory. Obviously, if we cannot know the position and momentum of a particle at t[0] we cannot specify the initial conditions of the particle and hence cannot calculate the trajectory... Due to quantum mechanics' probabilistic nature, only statistical information about aggregates of identical systems can be obtained. QM can tell us nothing about the behavior of individual systems.

What the quote means: unlike classical physics, quantum physics means future predictions of state [like position, momentum] are NOT precise.

But why does this situation pop up in quantum mechanics? Because of our mathematical representations.

[Scattering, mentioned below, happens to be a convenient example of our limited ability to make measurements of arbitrary precision.] The basic ideas are these: the HUP [Heisenberg uncertainty principle] is a result of nature, not of experiment-based uncertainties. From the axioms of QM and the math that is used to build observables and states of systems, it turns out that position and velocity (and also momentum) are examples of what are called "canonical conjugates" [a function and its Fourier transform]; they cannot both be "sharply localized". That is, they cannot be measured to an arbitrary level of precision. It is a mathematical fact that any function and its Fourier transform cannot both be made sharp.

The wave function describes not a single scattering particle but an ensemble of similarly accelerated particles. Physical systems [like particles] which have been subjected to the same state preparation will be similar in some of their properties but not all of them. The physical implication of the uncertainty principle is that NO STATE PREPARATION PROCEDURE IS POSSIBLE WHICH WOULD YIELD AN ENSEMBLE OF SYSTEMS IDENTICAL IN ALL OF THEIR OBSERVABLE PROPERTIES.

A few cornerstone mathematical underpinnings:
The wave function describes an ensemble of similarly prepared particles rather than a single scattering particle. A wave function with a well defined wavelength must have a large spatial extension, and conversely a wave function which is localized in a small region of space must be a Fourier synthesis of components with a wide range of wavelengths. We cannot measure them both to an arbitrary level of precision. A function and its Fourier transform cannot both be made sharp. This is purely a mathematical fact and so has nothing to do with our ability to do experiments or our present-day technology. As long as QM is based on the present mathematical theory an arbitrary level of precision cannot be achieved.

The HUP isn't about the knowledge of the conjugate observables of a single particle in a single measurement; the uncertainty theorem is about the statistical distribution of the results of future measurements. The theorem doesn't say anything about whether you can measure both at the same time. That is a separate issue. A single scattering experiment consists of shooting a single particle at a target and measuring its angle of scatter. Quantum theory does not deal with such an experiment but rather with the statistical distribution of the results of an ensemble of similar experiments. Quantum theory predicts the statistical frequencies of the various angles through which an ensemble of similarly prepared particles may be scattered.

What we can't do is to prepare a state such that we would be able to make an accurate prediction about what the result of a position measurement would be, and an accurate prediction about what the result of a momentum measurement would be.

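(A small numerical check of the "a function and its Fourier transform cannot both be made sharp" statement quoted above, assuming Python/NumPy: the narrower a Gaussian packet is in x, the wider its transform is in k, with the product of the two rms widths staying near 1/2.)

```python
import numpy as np

x = np.linspace(-200.0, 200.0, 2**14)
dx = x[1] - x[0]

for sigma in (0.5, 1.0, 4.0):
    psi = np.exp(-x**2 / (4 * sigma**2))             # packet of width sigma in x
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)      # normalize

    phi = np.fft.fftshift(np.fft.fft(psi))           # momentum-space amplitude
    k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
    dk = k[1] - k[0]
    phi /= np.sqrt(np.sum(np.abs(phi)**2) * dk)      # normalize

    dx_rms = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)
    dk_rms = np.sqrt(np.sum(k**2 * np.abs(phi)**2) * dk)
    print(sigma, dx_rms, dk_rms, dx_rms * dk_rms)    # width product stays ~0.5
```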
 
  • #78


bhobba said:
Ken is a wonder all right - his clarity of thought is awe inspiring and an excellent counterpoint to guys like me that side with Penrose and believe the math is the reality in a very literal sense.
Well, thank you bhobba and ThomasT, and I certainly agree with the implication that the value is not as much in the answer each of us arrives at, as it is in the tensions between the possible answers, around just what is this amazing relationship between us, our environment, and our mathematics that tries to connect the two. I suspect answers like this will continue to be moving targets, as much as what physics is will itself continue to be a moving target. It is not just physics theories that change, but physics itself. Although we may fall into thinking that the modern version is what physics "is", that doesn't seem to do justice to the remarkable evolutionary resilience of this animal. To me, its most remarkable attribute is the way the accuracy and generality of its predictions converges steadily, yet the ingenious ontologies it invokes to achieve this convergence of accuracy do not converge at all.
 
  • #79


Sorry for warming up this thread again.

In the context of the question of this thread I have been asked multiple times here what my reasons are for making certain statements about decoherence, many worlds, or generally how quantum theory should be interpreted. I found that it is hard to answer all these questions here, but they motivated me enough to start a blog where I can describe what I think is the best approach to the problem of interpretation: namely, replacing interpretation with scientific deduction that describes observation as an emergent phenomenon. And no, this is not many worlds. The content is partly based on a research paper that is currently in the publication pipeline.

Since the resulting discussion could become speculative at some points and I want to respect the forum rules, I have moved the blog elsewhere. If you have comments and would like to discuss you are invited to do it there. In any case I am looking forward to your feedback.

http://aquantumoftheory.wordpress.com

Cheers,

Jazz
 
  • #80


Nice to know.
 
  • #81


Ken G said:
... which is exactly the reason that people thought the Newtonian paradigm was correct long before we discovered quantum mechanics...

But don't overlook the fact that Quantum mechanics, and any future theory of motion we devise, needs to reduce to Newtonian mechanics when applied to the same domain in which Newtonian physics was established (QM of course does).

My point is that any future development we might discover that supersedes QM will itself need to reduce to a purely random formulation when addressing the domain of particle interactions and states in which we know QM randomness to be accurate. I have a hard time imagining how any deterministic formulation could, in any reduced case, produce the purely random results needed in the quantum domain (it's a contradiction in terms, in fact). It therefore seems to me a fair conclusion that reality is, at bottom -- fundamentally if you prefer -- probabilistic. And I think we must concede that even while admitting QM is unlikely to be the final say in our understanding of particles.

(Sorry, I know I am quoting a post from early in this thread, but it caught my eye)
 
  • #82


Jazzdude said:
Since the resulting discussion could become speculative at some points and I want to respect the forum rules, I have moved the blog elsewhere. If you have comments and would like to discuss you are invited to do it there. In any case I am looking forward to your feedback.

Hi Jazz

Interesting stuff.

I have my own view based on the primacy of observables and their basis invariance - I must get around to writing it up one day. Although it's bog standard QM, it is an approach I haven't seen anyone else use. It's a more sophisticated version of Victor Stenger's view:
http://www.colorado.edu/philosophy/vstenger/Nothing/SuperPos.htm

Thanks
Bill
 
  • #83


Zmunkz said:
It therefore seems to me a fair conclusion that reality is, at bottom -- fundamentally if you prefer -- probabilistic. And I think we must concede that even while admitting QM is unlikely to be the final say in our understanding of particles.

I too think that at rock bottom reality is fundamentally probabilistic. QM may indeed be superseded, but I am not so sure that is likely.

Thanks
Bill
 
  • #84


I would argue that reality cannot be probabilistic at rock bottom, but neither does it appear to be deterministic. The error is in thinking it has to be one or the other-- that's not the case, our models have to be one or the other, reality can just be whatever it is.

The reason it can't be probabilistic is that probabilistic theories are, almost by definition, not rock-bottom theories (probabilities reflect some process or information that is omitted on purpose, and probabilities are generated as placeholders for what is omitted-- that's just what they are whenever we understand what we are actually doing). Hence, probability treatments are theories of what you are not treating, much more than they are theories of what you are treating. But if one rules out probabilistic theories from the status of "rock bottom" descriptions, one might imagine that all that is left is a deterministic description, but that's even worse-- there's no evidence that any physical description is exactly deterministic; determinism was always an effective concept in every application where it was ever used in practice. So probabilistic descriptions always have an "untreated underbelly", if you like, whereas deterministic descriptions are always effective at only an approximate level in all applications where they are used.

These are just factual statements about every example we can actually point at where we know what is going on, so why should we ever think they will not be true of some "final theory" that treats the "rock bottom" of reality? A much more natural conclusion seems to be that Bohr was right-- physics is what we can say about nature, and never was, nor ever should have been, a "rock bottom" description. We just don't get such a thing; we get determinations of probabilities, and that is all physics is intended to do.

As for "rock bottom" reality outside of what physics is intended to do, that is a fundamentally imprecise concept at best. Physics is all about creating a language that let's us talk about reality, so there is no such thing as a reality outside of physics that we could ever try to talk about intelligibly in a physics forum. Terms like "probabilistic" or "deterministic" are mathematical physics terms-- they have no meaning outside that context.
 
  • #85


Ken G said:
...The reason it can't be probabilistic is that probabilistic theories are, almost by definition, not rock-bottom theories

You've outlined an interesting way of looking at this. Could you possibly elaborate on the above quotation? I'm trying to understand why, by definition, probabilistic theories cannot be foundational. I can see that in the macro sense (something like flipping a coin, for instance) probabilities are stand-ins for actual non-probabilistic phenomena... but I can't quite convince myself this analogy carries to everything. Could you maybe add a little on this?

Ken G said:
A much more natural conclusion seems to be that Bohr was right-- physics is what we can say about nature, and never was, nor ever should have been, a "rock bottom" description.

...

Physics is all about creating a language that lets us talk about reality, so there is no such thing as a reality outside of physics that we could ever try to talk about intelligibly in a physics forum. Terms like "probabilistic" or "deterministic" are mathematical physics terms-- they have no meaning outside that context.

This is the classic realist vs. instrumentalist debate. Looks like you fall on the instrumentalist side -- I am not sure if I can meet you there, although you make the case well.
 
  • #86


Ken G said:
The reason it can't be probabilistic is that probabilistic theories are, almost by definition, not rock-bottom theories (probabilities reflect some process or information that is omitted on purpose, and probabilities are generated as placeholders for what is omitted-- that's just what they are whenever we understand what we are actually doing).

I wish I could up-vote. This is precisely why I am disturbed by a probabilistic end. To me it means there is a black curtain. Some might say then "you're assuming there is something going on behind the curtain, and that's hidden variables". I say "no", there doesn't even have to be something deterministic going on behind the curtain, but there is a curtain nonetheless, and when physics is revealed it is always random. To be told that all we will ever get to see is what the curtain reveals is disturbing.
 
  • #87


jfy4 said:
To me it means there is a black curtain.

I must say I can't follow that one. To me probabilities are simply the result of stuff like Gleason's Theorem, which shows determinism is not compatible with the definition of an observable. There are outs, but to me they are ugly, such as contextuality - of course, what is ugly is in the eye of the beholder.
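For readers who haven't met it, the usual statement of Gleason's theorem is that on a Hilbert space of dimension three or more, every non-contextual probability assignment to projection operators is generated by some density operator via the trace rule:

```latex
% Gleason's theorem, usual statement (dim H >= 3):
% every non-contextual probability assignment mu on projectors P
% comes from a density operator rho via the trace rule.
\mu(P) \;=\; \operatorname{Tr}(\rho\, P), \qquad \dim \mathcal{H} \ge 3
```

In particular, no assignment taking only the values 0 and 1 on every projector exists, which is the sense in which dispersion-free ("deterministic") values for all observables are ruled out.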

And observables to me are very intuitive, since they are the most reasonable way to ensure basis invariance. Suppose there is a system and an observational apparatus with n outcomes yi. Write them out as a vector, sum yi |bi>. The problem is that the yi are not invariant under a change of basis, and since the choice of basis is an entirely arbitrary, man-made thing, the outcomes should be expressed in a way that is invariant. By changing the |bi> to |bi><bi| we get sum yi |bi><bi|, which is a Hermitian operator whose eigenvalues are the possible outcomes of the measurement and are basis invariant.
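A minimal numerical sketch of that construction (the outcome values, the dimension, and the random unitary are all made up for illustration): build the operator sum yi |bi><bi| from some outcomes, change basis with an arbitrary unitary, and check that the eigenvalues -- the possible measurement outcomes -- are unchanged.

```python
import numpy as np

# Hypothetical measurement outcomes y_i (made-up numbers for illustration).
y = np.array([1.0, -1.0, 3.0])

# An orthonormal basis |b_i> -- here just the standard basis of C^3.
B = np.eye(3, dtype=complex)

# O = sum_i y_i |b_i><b_i|  (a Hermitian operator).
O = sum(y[i] * np.outer(B[:, i], B[:, i].conj()) for i in range(3))

# Change basis with an arbitrary unitary U: O' = U O U^dagger.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(A)              # QR of a random matrix yields a unitary U
O_prime = U @ O @ U.conj().T

# The eigenvalues (the possible outcomes) are unchanged by the change of basis.
print(np.linalg.eigvalsh(O))        # [-1.  1.  3.]
print(np.linalg.eigvalsh(O_prime))  # [-1.  1.  3.]
```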

Thanks
Bill
 
  • #88


Zmunkz said:
You've outlined an interesting way of looking at this. Could you possibly elaborate on the above quotation? I'm trying to understand why, by definition, probabilistic theories cannot be foundational. I can see that in the macro sense (something like flipping a coin, for instance) probabilities are stand-ins for actual non-probabilistic phenomena... but I can't quite convince myself this analogy carries to everything.
It's not so much that I'm claiming it has to be true for everything; rather, I'm saying it is true every time we understand why our theory is probabilistic. So we can classify all our probabilistic theories into two bins-- one includes all the ones where we understand why a probabilistic theory works, and the other includes all the ones we don't understand. In that first bin, in every case the probabilistic theory works because it is a stand-in for all the processes the theory is not explicitly treating (flipping coins, shuffling cards, all of statistical mechanics and thermodynamics, etc.). In the second bin is just one thing: quantum mechanics.

So now we face two choices-- either there really are two such bins, and one of them holds "the rock bottom description" while all the rest hold every other type of probability description we've ever seen, or else there are not two such fundamentally different bins; there is just what we understand and what we do not. I can't say the latter interpretation is unequivocally superior, but when framed in these terms, I think it places that interpretation into a kind of proper perspective.
Zmunkz said:
This is the classic realist vs. instrumentalist debate. Looks like you fall on the instrumentalist side -- I am not sure if I can meet you there, although you make the case well.
Yes, I agree this is well-worn territory. In a sense I am siding with Einstein that "the Old One does not roll dice," but I differ from him in his conclusion that straightforward realism is therefore the only alternative. In fact, what most people call realism, I call unrealism-- it requires a dose of denial to hold that reality uniformly conforms to our macroscopic impressions of it, when the microscopic evidence is quite clear that it does not. So if there are no dice to roll, and if there is also no precise reality where everything has a position and a momentum and the future is entirely determined by the past, then what is left? What is left is the actual nature of reality. That's realism, if you ask me.
 
  • #89


jfy4 said:
To me it means there is a black curtain. Some might say then "you're assuming there is something going on behind the curtain, and that's hidden variables". I say "no", there doesn't even have to be something deterministic going on behind the curtain, but there is a curtain nonetheless, and when physics is revealed it is always random. To be told that all we will ever get to see is what the curtain reveals is disturbing.
I agree with you about the curtain, but I find the implications less disturbing. It reminds me of the way Hoyle found the Big Bang to be disturbing-- he could not fathom an origin to the universe; anything but a steady state was disturbing to him. But I always wondered, why wasn't a steady state disturbing too, because of how it invokes a concept of a "forever" of events? We invoke "forever" to avoid a "start", or we invoke a "start" to avoid a "forever", yet which is less disturbing? I ask, why are we disturbed by mystery?

Yes, the goal of science is to penetrate the shroud of mystery, but it's not to remove the shroud, because behind one shroud of mystery is always another. We are not trying to pull down that "curtain" you speak of, because there will always be a curtain, and there is supposed to be a curtain-- our goal is to get past as many curtains as we can. That may sound disturbing, but isn't it more disturbing to imagine an end to the curtains?
 
  • #90


bhobba said:
I must say I can't follow that one. To me probabilities are simply the result of stuff like Gleason's Theorem which shows determinism is not compatible with the definition of an observable.
Gleason's theorem is a theorem about the theories of physics that can match observations, yet the "curtain" is an image about the connection between theories and reality. I think that is what you are not following there-- you are not distinguishing our theories from the way things "really work." I realize this is because of your rationalistic bent: you imagine that things really work according to some theory, and our goal is to either find that theory or at least get as close as we can. That's a fine choice to make; rationalists abound who make that choice, and some get Nobel prizes pursuing it. But it's why you won't understand non-rationalists who don't think the world actually follows theories, because theories are in our brains, and the world is not beholden to our brains; only our language about the world is. The world is doing something that closely resembles following theories, but every time we think we have the theory it follows, we discover not just that the theory has its domain of applicability, but much more: we discover that the ontological constructs of the theory are completely different in some better theory. Why would we imagine that will ever not be true?
bhobba said:
There are outs, but to me they are ugly, such as contextuality - of course, what is ugly is in the eye of the beholder.
Contextuality is like determinism or probability: it is an aspect of a theory. We must never mistake the attributes of our theories for attributes of reality, or else we fall into the same trap that physicists have fallen into a half dozen times in the history of this science. When do we learn?
bhobba said:
And observables to me are very intuitive, since they are the most reasonable way to ensure basis invariance. Suppose there is a system and an observational apparatus with n outcomes yi. Write them out as a vector, sum yi |bi>. The problem is that the yi are not invariant under a change of basis, and since the choice of basis is an entirely arbitrary, man-made thing, the outcomes should be expressed in a way that is invariant. By changing the |bi> to |bi><bi| we get sum yi |bi><bi|, which is a Hermitian operator whose eigenvalues are the possible outcomes of the measurement and are basis invariant.
I think that's a lovely way to explain why observables are associated with operators, which is probably the most important thing one needs to understand to "get" quantum mechanics (that, and why the basis transformations need to allow complex inner products, and I know you have some nice insights into that issue as well). Also, we can agree that the job of a physics theory is to connect reality to the things we can observe about it. But none of this tells us why a description of reality that connects our observables with mathematical structures that predict those observables has to be what reality actually is. There is a weird kind of "sitting on the fence" between objectivism and subjectivism that is required to hold that stance-- you invoke subjectivism when you build the theory from the need to give invariant observables (rather than from some more fundamental constraint on the quantum state itself), yet ally with objectivism when you promote the resulting quantum theory to the level of a description of reality. If you instead simply say it is a description of how we observe reality, hence how we interact with reality, hence how we give language to our interaction with reality, then you arrive finally at Bohr's insight that physics is what we can say about reality.
 
  • #91


The problem with the probability in quantum physics is that it actually is not "rock bottom". If it were, it would not cause so many troubles.

The problem is that the equations of motion of any quantum theory provide a totally deterministic and even local theory. In a sense, this part is very classical. But on top of that comes the probabilistic (and non-local) part when one starts to measure. Thus the probability arises somewhere in between, sandwiched between deterministic descriptions at the micro level (the QM equations of motion) and the macro level (classical physics). Because the theory lacks a well-defined mechanism specifying when exactly the collapse happens, it is very hard to tell the probabilistic and the deterministic elements apart (you don't know when exactly the QM equations of motion become invalid and you have to apply the collapse instead).
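A minimal sketch of that sandwich, with a made-up two-level Hamiltonian: the Schrodinger part of the calculation is completely deterministic, and randomness enters only when the Born rule is applied to extract measurement outcomes.

```python
import numpy as np

# A toy two-level system: deterministic (unitary) evolution followed by a
# probabilistic measurement. The Hamiltonian and time are made up.
hbar = 1.0
H = np.array([[0.0, 1.0],
              [1.0, 0.0]], dtype=complex)   # illustrative Hamiltonian (Pauli-x)

def evolve(psi, t):
    """Schrodinger evolution: psi(t) = exp(-i H t / hbar) psi(0). Deterministic."""
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w * t / hbar)) @ V.conj().T
    return U @ psi

psi0 = np.array([1.0, 0.0], dtype=complex)   # start in |0>
psi_t = evolve(psi0, t=0.4)                  # same input always gives same output

# The probabilistic part enters only here, via the Born rule:
p = np.abs(psi_t) ** 2                       # P(outcome i) = |<i|psi(t)>|^2
print(p, p.sum())                            # probabilities for |0> and |1>, sum to 1

# Sampling actual outcomes ("collapse") is where the randomness lives.
rng = np.random.default_rng(1)
outcomes = rng.choice([0, 1], size=10_000, p=p)
print(outcomes.mean())                       # close to p[1]
```

Nothing in the deterministic part of the sketch tells you when to stop evolving and apply the sampling step; that is the missing mechanism referred to above.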
 
  • #92


Ken G said:
Gleason's theorem is a theorem about the theories of physics that can match observations, yet the "curtain" is an image about the connection between theories and reality. I think that is what you are not following there-- you are not distinguishing our theories from the way things "really work." I realize this is because of your rationalistic bent, you imagine that things really work according to some theory, and our goal is to either find that theory, or at least get as close as we can. That's a fine choice to make, rationalists abound who make that choice, and some get Nobel prizes pursuing it. But it's why you won't understand non-rationalists who don't think the world actually follows theories, because theories are in our brains, and the world is not beholden to our brains, only our language about the world is. The world is doing something that closely resembles following theories, but every time we think we have the theory it follows, we discover not just that the theory has its domain of applicability, but much more: we discover that the ontological constructs of the theory are completely different in some better theory. Why would we imagine that will ever not be true?

Hi Ken

I have said it before and I will say it again: you are a wonder. That's exactly it, and exactly why I don't get it.

Reading you is like reading Wittgenstein - at first you say no, he can't be right, but you think about it a bit more and you realize he has a point. You may still not agree with him (and I don't), but he has a point.

Thanks
Bill
 
  • #93


Thanks bhobba, as you know my goal is not to change your mind, because your view is as valid as anyone else's, but merely to clarify the alternatives.
 
  • #94


Ken G said:
Yes, the goal of science is to penetrate the shroud of mystery, but it's not to remove the shroud, because behind one shroud of mystery is always another. We are not trying to pull down that "curtain" you speak of, because there will always be a curtain, and there is supposed to be a curtain-- our goal is to get past as many curtains as we can. That may sound disturbing, but isn't it more disturbing to imagine an end to the curtains?

I would love to pull down the curtain, only to find another, and if you got the opposite impression it wasn't my aim. But it's disturbing to me that this may be the last curtain.
 
  • #95


Ah, I see: you are not worried that we will pull this curtain down to find none behind it; you are worried we'll never pull this one down. Who knows, maybe we will, but I think it might take a better theory about how our minds process sensory information. If there's a universal wave function, we won't understand it until we understand where our consciousness inhabits it, and if there's no universal wave function, then we still have to understand why our perceptions are as if there were invariant collapses in one.
 