Question on the probabilistic nature of QM

  • Thread starter Wormaldson
  • Start date
  • Tags
    Nature Qm
In summary, the conversation discusses the concept of "genuine randomness" in quantum mechanics and the difficulty of reconciling it with classical notions of causality and determinism. The speakers also mention hidden-variable theories and Bell's theorem, which rules out local hidden-variable versions of quantum mechanics. They also question the use of terms like "fundamental randomness" or "fundamental determinism" and argue that these are not scientifically testable concepts.
  • #71


Hurkyl said:
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory have objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.

But this is the premise of the entire class of decoherence-based interpretations. Decoherence-upon-measurement even has the exact same mathematical form as collapse-upon-measurement, but without interpreting the probabilities as ignorance of the system.

More ambitious approaches hope for macroscopic decoherence to be an emergent property of unitary evolution. The relative state interpretation (i.e. many worlds) studies unitary evolution directly and its effect on subsystems. Bohmian mechanics likewise keeps the indefiniteness of the wave-function, but shows its (definitely located) particles tend towards the distribution of the wave-function.

Even interpretations that aren't decoherence-based can allow for this indefiniteness. For example, Rovelli's paper on relational quantum mechanics analyzes the Wigner's Friend thought experiment and argues to the effect that Wigner's analysis would be:
My friend has opened the box and remains in an indefinite state, but one entangled with Schrödinger's cat. Their joint state collapsed to a live cat when I asked him about the results.
and Wigner's friend's analysis would be:
I opened the box and saw a live cat! I told Wigner when he asked.
and both analyses would be equally valid. (Actually, I'm not entirely sure whether RQM is decoherence-based or collapse-based or agnostic about it. Really, I didn't like the paper apart from this point of view on the Wigner's Friend thought experiment, and don't remember the rest at all.)
I liken the rejection of indefiniteness to the person who studies Newtonian mechanics but, rather than setting up an inertial reference frame, instead carefully sets up coordinates in which the observer is always at the origin and at rest, and refuses to understand the laws of mechanics presented in any other coordinate system. After all, when he looks around, he always sees things from his perspective; working with a coordinate chart centered elsewhere would be nonphysical and meaningless!
yes, but macroreality is not indefinite.
maybe modal quantum theory with definite values is the answer.
or a nonlinear quantum mechanics destroying the superposition...
 
Last edited:
  • #72


Hurkyl said:
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory have objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.
Hmm, I do not see why there should be any reluctance to accept such indefiniteness.

Hurkyl said:
But this is the premise of the entire class of decoherence-based interpretations. Decoherence-upon-measurement even has the exact same mathematical form as collapse-upon-measurement, but without interpreting the probabilities as ignorance of the system
Well, this part is rather unclear. First, are you redefining "probability" without giving a new definition, or what?
And second, how does it help to resolve the mystery if you don't interpret the probabilities as ignorance of the system?

Hmm, maybe the disagreement is actually about which mystery is to be solved.
For example, as I see it, the mystery is not the indefiniteness itself, but that this indefiniteness carries some amount of amorphous definiteness, and the particular properties of that amorphous definiteness. And I don't see how your arguments get us closer to resolving this mystery.
 
  • #73


Hurkyl said:
I mean reluctance to accept indefiniteness -- that we can do physics well (or even merely adequately) when the states of our physical theory have objects for which allegedly physical questions O=a don't have definite true/false values, and especially when we continue to use such objects after an observation of O.
It's not clear to me what you mean by indefiniteness. If you just mean that exactly what one perceives is relative to one's viewpoint, then ok. But physical science has to do with publicly discernible/countable phenomena. In what way might these phenomena of public historical record be considered indefinite?

Of course, in any probabilistic formulation, prior to observation there are encoded multiple possibilities with specific probability values. But then an experiment is done and a qualitative result is recorded. Is there any sense in which a qualitative result should be considered indefinite?
 
Last edited:
  • #74


mainly because the evolution equation (Schrödinger or whatever) is deterministic, so every experiment, idealized as the evolution equation applied to the system + the instrument, should be deterministic.

one can never know the exact state of the instrument, and that adds an apparent randomness to the final state of the system.

an experiment is an interaction that makes the system leave its actual state (in counterfactual-definiteness language, leave its properties) and forces it to go to some state that is random from the point of view of the scientist, though not from that of a god (or whatever) who knows the exact state of the instrument

These are very good points; I almost agree with you. The only difficulty is that a deterministic equation of evolution is not enough to call the theory deterministic. One also needs a quantity that fully describes the physical state of the system, with no reference to probability.

However, the function Psi describes the state in a probabilistic way. I do not think there is a way to understand Psi as a specification of a physical state. The only thing we know about it is that it gives probabilities.

It seems that if one wants to have a deterministic theory, one also has to introduce additional quantities.
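To illustrate the quoted idea numerically -- a deterministic rule plus ignorance of the instrument's microstate looks random to the scientist -- here is a minimal toy sketch (Python with numpy; the outcome rule and all numbers are invented purely for illustration):

[CODE=python]
import numpy as np

rng = np.random.default_rng(0)

def outcome(system_phase, instrument_microstate):
    # Fully deterministic rule: the result is fixed once both the
    # system and the instrument microstate are specified ("god's view").
    return 1 if np.cos(system_phase + instrument_microstate) > 0 else -1

# The scientist prepares the same system state on every run, but the
# instrument's microstate (zillions of particles) is unknown to her.
system_phase = 0.3
microstates = rng.uniform(0, 2 * np.pi, size=100_000)  # ignorance, not physics

results = np.array([outcome(system_phase, m) for m in microstates])
print("fraction of +1 outcomes:", (results == 1).mean())
# Deterministic underneath, yet the observed statistics look random.
[/CODE]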
 
  • #75


ThomasT said:
It's not clear to me what you mean by indefiniteness.
Consider first ordinary classical mechanics. We have a phase space X representing all possible states of some system under study, and an observable like the "x-coordinate of the third particle" is a function on X: to each element of the phase space it assigns a real number. An ordinary interpretation of classical mechanics would imply that the 'state of reality' corresponds to some point in X, and questions like "What is the x coordinate of the third particle?" make sense as questions about reality and have precise, definite answers.

Unfortunately, due to engineering concerns, we don't have sufficient knowledge and precision to actually answer such questions. So we layer a new theory on top of classical mechanics to try and model our ignorance. And if, looking over all of the questions where we said "Z happens with 75% probability", it turns out we said that 100,000 times and roughly 75,000 of those were correct, we're content with our model -- both of reality and of our ignorance.

Now, consider a variation on classical mechanics where phase space is not X, but instead the class of open subsets of X. In our interpretation, we do not say that the 'state of reality' corresponds to a point of X, but instead to an open subset of X. Questions like "What is the x coordinate of the third particle?" no longer have definite answers, because the state of reality is some open subset U of X, and the value of our observable varies on the domain U.

So now assertions like "the x coordinate of the third particle is 3" still make sense as assertions about reality, but they do not necessarily have definite true/false values. Instead, they can also take on some 'intermediate' values between true and false. It might make more sense to think of it as being partially true and partially false. In fact, while the question above can be definitely false, it can never be definitely true. A question like "the x coordinate of the third particle is between 3 and 4" could be definitely true, though.

Another variation is that rather than phase space being the open subsets of X, it is the probability distributions on X, in the sense of Kolmogorov. Now, the physical quantity "What is the x coordinate of the third particle?" again makes sense. But instead of being a (definite) real number, the answer to this question is a random variable (again, in the sense of Kolmogorov). Again, let me emphasize that, in the interpretation of this variant, the physical state of the system is a probability distribution, and physical quantities are random variables.

These last two variations are what I mean by indefiniteness.

We can, of course, still layer ignorance probabilities on top of this. So we could be expressing our ignorance about the state of reality as a probability distribution across the points of phase space -- that is, a probability distribution of probability distributions.

Mathematically, of course, we can simplify and collapse it all down into one giant probability distribution, and it looks the same as if we just had the original, definite classical mechanics with ignorance probabilities on top.

And "forgetting" about the physical distribution works out well because the dynamics of classical mechanics works "pointwise" on X, without influence from the physical probability distribution, so if we pretend the physical distribution is just ignorance probabilities, we never run into a paradox where the dynamics of the system appear to be influenced by our 'information' about it.
Before continuing further, the reader needs to understand that in the third variant of classical mechanics I mentioned above, the physical probability distribution really is part of the physical state space of the theory. It is not a combined "classical mechanics + ignorance" amalgamation: it is a theory that posits that the state of reality really is a probability distribution across X.
Now we turn to quantum mechanics, and decoherence in particular. The promising lead of decoherence is that if we apply unitary evolution to a large system and restrict our attention to the behavior of a subsystem, the state of the subsystem decoheres into something that can FAPP be described as a probability distribution across outcomes.

But the important thing to notice is that this probability distribution is an aspect of the physical state space. It is not ignorance probability; it is part of the physical state of the system as posited by "Hilbert space + unitary evolution" (or a similar notion).

But unlike the classical case, the dynamics do depend on the full state of the system. And we really do observe this in physical experiments.

The classic "Alice and Bob have an entangled pair of qubits" thought experiment, for example. Because of the entanglement, Alice's particle has decohered into a fully mixed state: mixture of 50% spin up and 50% spin down around whichever axis she chooses to measure. Any experiment performed entirely in her laboratory will respect this mixture. But when Alice and Bob compare their measurements, the full state of the system reasserts itself in showing a correlation between their measurements.In the third version of classical mechanics I described above, we can layer ignorance probabilities on top, then forget the difference between physical and ignorance probability -- in other words, replacing the decohered state with a collapsed state + ignorance about which state it collapsed to.

But forgetting the difference fails badly for quantum mechanics, because the dynamics do depend on the full state, and so if we're being forgetful, we do run into all sorts of issues where the physical evolution of a system appears to depend on our knowledge of the system.
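For concreteness, here is a short numpy check of the Alice-and-Bob example (a sketch under standard conventions, qubit order Alice then Bob; not tied to any particular interpretation): Alice's reduced state is the fully mixed one, yet the joint state still carries the correlations that reassert themselves on comparison.

[CODE=python]
import numpy as np

# Bell state |phi+> = (|00> + |11>)/sqrt(2); qubit order: Alice, Bob.
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi.conj())                  # joint (pure) state

# Alice's reduced state: partial trace over Bob's qubit.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_A.real)        # [[0.5, 0], [0, 0.5]] -- fully mixed

# Any experiment confined to Alice's lab sees only rho_A. But the joint
# state is not a product of mixtures: P(Alice=0, Bob=0) is 0.5, not 0.25.
print("P(00) =", rho[0, 0].real)
[/CODE]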
 
  • #76


These are very good points; I almost agree with you. The only difficulty is that a deterministic equation of evolution is not enough to call the theory deterministic. One also needs a quantity that fully describes the physical state of the system, with no reference to probability.

However, the function Psi describes the state in a probabilistic way. I do not think there is a way to understand Psi as a specification of a physical state. The only thing we know about it is that it gives probabilities.

It seems that if one wants to have a deterministic theory, one also has to introduce additional quantities.

Thanks for replying! I was really looking to see if I was totally wrong or if you see some light. Let me tell you how I see it. For me, Psi is something that is useful to label states. Then, by knowing the symmetries that the states should follow under space-time rotations, you get the Hilbert-space operator representations of P, X, H and so on (and as a consequence their eigenstates).
Up to here there are no probabilities, no Born Rule, nothing. Then we can interact with a system by the use of an instrument in a way that perhaps we can deterministically make it leave, for example, its P eigenstate and go to a predefined, for example, X eigenstate. Again, there are still no probabilities.
Finally we can play a game. We are going to arrange a complex interaction with an instrument that makes the system go to an eigenstate of X, an instrument that has zillions of particles whose initial state we don't know. There is randomness, and there are probabilities, from the point of view of the scientist (but if there were a god, he would know the exact state of the instrument and he would know where the system will finish). This is called an "experiment" because it looks as if the scientist is discovering a property of the system (while he is not; the system does not have this "X" property, he is the one who makes it go to an eigenstate of it). Gleason's theorem (or Saunders') assures that if there are going to be probabilities, and these probabilities depend only on the state of the system (here it is like this from the point of view of the scientist), then they should be calculated by the Born Rule.
So, in this interpretation, the Born Rule is not an axiom, just a theorem.
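The endpoint of that argument is easy to exhibit numerically (Python with numpy; the state and observable below are random stand-ins, and this only displays the Born assignment, it does not prove Gleason's theorem): for any Hermitian observable, p_i = |<b_i|psi>|^2 depends only on the state and automatically sums to 1.

[CODE=python]
import numpy as np

rng = np.random.default_rng(2)

# A random Hermitian "observable" and a normalized state |psi>.
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = (A + A.conj().T) / 2
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)

# Eigenstates |b_i> of the observable; the Born rule assigns
# p_i = |<b_i|psi>|^2, a probability measure built from the state alone.
outcomes, eigvecs = np.linalg.eigh(A)
p = np.abs(eigvecs.conj().T @ psi) ** 2

print("possible outcomes:", outcomes)
print("Born probabilities:", p, " sum:", p.sum())  # sum = 1
[/CODE]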

Finally, I just want to say that I am obviously not sure of what I am saying. It is just the way I like things to be in order to make sense to me. I've been reading this stuff for the last 4 years and only now have I found an interpretation that suits my intuition! Again, I have not found it in Wikipedia or anywhere else, so I really appreciate your answers, views, etc.

So... What do you think?

Thanks!
 
  • #77


I liked post #67 regarding the OP's question...

"... our great models have been like shadows, that fit some projection of reality but are later found to not be the reality..."

we ARE dealing with a mathematical model. It's as far as we have gotten to date.

The original question involved:

...that the phenomenon in question, whatever it may be, is genuinely random. That is to say, the exact, actual result has no identifiable cause...

I just posted these explanations on uncertainty in another thread and they may afford the OP a perspective regarding the representation [model] of what 'is actually happening' regarding 'probability'. These are a bit more basic than the Hilbert spaces and associated mathematical representations of the last half dozen or so posts.

Here are some explanations I saved [and edited] from a very, very long discussion in these forums on Heisenberg uncertainty:

Course Lecture Notes, Dr. Donald Luttermoser, East Tennessee State University:
The HUP strikes at the heart of classical physics: the trajectory. Obviously, if we cannot know the position and momentum of a particle at t[0], we cannot specify the initial conditions of the particle and hence cannot calculate the trajectory... Due to quantum mechanics' probabilistic nature, only statistical information about aggregates of identical systems can be obtained. QM can tell us nothing about the behavior of individual systems.

What the quote means: unlike classical physics, quantum physics means future predictions of state [like position, momentum] are NOT precise.

But why does this situation pop up in quantum mechanics? Because of our mathematical representations.

[Scattering, mentioned below, happens to be a convenient example of our limited ability to make measurements of arbitrary precision.] The basic ideas are these: the HUP [Heisenberg uncertainty principle] is a result of nature, not of experiment-based uncertainties. From the axioms of QM and the math that is used to build observables and states of systems, it turns out that position and velocity (and also momentum) are examples of what are called "canonical conjugates" [a function and its Fourier transform]; they cannot both be "sharply localized". That is, they cannot both be measured to an arbitrary level of precision. It is a mathematical fact that a function and its Fourier transform cannot both be made sharp.

The wave function describes not a single scattering particle but an ensemble of similarly accelerated particles. Physical systems [like particles] which have been subjected to the same state preparation will be similar in some of their properties but not all of them. The physical implication of the uncertainty principle is that NO STATE PREPARATION PROCEDURE IS POSSIBLE WHICH WOULD YIELD AN ENSEMBLE OF SYSTEMS IDENTICAL IN ALL OF THEIR OBSERVABLE PROPERTIES.

A few cornerstone mathematical underpinnings:
The wave function describes an ensemble of similarly prepared particles rather than a single scattering particle. A wave function with a well-defined wavelength must have a large spatial extension, and conversely a wave function which is localized in a small region of space must be a Fourier synthesis of components with a wide range of wavelengths. We cannot measure both to an arbitrary level of precision: a function and its Fourier transform cannot both be made sharp. This is purely a mathematical fact and so has nothing to do with our ability to do experiments or our present-day technology. As long as QM is based on the present mathematical theory, an arbitrary level of precision cannot be achieved.

The HUP isn't about the knowledge of the conjugate observables of a single particle in a single measurement. Rather, the uncertainty theorem is about the statistical distribution of the results of future measurements. The theorem doesn't say anything about whether you can measure both at the same time; that is a separate issue. A single scattering experiment consists of shooting a single particle at a target and measuring its angle of scatter. Quantum theory does not deal with such an experiment but rather with the statistical distribution of the results of an ensemble of similar experiments. Quantum theory predicts the statistical frequencies of the various angles through which an ensemble of similarly prepared particles may be scattered.
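The function-vs-Fourier-transform fact is easy to check numerically (Python with numpy; the grid sizes are arbitrary choices of mine): squeezing a Gaussian packet in x inflates its spread in k, and the width product stays pinned at the Fourier bound of 1/2, never below it.

[CODE=python]
import numpy as np

x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]

for sigma_x in (0.5, 1.0, 2.0):
    # Gaussian packet with position spread sigma_x.
    psi = np.exp(-x**2 / (4 * sigma_x**2))
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)          # normalize

    # Discrete Fourier transform -> momentum-space amplitude.
    phi = np.fft.fftshift(np.fft.fft(psi)) * dx
    k = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi
    pk = np.abs(phi)**2
    pk /= np.sum(pk) * (k[1] - k[0])

    sigma_k = np.sqrt(np.sum(k**2 * pk) * (k[1] - k[0]))  # <k> = 0 here
    print(f"sigma_x = {sigma_x:4.1f}   sigma_k = {sigma_k:.3f}   "
          f"product = {sigma_x * sigma_k:.3f}")           # ~0.5 every time
[/CODE]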

What we can't do is to prepare a state such that we would be able to make an accurate prediction about what the result of a position measurement would be, and an accurate prediction about what the result of a momentum measurement would be.

 
Last edited:
  • #78


bhobba said:
Ken is a wonder all right - his clarity of thought is awe inspiring and an excellent counterpoint to guys like me that side with Penrose and believe the math is the reality in a very literal sense.
Well, thank you bhobba and ThomasT, and I certainly agree with the implication that the value is not as much in the answer each of us arrives at, as it is in the tensions between the possible answers, around just what is this amazing relationship between us, our environment, and our mathematics that tries to connect the two. I suspect answers like this will continue to be moving targets, as much as what physics is will itself continue to be a moving target. It is not just physics theories that change, but physics itself. Although we may fall into thinking that the modern version is what physics "is", that doesn't seem to do justice to the remarkable evolutionary resilience of this animal. To me, its most remarkable attribute is the way the accuracy and generality of its predictions converges steadily, yet the ingenious ontologies it invokes to achieve this convergence of accuracy do not converge at all.
 
Last edited:
  • #79


Sorry for warming up this thread again.

In the context of this thread's question, I have been asked multiple times here what my reasons are for making certain statements about decoherence, many worlds, or generally how quantum theory should be interpreted. I found that it is hard to answer all these questions here, but they motivated me enough to start a blog where I can describe what I think is the best approach to the problem of interpretation: namely, replacing interpretation with scientific deduction that describes observation as an emergent phenomenon. And no, this is not many worlds. The content is partly based on a research paper that is currently in the publication pipeline.

Since the resulting discussion could become speculative at some points and I want to respect the forum rules, I have moved the blog elsewhere. If you have comments and would like to discuss you are invited to do it there. In any case I am looking forward to your feedback.

http://aquantumoftheory.wordpress.com

Cheers,

Jazz
 
  • #80


nice to know.
 
  • #81


Ken G said:
... which is exactly the reason that people thought the Newtonian paradigm was correct long before we discovered quantum mechanics...

But don't overlook the fact that quantum mechanics, and any future theory of motion we devise, needs to reduce to Newtonian mechanics when applied to the domain in which Newtonian physics was established (QM of course does).

My point is that any future development we might discover that supersedes QM will itself need to reduce to a purely random formulation when addressing the domain of particle interactions and states in which we know QM randomness to be accurate. I have a hard time imagining how any deterministic formulation could, in any reduced case, produce the purely random results needed in the quantum domain (it's a contradiction in terms, in fact). It therefore seems to me a fair conclusion that reality is, at bottom -- fundamentally, if you prefer -- probabilistic. And I think we must concede that even while admitting QM is unlikely to be the final say in our understanding of particles.

(Sorry, I know I am quoting a post from early in this thread, but it caught my eye)
 
  • #82


Jazzdude said:
Since the resulting discussion could become speculative at some points and I want to respect the forum rules, I have moved the blog elsewhere. If you have comments and would like to discuss you are invited to do it there. In any case I am looking forward to your feedback.

Hi Jazz

Interesting stuff.

I have my own view based on the primacy of observables and their basis invariance - I must get around to writing it up one day. Although it's bog-standard QM, it is an approach I haven't seen anyone else use. It's a more sophisticated version of Victor Stenger's view:
http://www.colorado.edu/philosophy/vstenger/Nothing/SuperPos.htm [Broken]

Thanks
Bill
 
Last edited by a moderator:
  • #83


Zmunkz said:
It therefore seems to me a fair conclusion that reality is, at bottom -- fundamentally, if you prefer -- probabilistic. And I think we must concede that even while admitting QM is unlikely to be the final say in our understanding of particles.

I too think that at rock bottom reality is fundamentally probabilistic. QM may indeed be superseded, but I am not so sure that such is likely.

Thanks
Bill
 
  • #84


I would argue that reality cannot be probabilistic at rock bottom, but neither does it appear to be deterministic. The error is in thinking it has to be one or the other-- that's not the case, our models have to be one or the other, reality can just be whatever it is.

The reason it can't be probabilistic is that probabilistic theories are, almost by definition, not rock-bottom theories (probabilities reflect some process or information that is omitted on purpose, and probabilities are generated as placeholders for what is omitted -- that's just what they are whenever we understand what we are actually doing). Hence, probability treatments are theories of what you are not treating, much more than they are theories of what you are treating. But if one rules out probabilistic theories from the status of "rock bottom" descriptions, one might imagine that all that is left is a deterministic description, but that's even worse -- there's no evidence that any physical description is exactly deterministic; determinism was always an effective concept in every application where it was ever used in practice. So probabilistic descriptions always have an "untreated underbelly", if you like, whereas deterministic descriptions are always effective at only an approximate level in all applications where they are used.

These are just factual statements about every example we can actually point at where we know what is going on, so why should we ever think they will not be true of some "final theory" that treats the "rock bottom" of reality? A much more natural conclusion seems to be that Bohr was right-- physics is what we can say about nature, and never was, nor ever should have been, a "rock bottom" description. We just don't get such a thing, we get determinations of probabilities, and that is all physics is intended to do.

As for "rock bottom" reality outside of what physics is intended to do, that is a fundamentally imprecise concept at best. Physics is all about creating a language that let's us talk about reality, so there is no such thing as a reality outside of physics that we could ever try to talk about intelligibly in a physics forum. Terms like "probabilistic" or "deterministic" are mathematical physics terms-- they have no meaning outside that context.
 
Last edited:
  • #85


Ken G said:
...The reason it can't be probabilistic is that probabilistic theories are, almost by definition, not rock-bottom theories

You've outlined an interesting way of looking at this. Could you possibly elaborate on the above quotation? I'm trying to understand why, by definition, probabilistic theories cannot be foundational. I can see that in the macro sense (something like flipping a coin, for instance) probabilities are stand-ins for actual non-probabilistic phenomena... but I can't quite convince myself this analogy carries to everything. Could you maybe add a little on this?

Ken G said:
A much more natural conclusion seems to be that Bohr was right-- physics is what we can say about nature, and never was, nor ever should have been, a "rock bottom" description.

...

Physics is all about creating a language that lets us talk about reality, so there is no such thing as a reality outside of physics that we could ever try to talk about intelligibly in a physics forum. Terms like "probabilistic" or "deterministic" are mathematical physics terms -- they have no meaning outside that context.

This is the classic realist vs. instrumentalist debate. Looks like you fall on the instrumentalist side -- I am not sure if I can meet you there, although you make the case well.
 
  • #86


Ken G said:
The reason it can't be probabilistic is that probabilistic theories are, almost by definition, not rock-bottom theories (probabilities reflect some process or information that is omitted on purpose, and probabilities are generated as placeholders for what is omitted -- that's just what they are whenever we understand what we are actually doing).

I wish I could up-vote. This is precisely why I am disturbed by a probabilistic end. To me it means there is a black curtain. Some might say then "you're assuming there is something going on behind the curtain, and that's hidden variables". I say "no": there doesn't even have to be something deterministic going on behind the curtain, but there is a curtain nonetheless, and when physics is revealed it is always random. To be told that all we will ever get to see is what the curtain reveals is disturbing.
 
  • #87


jfy4 said:
To me it means there is a black curtain.

I must say I can't follow that one. To me probabilities are simply the result of stuff like Gleason's theorem, which shows determinism is not compatible with the definition of an observable. There are outs, but to me they are ugly, such as contextuality - of course, what is ugly is in the eye of the beholder.

And observables to me are very intuitive, since they are the most reasonable way to ensure basis invariance. Suppose there is a system and an observational apparatus with n outcomes y_i. Write them out as a vector: sum_i y_i |b_i>. The problem is that the y_i are not invariant under a change of basis, and since the basis is an entirely arbitrary man-made thing, the outcomes should be expressed in such a way as to be invariant. By changing the |b_i> to |b_i><b_i| we have sum_i y_i |b_i><b_i|, a Hermitian operator whose eigenvalues are the possible outcomes of the measurement, and which is basis invariant.
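A quick numpy illustration of this construction (the outcomes y_i below are made up for the example): build A = sum_i y_i |b_i><b_i|, confirm it is Hermitian, and note that its eigenvalues -- the measurement outcomes -- survive any change of basis.

[CODE=python]
import numpy as np

rng = np.random.default_rng(3)

# n = 3 possible outcomes y_i attached to an orthonormal basis |b_i>
# (the columns of the random unitary B).
y = np.array([1.0, -1.0, 2.5])
B = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))[0]

# A = sum_i y_i |b_i><b_i| -- Hermitian by construction.
A = sum(y[i] * np.outer(B[:, i], B[:, i].conj()) for i in range(3))
assert np.allclose(A, A.conj().T)

print(np.linalg.eigvalsh(A))                  # [-1.   1.   2.5]

# A change of basis sends A -> U A U^dagger; the spectrum is unchanged.
U = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))[0]
print(np.linalg.eigvalsh(U @ A @ U.conj().T))  # same outcomes
[/CODE]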

Thanks
Bill
 
  • #88


Zmunkz said:
You've outlined an interesting way of looking at this. Could you possibly elaborate on the above quotation? I'm trying to understand why, by definition, probabilistic theories cannot be foundational. I can see that in the macro sense (something like flipping a coin, for instance) probabilities are stand-ins for actual non-probabilistic phenomena... but I can't quite convince myself this analogy carries to everything.
It's not so much that I'm claiming it has to be true for everything; rather, I'm saying it is true every time we understand why our theory is probabilistic. So we can classify all our probabilistic theories into two bins -- one includes all the ones where we understand why a probabilistic theory works, and the other includes all the ones we don't understand. In that first bin, in every case the probabilistic theory works because it is a stand-in for all the processes the theory is not explicitly treating (flipping coins, shuffling cards, all of statistical mechanics and thermodynamics, etc.). In the second bin is just one thing: quantum mechanics.

So now we face two choices -- either there really are two such bins, and one of them holds "the rock bottom description" while all the rest hold every other type of probability description we've ever seen, or else there are not two such fundamentally different bins, there is just what we understand and what we do not. I can't say the latter interpretation is unequivocally superior, but when framed in these terms, I think it places that interpretation into a kind of proper perspective.
This is the classic realist vs. instrumentalist debate. Looks like you fall on the instrumentalist side -- I am not sure if I can meet you there, although you make the case well.
Yes, I agree this is well-worn territory. In a sense I am siding with Einstein that "the Old One does not roll dice," but I am differing from him in concluding, therefore, that straightforward realism is the only alternative. In fact, what most people call realism, I call unrealism-- it requires a dose of denial to hold that reality uniformly conforms to our macroscopic impressions of it, when the microscopic evidence is quite clear that it does not. So if there are no dice to roll, and if there is also no precise reality where everything has a position and a momentum and the future is entirely determined by the past, then what is left? What is left is the actual nature of reality. That's realism, if you ask me.
 
  • #89


jfy4 said:
To me it means there is a black curtain. Some might say then "you're assuming there is something going on behind the curtain, and that's hidden variables". I say "no": there doesn't even have to be something deterministic going on behind the curtain, but there is a curtain nonetheless, and when physics is revealed it is always random. To be told that all we will ever get to see is what the curtain reveals is disturbing.
I agree with you about the curtain, but I find the implications less disturbing. It reminds me of the way Hoyle found the Big Bang to be disturbing-- he could not fathom an origin to the universe, anything but a steady state was disturbing. But I always wondered, why wasn't a steady state disturbing too, because of how it invokes a concept of a "forever" of events? We invoke "forever" to avoid a "start", or we invoke a "start" to avoid a "forever", yet which is less disturbing? I ask, why are we disturbed by mystery?

Yes, the goal of science is to penetrate the shroud of mystery, but it's not to remove the shroud, because behind one shroud of mystery is always another. We are not trying to pull down that "curtain" you speak of, because there will always be a curtain, and there is supposed to be a curtain-- our goal is to get past as many curtains as we can. That may sound disturbing, but isn't it more disturbing to imagine an end to the curtains?
 
  • #90


bhobba said:
I must say I can't follow that one. To me probabilities are simply the result of stuff like Gleason's theorem, which shows determinism is not compatible with the definition of an observable.
Gleason's theorem is a theorem about the theories of physics that can match observations, yet the "curtain" is an image about the connection between theories and reality. I think that is what you are not following there-- you are not distinguishing our theories from the way things "really work." I realize this is because of your rationalistic bent, you imagine that things really work according to some theory, and our goal is to either find that theory, or at least get as close as we can. That's a fine choice to make, rationalists abound who make that choice, and some get Nobel prizes pursuing it. But it's why you won't understand non-rationalists who don't think the world actually follows theories, because theories are in our brains, and the world is not beholden to our brains, only our language about the world is. The world is doing something that closely resembles following theories, but every time we think we have the theory it follows, we discover not just that the theory has its domain of applicability, but much more: we discover that the ontological constructs of the theory are completely different in some better theory. Why would we imagine that will ever not be true?
There are outs but to me they are ugly such as contextuality - of course what is ugly is in the eye of the beholder.
Contextuality is like determinism or probability, it is an aspect of a theory. We must never mistake the attributes of our theories for attributes of reality, or else we fall into the same trap that physicists have fallen for a half dozen times in the history of this science. When do we learn?
And observables to me are very intuitive, since they are the most reasonable way to ensure basis invariance. Suppose there is a system and an observational apparatus with n outcomes y_i. Write them out as a vector: sum_i y_i |b_i>. The problem is that the y_i are not invariant under a change of basis, and since the basis is an entirely arbitrary man-made thing, the outcomes should be expressed in such a way as to be invariant. By changing the |b_i> to |b_i><b_i| we have sum_i y_i |b_i><b_i|, a Hermitian operator whose eigenvalues are the possible outcomes of the measurement, and which is basis invariant.
I think that's a lovely way to explain why observables are associated with operators, which is probably the most important thing one needs to understand to "get" quantum mechanics (that and why the basis transformations need to allow complex inner products, and I know you have some nice insights into that issue as well). Also, we can agree that the job of a physics theory is to connect reality to the things we can observe about it. But none of this tells us why a description of reality that connects our observables with mathematical structures that predict those observables has to be what reality actually is. There is a weird kind of "sitting on the fence" between objectivism and subjectivism that is required to hold that stance -- you invoke subjectivism when you build the theory from the need to give invariant observables (rather than from some more fundamental constraint on the quantum state itself), yet ally with objectivism when you promote the resulting quantum theory to the level of a description of reality. If you instead simply say it is a description of how we observe reality, hence how we interact with reality, hence how we give language to our interaction with reality, then you arrive finally at Bohr's insight that physics is what we can say about reality.
 
  • #91


the problem with the probability in quantum physics is that it actually is not "rock bottom". If it were, it would not cause so many troubles.

The problem is that the equations of motion of any quantum theory provide a totally deterministic and even local theory; in a sense this part is very classical. But on top of that comes the probabilistic (and non-local) part when one starts to measure. Thus the probability arises somewhere inside a deterministic-theory sandwich, between the micro level (the QM equations of motion) and the macro level (classical physics). Because the theory lacks a well-defined mechanism specifying when exactly the collapse happens, it is very hard to tell the probabilistic and the deterministic elements apart (you don't know when exactly the QM equations of motion become invalid and you have to apply the collapse instead).
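A toy version of that sandwich (Python with numpy and scipy; the Hamiltonian and time are arbitrary picks of mine): the unitary layer is perfectly repeatable, and randomness enters only where we decide, by hand, to stop evolving and apply the Born rule.

[CODE=python]
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)

# Deterministic layer: Schroedinger evolution |psi> -> U|psi>.
H = np.array([[1.0, 0.5], [0.5, -1.0]])        # a Hermitian Hamiltonian
U = expm(-1j * H * 0.7)                        # unitary propagator, t = 0.7
psi = U @ np.array([1.0, 0.0], dtype=complex)  # same input -> same output

# Probabilistic layer: measurement in the computational basis.
p = np.abs(psi) ** 2                           # Born rule
clicks = rng.choice([0, 1], size=10_000, p=p)  # collapse, applied by hand
print("P(0) predicted:", p[0].round(3), " observed:", (clicks == 0).mean())
# Nothing in U says *when* to stop evolving and start sampling -- that
# ill-defined boundary is exactly the point above.
[/CODE]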
 
  • #92


Ken G said:
Gleason's theorem is a theorem about the theories of physics that can match observations, yet the "curtain" is an image about the connection between theories and reality. I think that is what you are not following there-- you are not distinguishing our theories from the way things "really work." I realize this is because of your rationalistic bent, you imagine that things really work according to some theory, and our goal is to either find that theory, or at least get as close as we can. That's a fine choice to make, rationalists abound who make that choice, and some get Nobel prizes pursuing it. But it's why you won't understand non-rationalists who don't think the world actually follows theories, because theories are in our brains, and the world is not beholden to our brains, only our language about the world is. The world is doing something that closely resembles following theories, but every time we think we have the theory it follows, we discover not just that the theory has its domain of applicability, but much more: we discover that the ontological constructs of the theory are completely different in some better theory. Why would we imagine that will ever not be true?

Hi Ken

I have said it before and I will say it again. You are a wonder. That's exactly it, and exactly why I don't get it.

Reading you is like reading Wittgenstein - at first you say no, he can't be right, but you think about it a bit more and you realize he has a point. You may still not agree with him (and I don't), but he has a point.

Thanks
Bill
 
  • #93


Thanks bhobba, as you know my goal is not to change your mind, because your view is as valid as anyone else's, but merely to clarify the alternatives.
 
  • #94


Ken G said:
Yes, the goal of science is to penetrate the shroud of mystery, but it's not to remove the shroud, because behind one shroud of mystery is always another. We are not trying to pull down that "curtain" you speak of, because there will always be a curtain, and there is supposed to be a curtain-- our goal is to get past as many curtains as we can. That may sound disturbing, but isn't it more disturbing to imagine an end to the curtains?

I would love to pull down the curtain, only to find another, and if you got the opposite impression it wasn't my aim. But it's disturbing to me that this may be the last curtain.
 
  • #95


Ah, I see: you are not worried that we will pull this curtain down to find none behind it; you are worried we'll never pull this one down. Who knows, maybe we will, but I think it might take a better theory about how our minds process sensory information. If there's a universal wave function, we won't understand it until we understand where our consciousness inhabits it, and if there's no universal wave function, then we still have to understand why our perceptions are as if there were invariant collapses in one.
 
Last edited:

1. What is the probabilistic nature of quantum mechanics?

The probabilistic nature of quantum mechanics refers to the fact that at the quantum level, particles do not have definite properties such as position or momentum. Instead, these properties are described by a probability distribution, and the outcome of a measurement is not certain but rather determined by chance.

2. Why is quantum mechanics considered to be probabilistic?

Quantum mechanics is considered to be probabilistic because it is based on the concept of wave-particle duality, which states that particles can behave as both waves and particles. This means that the position and momentum of a particle cannot be precisely determined at the same time, leading to the probabilistic nature of the theory.

3. How does the probabilistic nature of quantum mechanics differ from classical mechanics?

In classical mechanics, the behavior of particles is deterministic, meaning that the outcome of a measurement can be predicted with certainty. However, in quantum mechanics, the behavior of particles is described by a wave function that gives the probability of finding a particle in a particular state. This fundamental difference is what makes quantum mechanics probabilistic.

4. What is the role of uncertainty in the probabilistic nature of quantum mechanics?

Uncertainty is a fundamental principle of quantum mechanics and is related to the probabilistic nature of the theory. The Heisenberg uncertainty principle states that it is impossible to know both the position and momentum of a particle with absolute certainty. This uncertainty is a result of the wave-like nature of particles at the quantum level.

5. How does the probabilistic nature of quantum mechanics impact our understanding of reality?

The probabilistic nature of quantum mechanics challenges our traditional understanding of reality, as it suggests that at the quantum level, particles do not have well-defined properties until they are measured. Together with entanglement (which Einstein famously derided as "spooky action at a distance"), this has led to many philosophical debates about the nature of reality and the role of observation in shaping it.
