Do weak measurements prove randomness is not inherent?

San K
Weak measurements show that you can get "partial/probabilistic" which-way info while still getting a "partial" interference pattern.

Deduction 1:

Does this mean that weak measurements prove that we can control the degree of randomness?

(either for individual photons or for the average over thousands of photons)

Deduction 2:

If we can control the degree of randomness, does that mean randomness is not inherent, but simply a way to describe forces/phenomena/dimensions that we are unaware of, and which we therefore describe/model stochastically (for example, as we do with Brownian motion)?

Deduction 1, in my opinion, requires a smaller "leap of faith", as it follows fairly logically.
 
No. The "randomness" you are referring to is a fundamental limit imposed by the HUP. That is why the quality of the interference pattern degrades as the "strength" of the weak measurement is increased.
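One way to make that tradeoff quantitative is the duality relation for a two-path interferometer: fringe visibility V and which-way distinguishability D obey V² + D² ≤ 1, with equality for pure marker states. Below is a minimal Python sketch of an idealized model in which the overlap of the marker states (set by the measurement strength) controls both quantities; the numbers are illustrative choices, not data from any specific experiment:

```python
import math

def fringe_visibility(overlap):
    """Fringe visibility V equals the overlap |<m_A|m_B>| of the
    which-way marker states left behind by the measurement."""
    return abs(overlap)

def distinguishability(overlap):
    """Which-way knowledge D; for pure marker states D = sqrt(1 - |c|^2)."""
    return math.sqrt(1.0 - abs(overlap) ** 2)

def intensity(phi, overlap):
    """Two-path intensity at phase phi: I = (1/2)(1 + V cos phi)."""
    return 0.5 * (1.0 + fringe_visibility(overlap) * math.cos(phi))

for c in (1.0, 0.8, 0.0):  # no marking, weak marking, full which-way info
    V, D = fringe_visibility(c), distinguishability(c)
    assert abs(V ** 2 + D ** 2 - 1.0) < 1e-9  # duality relation, pure case
    print(f"overlap={c:.1f}  visibility={V:.2f}  distinguishability={D:.2f}")
```

A stronger measurement means a smaller marker-state overlap, hence larger D and smaller V, which is exactly the degradation of the interference pattern described above.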
 
I doubt randomness actually exists in nature.
It's more of a side product of the HUP, it's not ACTUAL randomness, although some crazy people do believe true randomness exists, actually quite a lot of otherwise smart people do so.
 
Fyzix said:
I doubt randomness actually exists in nature.
It's more of a side product of the HUP, it's not ACTUAL randomness, although some crazy people do believe true randomness exists, actually quite a lot of otherwise smart people do so.

I don't understand what you mean ... what is "ACTUAL randomness"? Are you saying you agree with the Bohmian hypothesis that everything is deterministic, but that we can never know the initial conditions precisely enough to make predictions? I think that is the Bohmian view on the HUP ... i.e. that it restricts how well we can know the initial conditions for any quantum system, so the results of experiments appear probabilistic.
 
SpectraCat said:
No. The "randomness" you are referring to is a fundamental limit imposed by the HUP. That is why the quality of the interference pattern degrades as the "strength" of the weak measurement is increased.

I see your point.

Question: when we try to get which-way info (we cause decoherence, we create a phase difference), do we increase or decrease randomness, or does it remain the same?
 
SpectraCat said:
I don't understand what you mean ... what is "ACTUAL randomness"? Are you saying you agree with the Bohmian hypothesis that everything is deterministic, but that we can never know the initial conditions precisely enough to make predictions? I think that is the Bohmian view on the HUP ... i.e. that it restricts how well we can know the initial conditions for any quantum system, so the results of experiments appear probabilistic.

Yes, I am as certain as a human can be that the universe is 100% deterministic, from big bang to now.
Every single particle is deterministic.
I just don't see how reality could be any other way.
If randomness were truly part of quantum theory, I doubt we could even get probabilities out of it.

Think about it: if something were RANDOM, how could we ever predict ANYTHING, even something's probability?

I'm not sure whether Bohm is correct, or some other hidden-variable interpretation (Gerard 't Hooft is working on this), or perhaps some brand-new physics will be discovered that will shed light on the issue, but what I am sure of is that realism and determinism/causality are going to be part of whatever the truth is.
 
Fyzix said:
Yes, I am as certain as a human can be that the universe is 100% deterministic, from big bang to now.
Every single particle is deterministic.
I just don't see how reality could be any other way...

Nice that you are so certain when there is no evidence to support your view whatsoever. And I do mean none.

Going back to Hume's work on causality, folks have continually assumed that which they are trying to prove. He did that explicitly. And you too, for example:

I wonder: was the Beatles' work inevitable? Because I can't imagine a world without their music. QED! :smile:
 
DrChinese said:
Going back to Hume's work on causality, folks have continually assumed that which they are trying to prove. He did that explicitly.
Are you talking about David Hume? If you do, then let me quote his famous words very applicable here:
"If we take in our hand any volume; of divinity or school metaphysics, for instance; let us ask, Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning, concerning matter of fact and existence? No. Commit it then to flames: for it can contain nothing but sophistry and illusion."
 
Demystifier said:
Are you talking about David Hume? If you do, then let me quote his famous words very applicable here:
"If we take in our hand any volume; of divinity or school metaphysics, for instance; let us ask, Does it contain any abstract reasoning concerning quantity or number? No. Does it contain any experimental reasoning, concerning matter of fact and existence? No. Commit it then to flames: for it can contain nothing but sophistry and illusion."

"THOUGH there be no such thing as Chance in the world; our ignorance of the real cause of any event has the same influence on the understanding, and begets a like species of belief or opinion."

-Hume
 
  • #10
Fyzix said:
Yes, I am as certain as a human can be that the universe is 100% deterministic, from big bang to now.
Every single particle is deterministic.
I just don't see how reality could be any other way.

I often wonder why people believe the universe is deterministic. Not only, as DrChinese says, is there no evidence for this, but our everyday experience doesn't support this view. It certainly feels like I have free will - I can choose whether to type an A or a B as the next character - here goes - B. Do you really believe that my decision to type a B was pre-ordained at the moment of the Big Bang? If there were objective evidence that the universe were deterministic, then I could understand believing this in spite of the evidence of my senses, but with the evidence of my senses telling me that the universe is not pre-determined, why would I choose to believe that it is?
 
  • #11
What's more, I would add that both "random" and "deterministic" are always attributes of a theory or model. Science has no way to even discuss whether or not these are attributes of the real world, and I would add that a seasoned view of what science actually does warns us against extrapolating the nature of theories to the nature of reality. If the history of science has taught us nothing else, let it teach us that.
 
  • #12
DrChinese said:
"THOUGH there be no such thing as Chance in the world; our ignorance of the real cause of any event has the same influence on the understanding, and begets a like species of belief or opinion."

-Hume
Though IIRC, he wasn't out to prove determinism there (the futility of such an attempt, I imagine, would be something Hume of all people would have been most acutely aware of), he merely assumed it, the alternative making no sense to him. And it is a difficult thing to wrap one's head around, to the point of appearing almost self-contradictory.

One might, for instance, quite reasonably assume that everything that happens has a way by which it does so, some mechanism through which it occurs -- but if that's the case, then randomness seems nonsense: for any indeterminate choice between alternatives A and B, either some mechanism chooses A over B (say) -- but then, obviously, that mechanism provides for determinism; or, neither A nor B gets chosen -- but then, neither happens. So (if indeterminism is real) one must renounce the assumption that everything that happens has a mechanism through which it happens, so that without a choice being made, A (or B) is chosen.

Smarter people than me have asserted that such a thing is perfectly possible, that things can 'just happen' without sufficient cause, but to me, that's kinda like an underdetermined system of equations having a unique, and right, solution -- the information just isn't there -- and furthermore, if things just happen, then why bother with this whole physics stuff at all? If ultimately everything turns out to be the way it is because that's how it happens to be, then it seems we should cut to the chase and throw in the towel.

But luckily, quantum mechanics is perfectly compatible with determinism -- so one might argue that parsimony tilts the scales heavily in favour of such an interpretation (since there is otherwise no indication that indeterminism is possible at all).

And, so as not to be completely off-topic, there is of course no experiment that can decide between different interpretations of QM -- otherwise, they wouldn't be different interpretations, but rather, different theories.
 
  • #13
phyzguy said:
I often wonder why people believe the universe is deterministic. Not only, as DrChinese says, is there no evidence for this, but our everyday experience doesn't support this view. It certainly feels like I have free will - I can choose whether to type an A or a B as the next character - here goes - B. Do you really believe that my decision to type a B was pre-ordained at the moment of the Big Bang? If there were objective evidence that the universe were deterministic, then I could understand believing this in spite of the evidence of my senses, but with the evidence of my senses telling me that the universe is not pre-determined, why would I choose to believe that it is?
The argument for (human) free will does not require inherent/quantum randomness.

Though I believe in both: human free will and inherent/quantum-level randomness. But neither is a precondition for supporting the other.
 
  • #14
Lately I've been enjoying the deterministic perspective (probably, it seems more 'relaxing' ;) ) but I think the concept of 'true randomness' sits just fine with many people. You could claim that the universe is a balance between a Pattern that is ordered/understandable/predictable and an element of Chaos that is fundamentally 'irrational'/unpredictable/arbitrary. After all, that whole concept is embedded in various mythologies which is a sign it holds some deep intuitive appeal.

And regarding 'throwing in the towel': well, science is never going to answer the question 'why does anything whatsoever exist at all' (i.e. why do 'laws of physics' exist, why is there a 'reality' at all). So we are always going to be stuck with an element of 'arbitrariness' in the ultimate axioms of our TOE, "things just happen to be this way".
 
  • #15
Well, weak measurement plays an important conceptual role in Aharonov's time-symmetric interpretation of quantum mechanics, an interpretation that does away with the probabilistic nature of quantum mechanics.
 
  • #16
I love how people say "determinism doesn't fit our everyday life". YES IT DOES.
More than anything.
EVERYTHING in the classical realm has been shown to be 100% deterministic; in other words, ANYTHING you ever experience was 100% determined.

Now DrChinese I know you got some weird views on QM so I won't even bother.

I never said I could prove it, because it can't be proven.
Neither can randomness.

However, everything we ever assumed was random, turned out to be deterministic, most likely this is so at the quantum scale too.
Just because we, humans, can't access it doesn't mean anything.
 
  • #17
phyzguy said:
I often wonder why people believe the universe is deterministic. Not only, as DrChinese says, is there no evidence for this, but our everyday experience doesn't support this view. It certainly feels like I have free will - I can choose whether to type an A or a B as the next character - here goes - B. Do you really believe that my decision to type a B was pre-ordained at the moment of the Big Bang? If there were objective evidence that the universe were deterministic, then I could understand believing this in spite of the evidence of my senses, but with the evidence of my senses telling me that the universe is not pre-determined, why would I choose to believe that it is?

Of course you don't have free will; what the hell, do science-minded people still believe in this illusion in 2011?!
That's beyond belief...

It also feels like colors are objective.
If you cut someone's arm off, they can often experience "phantom sensations", experiencing that they have a hand that isn't there.
Just because you FEEL that you make the choice of your own free will doesn't prove anything.
Actually, I'm pretty sure they have already proven that you have no choice through measuring brain activity.
 
  • #18
Fyzix said:
Now DrChinese I know you got some weird views on QM so I won't even bother.

Weird views, sure: I believe it.

(By the way, that would be like the pot calling the kettle...)

:smile:
 
  • #19
Fyzix said:
Just because you FEEL that you make the choice free willingly doens't prove anything.
Actually I'm pretty sure they have already proven that you have no choice through measuring brain activity.
No, they haven't proven any such thing. What has been shown and seems pretty reliable is that part of your brain is modeling itself, so when you make a mental decision, part of your brain has the job of "telling the story" of how you made that choice. This storytelling comes after the decision, which can easily be mistaken for evidence there wasn't a decision, but the truth is we still have no real idea of how to test for the difference between a "true free choice" and a "predetermined choice", and many people think those two things are apples and oranges that can easily coexist. But that gets into neuroscience and philosophy, not really relevant to "weak measurement" so we should probably just nip off the whole "free will" detour.
 
  • #20
Fyzix said:
EVERYTHING in the classical realm has been shown to be 100% deterministic, in other words, ANYTHING you ever experience was 100% detemined.

Really?

If I watch single photons hitting a detector, or listen to a Geiger counter clicking, has it been shown that I'm experiencing something 100% determined?

What about random mutations in DNA, or even weather patterns due to chaos principles?

I would have thought those examples come back to the unresolved debate between different QM interpretations.
 
  • #21
rgmcc said:
Really?

If I watch single photons hitting a detector
Classical?

Geiger counter clicking, has it been shown that I'm experiencing something 100% determined?
Radioactive decay is classical?

What about random mutations in DNA, or even weather patterns do to chaos principles?

Mutations in DNA: I'm pretty sure this is outside of the quantum realm and thus classical, so yes.
I would have thought those examples come back to the unresolved debate between different QM interpretations.

Yupp, just like my argument says...
At the end of the day, the quantum realm is at the bottom.

However throughout history we always thought things were random, then we investigate, find it's not random, then there is a new level we think is random, then it isn't etc etc etc etc etc.
Then we hit quantum mechanics and people somehow believe this is different...
 
  • #22
Actually, here is a different way to present your exact same argument. Throughout history, whenever we thought something was deterministic, we found a deeper level that was random. Did you really just argue that this means the people who think there will always be inherent randomness are ignoring the lessons of history?
 
  • #23
Fyzix said:
EVERYTHING in the classical realm has been shown to be 100% deterministic, in other words, ANYTHING you ever experience was 100% detemined.

My point was:

Clicks of Geiger counter are considered 'quantum random'. Clicks of a Geiger counter have not been _shown_ to be 100% deterministic (it is a matter of unresolved competing QM interpretations). Also, clicks of a Geiger counter are something I can experience.

Therefore it is not true that 'anything I can experience was 100% determined'.

Is there a flaw in what I'm saying?
 
  • #24
Do you consider thermal statistics deterministic? In Brownian motion, all particles take deterministic paths, yet they are seen to be random. Was this why Einstein proposed his statistical interpretation, and was it about determinism? About clicks in a Geiger counter being random: if one plots the distribution in time, it is no longer random. So maybe, by linking it with spaceTIME, determinism is the ultimate result, as the probability distribution is only arranged in time from past to present to future. Is this possible?

Bottom line: by taking time as an illusion, determinism is retained?
 
  • #25
Varon said:
Do you consider thermal statistics determinate? in brownian motion, all particles take deterministic path but they are seen to be random.
Have you heard about chaos theory? Determinism is a mathematical concept that applies to mathematical models. Real systems show similar behavior to chaotic deterministic mathematical systems, but that does not mean the real systems are deterministic, again because the way they are deterministic in the mathematical models simply does not apply to real world systems. In experimental physics, unlike mathematics, determinism can only be equated with "predictable", because experimental physics has no capability to define any other concept of determinism. And what we know is, in experimental physics, Brownian motion is not predictable, so cannot be called deterministic. Mathematical models of Brownian motion can be called deterministic, but their deterministic character, because of chaos, is part of what we can never match to observations. Hence in regard to determinism, we can never know if the mathematical models correctly reflect the true nature of the reality, or not. Indeed, if we hold to the deterministic belief, we end up with the rather absurd conclusion that butterflies can cause tornadoes, even though a flapped wing never changes the statistical tendencies of any weather patterns.

About clicks in geiger counter being random. If one plots the distribution in time. It is no longer random.
I'm not sure where you got that idea, but it's going to come as a pretty big surprise to a lot of experimental physicists who routinely model their data using Poisson (random) statistics.
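For context on the Poisson modeling mentioned above, here is a minimal sketch of why detector clicks with memoryless (exponential) waiting times give Poisson counting statistics, whose signature is variance ≈ mean. The rate and window are made-up illustrative values, not real detector data:

```python
import random

random.seed(0)

RATE = 5.0       # mean clicks per second (illustrative)
T_WINDOW = 1.0   # counting window in seconds
N_WINDOWS = 20000

def clicks_in_window(rate, t):
    """Count clicks in [0, t) when waiting times between clicks are
    exponential -- the memoryless model used for radioactive decay."""
    n, elapsed = 0, 0.0
    while True:
        elapsed += random.expovariate(rate)
        if elapsed >= t:
            return n
        n += 1

counts = [clicks_in_window(RATE, T_WINDOW) for _ in range(N_WINDOWS)]
mean = sum(counts) / N_WINDOWS
var = sum((c - mean) ** 2 for c in counts) / N_WINDOWS
print(f"mean={mean:.2f}  variance={var:.2f}")  # Poisson signature: variance ~ mean
```

Experimenters routinely check count data against exactly this variance-equals-mean property.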
 
  • #26
Ken G said:
Have you heard about chaos theory? Determinism is a mathematical concept that applies to mathematical models. Real systems show similar behavior to chaotic deterministic mathematical systems, but that does not mean the real systems are deterministic, again because the way they are deterministic in the mathematical models simply does not apply to real world systems. In experimental physics, unlike mathematics, determinism can only be equated with "predictable", because experimental physics has no capability to define any other concept of determinism. And what we know is, in experimental physics, Brownian motion is not predictable, so cannot be called deterministic. Mathematical models of Brownian motion can be called deterministic, but their deterministic character, because of chaos, is part of what we can never match to observations. Hence in regard to determinism, we can never know if the mathematical models correctly reflect the true nature of the reality, or not. Indeed, if we hold to the deterministic belief, we end up with the rather absurd conclusion that butterflies can cause tornadoes, even though a flapped wing never changes the statistical tendencies of any weather patterns.

I'm not sure where you got that idea, but it's going to come as a pretty big surprise to a lot of experimental physicists who routinely model their data using Poisson (random) statistics.

Suppose there were no true quantum randomness. Could it have been predicted 13.75 billion years ago, at the time of the Big Bang, that a certain hurricane (Katrina) would occur 13.75 billion years later in a certain part of a globe called Earth, in a certain part of a galaxy, in a certain part of the universe? Is this determinism possible at all if there is no true quantum randomness, or can the classical world produce true randomness such that the future is not written in stone? If so, what is this theorem called?
 
  • #27
Classical chaos would have made that prediction impossible with any reasonable uncertainty in a measured initial condition. But here we come to the crux of the problem-- is physics inherently a prescription for our intelligence to link initial measurements to final ones, which means it is a theory with uncertainties built into it, and "deterministic" means "predictable", or is it some deeper truth that the universe itself follows, such that we can imagine "perfect accuracy" in the initial conditions, a kind of "Platonic ideal" that measurements are trying to access to closer and closer precision? This is the key question, what we think physics actually is. This is also where the quantum interpretations differ, not just on what the wave function is, but what physics is.
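The sensitivity to initial conditions invoked above is easy to demonstrate with the standard toy model of classical chaos, the logistic map at r = 4. This is a generic illustration, not a model of weather; the starting point 0.2 and the thresholds are arbitrary choices:

```python
def logistic(x, r=4.0):
    """Fully chaotic logistic map: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

def steps_to_diverge(x0, delta, threshold=0.5, max_steps=1000):
    """Iterate two nearby initial conditions until their separation
    exceeds `threshold`; return the step count (or None if it never does)."""
    x, y = x0, x0 + delta
    for step in range(1, max_steps + 1):
        x, y = logistic(x), logistic(y)
        if abs(x - y) > threshold:
            return step
    return None

print(steps_to_diverge(0.2, 1e-6))   # a 10^-6 uncertainty ruins the forecast in tens of steps
print(steps_to_diverge(0.2, 1e-12))  # a million-fold better precision buys only a few dozen more
```

Because the separation grows roughly exponentially, each extra digit of initial precision buys only a fixed number of additional predictable steps, which is why "perfect accuracy" in the initial conditions is the crux.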
 
  • #28
Weak measurements don't necessarily prove or disprove that randomness is inherent, because:

Weak measurements are not affecting randomness.

Weak measurements change the coherence "partially" (or increase decoherence partially) and hence result in a "partial" interference pattern.

Even when we have "complete" which-way information (or a complete eraser, for that matter), the randomness persists and does not increase or decrease. However, the state changes (i.e. the degree of coherence increases/decreases).

Does the above sound correct?
 
  • #29
Ken G said:
Actually, here is a different way to present your exact same argument. Throughout history, whenever we thought something was deterministic, we found a deeper level that was random. Did you really just argue that this means the people who think there will always be inherent randomness are ignoring the lessons of history?

That argument works both ways. You can only find a new descriptive level to reintroduce randomness when a deterministic model is found for what came before. In fact classical thermodynamics was thought to be fundamental and random till the equivalence of statistical mechanics was demonstrated, and even demonstrated superior with Brownian motion.

@Fyzix
I have debated DrChinese and his views are NOT weird. What is weird is to proclaim that a view which the preponderance of evidence is against is somehow intellectually superior, and this is coming from a person who disagrees with DrChinese. Having an opinion and proclaiming that opinion superior in the absence of evidence are two different things, and DrChinese is ahead on the evidence count. Not by enough to fully convince me, but the truth is the truth. Learn to live with it or live without science.
 
  • #30
my_wan said:
That argument works both ways. You can only find a new descriptive level to reintroduce randomness when a deterministic model is found for what came before. In fact classical thermodynamics was thought to be fundamental and random till the equivalence of statistical mechanics was demonstrated, and even demonstrated superior with Brownian motion.
I'm not entirely clear what you are saying here, because I would have said that classical thermodynamics is the deterministic theory, and statistical mechanics is the random one. For example, thermodynamics uses variables like temperature that are supposed to mean something specific, whereas statistical mechanics uses ensemble averages that are really just mean values. So I would interpret the discovery that statistical mechanics can derive the theorems of thermodynamics to be a classic example of how randomness is continually found to underpin theories that we initially thought were deterministic. Quantum mechanical trajectories would be another prime example, as would chaos theory in weather.
 
  • #31
I want to put forth a challenge to the crazy people who actually manage to believe in randomness:

Define randomness?

Causation is easily defined.

So what does "random" REALLY mean?
When we use "random" in everyday speech, we are only talking about things we personally couldn't predict.

Like throwing a die: if we knew all the variables, we would know exactly how it would turn out...

So please WHAT is randomness? how could it even exist?
I don't think that it's even possible to define randomness; things would just "randomly" happen without ANY cause whatsoever -- never been observed, never will be observed, because it doesn't and couldn't exist.
 
  • #32
Fyzix said:
Define randomness?

“If the state y of a system at time t is uniquely defined by its state x at an arbitrary moment t0 through a unique function f such that y=f(x, t0, t), situations of this general type is called schemes of a well-determined process. On the contrary, if the state x at time t0 only determines a probability distribution for the possible future state y, these are called schemes of a stochastically definite process”, taken from von Plato, J. (1994). Creating Modern Probability: Its Mathematics, Physics, and Philosophy in Historical Perspective, Cambridge University Press.

These stochastically definite processes are basically Kolmogorov's definition of randomness.


Fyzix said:
So what does "random" REALLY mean?
When we use "random" in everyday speech, we are only talking about things we personally couldn't predict.

Like throwing a dice, if we knew all the variables we would know exactly what it would turn out as...

This is not randomness. This is unpredictability.

Fyzix said:
So please WHAT is randomness? how could it even exist?
I don't think that its' even possible to define randomness, things would just "randomly" happen without ANY cause what so ever, never been observed, never will be observed, because it doesn't and couldn't exist.

Things happening without cause are arbitrary, not random. In randomness the link between cause and effect(s) is not unambiguous.
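Von Plato's distinction quoted above can be illustrated with a toy sketch: a "well-determined" scheme maps a state to a unique successor, while a "stochastically definite" scheme maps a state only to a probability distribution over successors. The specific update rules here are invented purely for illustration:

```python
import random

def deterministic_step(x):
    """'Well-determined' scheme: the current state fixes the next state uniquely."""
    return 2 * x + 1

def stochastic_step(x, rng):
    """'Stochastically definite' scheme: the current state fixes only a
    distribution over next states (here, +1 or -1 with equal probability)."""
    return x + rng.choice([-1, 1])

rng = random.Random(42)
assert all(deterministic_step(3) == 7 for _ in range(5))  # same input, same output
outcomes = {stochastic_step(3, rng) for _ in range(100)}  # same input, a spread of outputs
print(sorted(outcomes))
```

Note that the stochastic scheme is still lawful: the distribution over outcomes is fixed exactly, even though no individual outcome is.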
 
  • #33
Fyzix said:
I want to put forth a challenge to the crazy people who actually manage to believe in randomness:

Define randomness?

Causation is easily defined.

So what does "random" REALLY mean?
When we use "random" in everyday speech, we are only talking about things we personally couldn't predict.

Like throwing a dice, if we knew all the variables we would know exactly what it would turn out as...

So please WHAT is randomness? how could it even exist?
I don't think that its' even possible to define randomness, things would just "randomly" happen without ANY cause what so ever, never been observed, never will be observed, because it doesn't and couldn't exist.

I would define true randomness as something that could not be predicted by anyone or anything.

A simple example would be pair production from vacuum. We know that it can happen, but no one can predict when it will happen. It can happen at one particular time, and this exact time is completely without cause. There is no reason why it would happen at one particular time compared to some time later. Similar things, like spontaneous decay, have been observed many times in the lab...

Just because you don't like the notion of randomness doesn't mean it isn't there. Of course, there may still be a deterministic explanation for all this, but so far no such explanation has been found/proven, and you simply cannot claim one without proof. So far, experiments are entirely consistent with the possibility of a random world.
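The "no reason for one particular time" point is the memorylessness of exponential decay: a nucleus that has already survived for billions of years is no more likely to decay in the next interval than a freshly made one, i.e. P(T > s + t | T > s) = P(T > t). A quick Monte Carlo sketch (the lifetime value is illustrative, not a measured constant):

```python
import math
import random

random.seed(1)
TAU = 6.0e9   # illustrative mean lifetime in years (a made-up value)
N = 200_000

# Draw exponential lifetimes, the standard model of spontaneous decay.
lifetimes = [random.expovariate(1.0 / TAU) for _ in range(N)]

def survival(times, t):
    """Empirical P(T > t)."""
    return sum(1 for x in times if x > t) / len(times)

s, t = 4.0e9, 1.0e9
survivors = [x - s for x in lifetimes if x > s]  # nuclei that already waited s years
p_fresh = survival(lifetimes, t)   # P(T > t) for a fresh nucleus
p_aged = survival(survivors, t)    # P(T > s + t | T > s) for an old one
print(f"fresh: {p_fresh:.3f}   after surviving 4 Gyr: {p_aged:.3f}")  # nearly equal
```

The near-equality of the two survival probabilities is what makes it so hard to point to any internal "clock" counting down to the decay.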
 
  • #34
To the nice points already made about randomness, I would add one more. I believe Fyzix's main issue with the concept of randomness is not the mathematical entity, but rather its application to the real world. He/she does not see how the real world can "really be random". There is good company for that unease; Einstein famously shared it.

But to help relieve that unease, I would point out that "randomness" is never an attribute of the real world; it is always an attribute of a mathematical model of the real world. This fact points the philosopher in two very different possible directions -- one can believe that underneath all randomness is some deterministic process that is unknown to us (obviously Fyzix's favored course, as with deBroglie-Bohm, for example), or alternatively one can embrace the randomness as being as fundamental in reality as it is in the mathematical model of reality.

But I would suggest a third course. Simply take no stance at all on the issue-- we need no such stance, because all we ever get is our models, and physics is demonstrably not a study of the real world by using models, it is a study of models of the real world by testing to what extent they serve our purposes. In that light, neither the concept "random", nor the concept "deterministic", can ever be applied unambiguously to the actual workings of the real world.
 
  • #35
Fyzix said:
I don't think that its' even possible to define randomness, things would just "randomly" happen without ANY cause what so ever, never been observed, never will be observed, because it doesn't and couldn't exist.

Fyzix - How can you make this statement?? When a uranium nucleus, which has been happily sitting there for 4 billion years, suddenly decays, why couldn't this be something that 'just happened'? It MAY be that there is some underlying mechanism that caused it to decay, but there is no evidence for this, and no successful model has been proposed to explain this as the result of some underlying deterministic model. So how can you say that something that 'just happens' has never been observed? How can you be so certain that there is no randomness inherent in the operation of the universe, when the evidence suggests very strongly that there is?
 
  • #36
Actually, I would point out that the decay of the uranium is not actually evidence of randomness "in the operation of the universe." I agree with your main point, that it requires considerable suspension of disbelief to say that the decay is deterministic in the absence of any evidence that it is, but we don't have an either/or situation. We often see the fallacy that "if it isn't random, it must be deterministic, and if I see no evidence that it is deterministic, it must be random." Randomness and determinism are both elements of models we use to describe the operation of the universe, but they are never elements of the operation of the universe. Scientists can only test the success of our models by comparing to the outcomes of experiment. The tests of the operation of the universe are the experiments themselves, not the success of the models-- that's something different.
 
  • #37
Ken G said:
Actually, here is a different way to present your exact same argument. Throughout history, whenever we thought something was deterministic, we found a deeper level that was random. Did you really just argue that this means the people who think there will always be inherent randomness are ignoring the lessons of history?

Great point. I believe this is accurate in physical science. As we factor in new variables (to get more accurate results), it becomes harder and harder to cite any one thing as the "cause" of the result. And somehow, a new level of indeterminacy firmly creeps in. We used to think of that as relating to "initial conditions", but it doesn't appear that way any longer (at least to me). I don't think the human brain is a deterministic machine either.
 
  • #38
my_wan said:
@Fyzix
I have debated DrChinese and his views are NOT weird. ...

Although there are a few questions about my taste in clothes. :smile:
 
  • #39
My problem with randomness is with TRUE randomness, yes, not "mathematical randomness" / lack of knowledge on the human part.
That's exactly what I am arguing.

Someone mentioned the example of a uranium atom decaying after 4 billion years; SOMETHING must cause it.
I can't see any way around it.
The decay itself is a mechanism! It's just ignorant of humans to think we already understand enough to say "hey, randomness exists, because we don't know everything yet".

People would say the same about EVERYTHING 300 years ago.
 
  • #40
So we have encountered two logical fallacies:
1) saying that because we don't know what causes something means it is uncaused
(that's called "argument from ignorance")
2) saying that because we cannot imagine something isn't caused means it must be caused
(that's called "argument from incredulity")
Scientific thinking should always avoid logical fallacies, and that's exactly why we must be clear on the difference between the features of our models, and how successful they are when compared with experiment, versus the features of whatever is making the experiments come out the way they do. The only way to avoid fallacies is to be very clear with ourselves what we are really doing when we enter into scientific thought.
 
  • #41
Randomness exists by Occam's razor.

In Fyzix's world, any event must have a preceding event that caused it, how does the causal mechanism work between two events?

In a random world, any event doesn't need a preceding event to cause it, so we don't need to explain anything further.

However, we know there is some order in the world, so we ought to impose some constraints on the randomness (to explain the world), e.g. we could insist that the randomness is guided by an evolution equation, such as Schrödinger's equation.

So we have deterministic evolution of probabilistic states.

There, that's the world.
 
  • #42
Fyzix said:
My problem is with TRUE randomness, yes, not the "mathematical randomness" / lack of knowledge on the human part.
That's exactly what I am arguing.

Someone mentioned an example of a uranium atom decaying after 4 billion years; SOMETHING must cause it.
I can't see any other way around it.
Its decaying is itself a mechanism! It's just ignorant of humans to think we already understand enough to say "hey, randomness exists, because we don't know everything yet".

People would say the same about EVERYTHING 300 years ago.

This is a PHYSICS forum, of course we're concerned with mathematical randomness. I'm getting the strong sense that you don't really have any physics background. Is this the case? The fact is, like it or not, there is an enormous amount of evidence against a deterministic universe and assuming a probabilistic universe has given us the most accurate (in predicting reality) mathematical model ever created. It is from this understanding that we invented the transistor (i.e. the microchip), the laser, modern chemistry, etc. Furthermore, if quantum mechanics were wrong (or just an effective theory of a more general higher order one) and there were a deeper deterministic theory we still have some very strict mathematical limitations on what that deterministic theory must look like and it would have to break a whole lot of rules that every experiment tells us are correct (for example, a deterministic theory CANNOT be local, but locality seems very much to be an inextricable part of reality).

You must realize that physics and quantum mechanics are a SCIENCE; baseless philosophical pondering devoid of actual knowledge of physics is worthless. Physics is applied math, and if you don't understand that math then you can't possibly understand the issues. Not liking an extraordinarily accurate theory doesn't mean a thing unless you've got a more accurate theory to supersede it.

Also, one could of course easily make quantum randomness an aspect of the macroscopic world. Take a cathode ray tube (which we'll say sends out only 1 electron at a time), pass the electron through an Sz Stern-Gerlach machine, take the output and put it through an Sx one, take the output and pass it through an Sz again. If it comes out spin up, cleave a random person's head off with an ax, if it comes out spin down, don't. Wham! Real world consequences of quantum randomness ;)
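For anyone who wants to see the Born-rule bookkeeping behind that Sz → Sx → Sz sequence, here is a minimal simulation sketch (my own addition, not from the post; real amplitudes only, and the ideal-measurement assumption is mine):

```python
# Hypothetical sketch: simulating sequential ideal Stern-Gerlach measurements
# with the Born rule, using real 2-component spinors in the Sz basis.
import random
import math

UP = (1.0, 0.0)                 # Sz eigenstates
DOWN = (0.0, 1.0)
S = 1 / math.sqrt(2)
X_PLUS = (S, S)                 # Sx eigenstates written in the Sz basis
X_MINUS = (S, -S)

def measure(state, basis):
    """Collapse `state` onto one of the two `basis` vectors with
    Born-rule probability |<basis_i|state>|^2."""
    plus, minus = basis
    p_plus = (plus[0] * state[0] + plus[1] * state[1]) ** 2
    return plus if random.random() < p_plus else minus

random.seed(0)
trials = 100_000
up_count = 0
for _ in range(trials):
    psi = UP                                # selected spin-up by the first Sz machine
    psi = measure(psi, (X_PLUS, X_MINUS))   # the Sx measurement erases the Sz record
    psi = measure(psi, (UP, DOWN))          # second Sz measurement
    if psi == UP:
        up_count += 1

print(up_count / trials)  # ~0.5, even though every electron started Sz-up
```

The point of the sketch: each electron enters the final Sz machine in a definite Sx eigenstate, yet which Sz port it exits is, per the formalism, irreducibly probabilistic.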
 
  • #43
Ken G said:
I'm not entirely clear what you are saying here, because I would have said that classical thermodynamics is the deterministic theory, and statistical mechanics is the random one. For example, thermodynamics uses variables like temperature that are supposed to mean something specific, whereas statistical mechanics uses ensemble averages that are really just mean values. So I would interpret the discovery that statistical mechanics can derive the theorems of thermodynamics to be a classic example of how randomness is continually found to underpin theories that we initially thought were deterministic. Quantum mechanical trajectories would be another prime example, as would chaos theory in weather.

First, classical thermodynamics was formulated as a set of what were then considered fundamental laws. Statistical mechanics (read 'the statistics of mechanics') was developed later, and the laws of thermodynamics were found to be derivable from it. Statistical mechanics is essentially the kinetic theory of gases.
[url]http://www.wolframscience.com/reference/notes/1019b[/url] said:
The idea that gases consist of molecules in motion had been discussed in some detail by Daniel Bernoulli in 1738, but had fallen out of favor, and was revived by Clausius in 1857. Following this, James Clerk Maxwell in 1860 derived from the mechanics of individual molecular collisions the expected distribution of molecular speeds in a gas.

This kicked off a controversy because:
[url]http://www.wolframscience.com/reference/notes/1019b[/url] said:
At first, it seemed that Boltzmann had successfully proved the Second Law. But then it was noticed that since molecular collisions were assumed reversible, his derivation could be run in reverse, and would then imply the opposite of the Second Law.

To continue the above quote, does this look familiar in today's context?
[url]http://www.wolframscience.com/reference/notes/1019b[/url] said:
Much later it was realized that Boltzmann’s original equation implicitly assumed that molecules are uncorrelated before each collision, but not afterwards, thereby introducing a fundamental asymmetry in time. Early in the 1870s Maxwell and Kelvin appear to have already understood that the Second Law could not formally be derived from microscopic physics, but must somehow be a consequence of human inability to track large numbers of molecules. In responding to objections concerning reversibility Boltzmann realized around 1876 that in a gas there are many more states that seem random than seem orderly. This realization led him to argue that entropy must be proportional to the logarithm of the number of possible states of a system, and to formulate ideas about ergodicity.

Gibbs developed the Gibbs ensemble construction around 1900, providing a more general formal foundation for the whole thing. A few years later (1905) Brownian motion put the final seal on statistical mechanics, built on the papers of the previous 25 years.

Yet here is another interesting and funny bit. The formal definition of Gibbs ensembles defines the fundamental bits of the QM formalism on which the many-worlds hypothesis was constructed. The many-worlds hypothesis is basically the result of postulating that every copy in a Gibbs ensemble is existentially real. Hence the many worlds are the Gibbs ensembles.

The only place randomness survives in the theoretically 'pure' form is in subatomic physics.
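Boltzmann's counting argument quoted above is easy to make concrete. A rough illustration (my own addition, not from the quoted source): for N two-state "molecules", the disordered macrostate corresponds to vastly more microstates than the ordered one, and S = k log W then ranks them by entropy.

```python
# Counting microstates for N two-state "molecules" (e.g. spins up/down).
import math

N = 50
W_ordered = 1                    # all N in the same state: a single microstate
W_mixed = math.comb(N, N // 2)   # half up, half down: C(N, N/2) microstates

print(W_mixed)                                   # 126410606437752
print(math.log(W_mixed) - math.log(W_ordered))   # entropy difference, in units of k
```

Even at N = 50 the mixed macrostate outnumbers the ordered one by fourteen orders of magnitude; at Avogadro-scale N the disparity is what makes the Second Law look deterministic.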
 
Last edited:
  • #44
my_wan said:
The only place randomness survives in the theoretically 'pure' form is in subatomic physics.

If by SUBatomic you mean atomic then I suppose. Though I'd ultimately disagree. Statistical mechanics is simply IMPLICITLY "random", yet it is still random. For example, Fermi-Dirac statistics are founded on the Pauli Exclusion Principle. However, the exclusion principle is a direct result of the indistinguishability of particles and Born's rule. Both of these EXPLICITLY relate to the blurred out, probabilistic core of quantum mechanics and the Schrödinger equation. Thus, by taking Pauli Exclusion as axiom, statistical mechanics inherits the underlying assumption of "randomness" even if the behaviour of large ensembles ends up being deterministic. I'd imagine this is particularly obvious in the mesoscopic regime.

Also, FYI, I believe thermodynamics was always a phenomenological theory (as opposed to a fundamental one). It was developed around the same time as E&M, and I think the notion of an atom was gaining a little bit of traction. The notion that there was ultimately some "under the hood" electromagnetic interaction driving the whole thing was likely in the air. Tragically, Boltzmann committed suicide after his atomistic reduction of thermodynamics continually faced derision.
 
  • #45
unusualname said:
Randomness exists by Occam's razor.
Occam's razor is a technique for deciding on the most parsimonious way to think about reality. It is not a way to establish "what exists." A very wrong way that many people understand Occam's razor is "the simplest explanation is most likely the correct one."
That is wrong for at least two reasons:
1) Occam's razor is a way to choose between theories, not a way to dictate how reality works, and
2) the statement is patently false, contradicted over and over in a wide array of scientific examples.
So the correct way to state Occam's razor is: "since our goal is to understand, and since understanding involves simplification, the simplest theory that meets our needs is the best."
So if we take that correct statement of the razor, and parse your claim, it comes out "randomness exists because it is easier for us to understand randomness." That should expose the problem.

As for your argument that randomness is in fact a simpler description of many of the phenomena we see, including the decay of uranium, I agree.

So we have deterministic evolution of probabilistic states.

There, that's the world.
Correction, that's our simplest description of the world. Big difference. For one thing, you left out the most puzzling part of all-- how a deterministic evolution of probabilistic states gives way to particular outcomes.
 
  • #46
Ken G said:
Occam's razor is a technique for deciding on the most parsimonious way to think about reality. It is not a way to establish "what exists." A very wrong way that many people understand Occam's razor is "the simplest explanation is most likely the correct one."
That is wrong for at least two reasons:
1) Occam's razor is a way to choose between theories, not a way to dictate how reality works, and
2) the statement is patently false, contradicted over and over in a wide array of scientific examples.
So the correct way to state Occam's razor is: "since our goal is to understand, and since understanding involves simplification, the simplest theory that meets our needs is the best."
So if we take that correct statement of the razor, and parse your claim, it comes out "randomness exists because it is easier for us to understand randomness." That should expose the problem.

As for your argument that randomness is in fact a simpler description of many of the phenomena we see, including the decay of uranium, I agree.

Correction, that's our simplest description of the world. Big difference. For one thing, you left out the most puzzling part of all-- how a deterministic evolution of probabilistic states gives way to particular outcomes.

In the Consistent Histories interpretation this is not a problem, once we have a measurement we can know how the probabilities evolved. There is no way to know this without making a measurement of course.

Also, constructing the Schrödinger evolution at the microscopic level is of course a huge problem: why all the linear group structures in the Standard Model? How does gravity emerge from such an evolution? And the big one - how does human free-will seem to enable us to further guide this evolution beyond (as far as we know) what exists anywhere else in the universe?
 
Last edited:
  • #47
my_wan said:
First, classical thermodynamics was formulated as a set of what were then considered fundamental laws. Statistical mechanics (read 'the statistics of mechanics') was developed later, and the laws of thermodynamics were found to be derivable from it. Statistical mechanics is essentially the kinetic theory of gases.
All true, but that's why thermodynamics is the deterministic theory (heat flows from hot to cold, etc.) and statistical mechanics is the random (statistical) theory (heat is more likely to flow from hot to cold, etc.). So I would say this is an example of the natural tendency for seemingly deterministic laws to later be reinterpreted as emergent from more fundamentally stochastic laws.
To continue the above quote does this look familiar in todays context?
Yes, I too have noticed the appearance of physicist-as-participant-in-physics effects even in classical thermodynamics. It's there in relativity too. The idea that "observer effects" are purely quantum in nature is narrow-minded.
Yet here is another interesting and funny bit. The formal definition of Gibbs ensembles define the fundamental bits of the QM formalism on which the many worlds hypothesis was constructed. The many worlds hypothesis is basically the result of postulating every copy of a Gibbs ensembles is existentially real. Hence the many worlds are the Gibbs ensembles.
On another thread, I am making the point (to little favor, I might add) that many-worlds is a completely classical concept that picks up nothing particularly special in the quantum context. In both cases, it is only the fact that science has to address the sticky problem that a given observer gets a given observed outcome, that is the actual nature of the problem, not quantum vs. classical. I think you would be sympathetic to that view.
The only place randomness survives in the theoretically 'pure' form is in subatomic physics.
This is where we diverge. I don't think the problem is with the impurity of randomness, because I view all mental constructs (like randomness and determinism alike) as "impure." They are all effective theories, all models, and randomness is the model used in statistical processes like statistical mechanics. Including all the Gibbs ensembles really doesn't remove the need for randomness, because we don't get an ensemble when we do the experiment, we get an outcome. That's where the randomness concept connects most closely to reality, but it is still impure and incomplete, because we still have no idea why we get a particular outcome, when all our theories can only give us statistical distributions. This is a fundamental disconnect between physics and reality that cannot be resolved by imagining the universe is fundamentally random or fundamentally deterministic, because either idea can be made to work with sufficient suspension of disbelief, and anyway there's no reason to imagine the universe is "fundamentally" any of those things.
 
  • #48
Ken G said:
All true, but that's why thermodynamics is the deterministic theory (heat flows from hot to cold, etc.) and statistical mechanics is the random (statistical) theory (heat is more likely to flow from hot to cold, etc.). So I would say this is an example of the natural tendency for seemingly deterministic laws to later be reinterpreted as emergent from more fundamentally stochastic laws.
The manner in which you have defined "intrinsic" determinism in the context of thermodynamics is also shared by QM in the underlying wave equations. When thermodynamics was developed, the laws were defined in irreversible form. Only when statistical mechanics was further developed did it create problems for this assumption, written as law, that such processes were irreversible. That is exactly because the real state of the system is defined not by ensembles but by mechanistic certainties, which would be fixed if the particular state each member of the ensemble was actually in were known.

In this context stochastic laws are not fundamental to the system, they are only fundamental to our level of knowledge about the system. Thus saying "fundamentally stochastic laws" is a misnomer of what the physics actually entail, at least in this context.

Now obviously, it is quite trivial to decompose Gibbs ensembles of a classical medium into distinct physical units. Yet QM is fundamentally quite different in that respect. Even quantization involves properties rather than parts, and it does not stay put in any part-like picture ever conceived. Perhaps in the quantum regime "fundamentally" really does belong in front of "stochastic laws", but in thermodynamics it most certainly does not, as illustrated by statistical mechanics. In a classical regime, stochastic is merely a consistently 'apparent' property resulting from a limitation in the completeness of our knowledge.

Now the big question. If we as observers have fundamental limits on our knowledge that physical law dictates we cannot 'empirically' get around by any means, would that constitute "fundamental" stochastic laws even if the theory entailed a complete lack of stochastic behavior at the foundational level? That is what we have in classical stochastic behavior, but QM lacks a similar underlying mechanism that defines stochastic behavior as purely a product of limited knowledge. That is THE key difference between classical and quantum mechanics. Saying "fundamentally stochastic laws" requires the presumption that an ignorance of our ignorance is evidence of a lack of ignorance, i.e., "fundamental". Whereas classically we are aware of our ignorance, such that in that context it is not fundamental to the system itself.

Ken G said:
Yes, I too have noticed the appearance of physicist-as-participant-in-physics effects even in classical thermodynamics. It's there in relativity too. The idea that "observer effects" are purely quantum in nature is narrow-minded.
Agreed. It is a whole range of these observations that leads me to assume it quite likely that the conceptual problems in QM are not just ignorance, but an ignorance of our ignorance.

Ken G said:
On another thread, I am making the point (to little favor, I might add) that many-worlds is a completely classical concept that picks up nothing particularly special in the quantum context. In both cases, it is only the fact that science has to address the sticky problem that a given observer gets a given observed outcome, that is the actual nature of the problem, not quantum vs. classical. I think you would be sympathetic to that view.
It is quite likely that I would. Maybe I will check it out shortly.

Ken G said:
This is where we diverge. I don't think the problem is with the impurity of randomness, because I view all mental constructs (like randomness and determinism alike) as "impure." They are all effective theories, all models, and randomness is the model used in statistical processes like statistical mechanics. Including all the Gibbs ensembles really doesn't remove the need for randomness, because we don't get an ensemble when we do the experiment, we get an outcome. That's where the randomness concept connects most closely to reality, but it is still impure and incomplete, because we still have no idea why we get a particular outcome, when all our theories can only give us statistical distributions. This is a fundamental disconnect between physics and reality that cannot be resolved by imagining the universe is fundamentally random or fundamentally deterministic, because either idea can be made to work with sufficient suspension of disbelief, and anyway there's no reason to imagine the universe is "fundamentally" any of those things.
The concept of randomness will in fact ALWAYS be needed in science. We can never have perfect knowledge about any system, period. We could not even write down enough decimal places to acquire such knowledge if it were possible. The key difference, as statistical mechanics illustrates, is that classically a perfect Maxwellian Demon could, in principle at least, do away with stochastic behavior altogether, but in QM we have no clue how to construct any model that would allow this Maxwellian Demon to do the same in that regime, even in principle.
 
  • #49
I believe that according to QFT, nuclear decay events are attributed to the same thing that "causes" spontaneous emission of radiation from excited quantum states, namely, interaction of the metastable quantum system with a vacuum fluctuation (or virtual photon, or spaghetti monster tears, or whatever name you want to give to the hypothetical phenomenon). Some sort of interaction is required within the framework of quantum theory for excited molecular or atomic eigenstates to decay, because they are *eigenstates*, and thus their probability density is conserved.

So, the question now is, are vacuum fluctuations (or whatever) truly random? I don't know enough about QFT or quantum cosmology to even approach answering that question. Personally, I have a strong predilection to believe that they are in fact random, but it's just a gut feeling at this point.
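One operational meaning of "truly random" here is memorylessness: quantum theory predicts exponential decay, so a nucleus's survival odds do not depend on its age, leaving no room for a hidden internal clock that "ripens" toward decay. A hedged sketch (my own addition, with purely illustrative numbers) checking that property on sampled decay times:

```python
# Memorylessness check: for exponential decay, P(T > s + t | T > s) == P(T > t).
import random
import math

random.seed(1)
half_life = 4.0                          # arbitrary units, illustrative only
lam = math.log(2) / half_life            # decay constant
samples = [random.expovariate(lam) for _ in range(200_000)]

def survival(t, given=0.0):
    """Fraction of nuclei surviving a further time t, given they reached `given`."""
    alive = [x for x in samples if x > given]
    return sum(1 for x in alive if x > given + t) / len(alive)

print(survival(4.0))             # fresh nucleus surviving one half-life: ~0.5
print(survival(4.0, given=8.0))  # nucleus already two half-lives old: still ~0.5
```

Whether the vacuum fluctuations driving this statistic are themselves random is, as noted above, a separate and open question; the sketch only shows what the exponential law rules out.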
 
  • #50
The determinism of thermodynamics is in the structure of the theory itself. We can predict a deterministic evolution of temperature, for example, in thermodynamics, and first students of thermodynamics are generally not taught that this is just a statistical average they are solving. But quantum predictions are not framed deterministically, instead we speak of testing probability distributions explicitly in QM, via repetition of the same experiment-- a device never used in thermodynamics. In QM, we don't generally test expectation values, whereas in thermo, we are not even taught that the observables are expectation values (even though they are). So thermodynamics is a deterministic theory, and quantum mechanics isn't.
In this context stochastic laws are not fundamental to the system, they are only fundamental to our level of knowledge about the system. Thus saying "fundamentally stochastic laws" is a misnomer of what the physics actually entail, at least in this context.
I'm not sure what context you mean. I would place the "fundamental" aspects of a law in the nature of the derivations used for that law, not in the nature of the systems the law is used to predict. That's mixing two different things.
In a classical regime stochastic is merely a consistently 'apparent' property resulting from a limitation in the completeness of our knowledge.
We don't actually know that, because our knowledge is always limited. We have no way to test your assertion. Indeed, in classical chaos, we generally find the stochasticity penetrates to all levels-- no matter what the ignorance is initially, it rapidly expands toward ergodicity. This has a flavor of being more than an apparent aspect of the behavior; instead the behavior is a kind of ode to ignorance. The idea that we could ever complete our information about a classical system is untenable-- ironically, classical systems are far more unknowable than quantum systems, because classical systems have vastly many degrees of freedom. It is that vastness that allows us to mistake expectation values for deterministic behavior; we see determinism in the context where the behavior is least knowable. Determinism is thus a kind of "mental defense mechanism," I would say.
Now the big question. If we as observers have fundamental limits on our knowledge that physical law dictates we cannot 'empirically' get around by any means, would that constitute "fundamental" stochastic laws even if the theory entailed a complete lack of stochastic behavior at the foundational level?
The laws are the theory, so the foundation of the laws is only the structure of the theory, regardless of how successfully they test out. I think you take the perspective that there really are "laws", and our theories are kinds of provisional versions of those laws. My view is that the existence of actual laws is a category error-- the purpose of a law is not to be what nature is actually doing, it is to be a replacement for what nature is actually doing, a replacement that can fit in our heads and meet some limited experimental goals. I ask: what difference does the "foundational" structure of our laws make? We never test their foundational structure, we only test how well they work on the limited empirical data we have at our disposal. The connection at the foundational level will always be a complete mystery, or a subject of personal philosophy, but what we know from the history of science is that the foundational level of any law is highly suspect.

That is what we have in classical stochastic behavior, but QM lacks a similar underlying mechanism that defines stochastic behavior as purely a product of limited knowledge. That is THE key difference between classical and quantum mechanics.
Yes, that is an important difference.
Saying "fundamentally stochastic laws" requires the presumption that an ignorance of our ignorance is evidence of a lack of ignorance, i.e., "fundamental".
It is not the laws that are fundamental, because that makes a claim about their relationship to reality. It is only the foundations of the law that we can talk about-- there's a big difference.
It is a whole range of these observations that leads me to assume it quite likely that the conceptual problems in QM are not just ignorance, but an ignorance of our ignorance.
I think this is your key point here, the degree of ignorance is worse in QM applications. I concur, but then we are both Copenhagen sympathizers!
The concept of randomness will in fact ALWAYS be needed in science. We can never have perfect knowledge about any system period.
Yes, I agree that randomness in our models is inevitable-- chaos theory is another reason.


The key difference, as statistical mechanics illustrates, is that classically a perfect Maxwellian Demon could, in principle at least, do away with stochastic behavior altogether, but in QM we have no clue how to construct any model that would allow this Maxwellian Demon to do the same in that regime, even in principle.
Yes, I see what you mean, the absence of any concept of a quantum demon is very much a special attribute of quantum theory, although Bohmians might be able to embrace the concept.
 