vtmemo
Okay, here's the deal.
We all know the rule of independent probability. For example, say you flip a coin 100 times, with the first 90 coming up heads and the last one tails. Each flip is 'independent' of the previous flips, so long as each is taken as an individual occurrence.
However, the odds of flipping a coin heads 90 times in a row are relatively slim, which makes the odds of a tails on the next flip seem bigger. It's like saying, "the odds against this streak occurring are high, so a flip that breaks the run seems more probable than one that continues it."
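For what it's worth, this intuition is the classic gambler's fallacy, and it's easy to check numerically. Here's a minimal Python sketch (the streak length of 5 and the trial count are arbitrary choices I picked so the simulation finishes quickly; 90-head streaks are far too rare to simulate directly):

```python
import random

random.seed(0)

# The probability of 90 heads in a row really is astronomically small.
p_streak = 0.5 ** 90  # roughly 8e-28

# But independence says the NEXT flip after a streak is still 50/50.
# Simulate: among flips that immediately follow a run of 5 heads,
# how often is the next flip tails?
STREAK = 5
TRIALS = 1_000_000
flips = [random.random() < 0.5 for _ in range(TRIALS)]  # True = heads

after_streak = 0  # how many flips followed a 5-head run
tails_after = 0   # how many of those were tails
for i in range(STREAK, TRIALS):
    if all(flips[i - STREAK:i]):  # previous 5 flips were all heads
        after_streak += 1
        if not flips[i]:
            tails_after += 1

print(f"P(90 heads in a row) = {p_streak:.3e}")
print(f"P(tails | 5 heads just occurred) ~ {tails_after / after_streak:.3f}")
```

The second number comes out near 0.5: the streak being improbable does nothing to the flip that follows it.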
Here comes the big question: is quantum statistics self-defeating? That is, does observation on a quantum level change that which we observe because of sequential laws governing statistics?
For example, let's say there's an almost-infinitely improbable event that we wish to observe. Does the fact that we are observing every passing second of that event "not" occurring increase the probability of it actually occurring? I guess another way to say the idea would be:
"As the probability of something occurring approaches zero on a quantum level, the probability of it *actually* occurring under observation increases, in the sense that it will occur under observation AT ALL"
I dunno. Feels kinda like a Murphy's Law of event-related statistics, or some sort of "you can't observe it without changing it" theory.
I got bored at work