# What is random?

1. Apr 8, 2014

### sina89

Maybe this is too basic a question, but it is not so clear to me: when we refer to a random experiment, can a phenomenon be absolutely random in itself, or is it all about our uncertainty about the outcome that makes us call it random?

2. Apr 8, 2014

### chogg

That's actually a fairly deep question in philosophy of science. Educated opinions differ on the matter.

Fortunately, the math of probability theory works out the same way in either case, which allows scientists to make progress without necessarily agreeing about what goes on under the hood. :-)

3. Apr 8, 2014

### mpresic

This is not an easy question at all. I read a story somewhere (I do not have the reference) where astronomers needed to select a random star. One astronomer proposed using a computer to generate a random point (right ascension and declination), then looking at a sky map for the nearest star to that point. One would think that would produce a random star. Another astronomer proposed labeling all the stars from 1 to 1 trillion (or so), having the computer generate a random number, and taking the corresponding star. A third astronomer chose some other procedure (I do not remember which).

The point is that these are three different algorithms which lead to quite different results. Another illustration from probability textbooks (e.g. Papoulis, Probability, Random Variables, and Stochastic Processes) is labeled the Buffon's needle problem. Three different ways to implement "random" lead to three different probabilities (1/4, 1/3, or 1/2) that the "random" chord generated in a circle is longer than the side of an equilateral triangle inscribed in that circle. All three answers can be interpreted as correct.
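A quick Monte Carlo sketch makes the ambiguity concrete. The three chord-selection methods below (the method names and unit-circle setup are my own choices for illustration) each answer "pick a random chord" differently, and each converges to a different probability:

```python
import math
import random

def chord_longer_than_side(method, trials=200_000, seed=0):
    """Estimate P(random chord of the unit circle is longer than sqrt(3),
    the side of the inscribed equilateral triangle)."""
    rng = random.Random(seed)
    side = math.sqrt(3)
    hits = 0
    for _ in range(trials):
        if method == "endpoints":
            # Two uniform points on the circle; chord length is 2*sin(d/2).
            d = abs(rng.uniform(0, 2 * math.pi) - rng.uniform(0, 2 * math.pi))
            length = 2 * abs(math.sin(d / 2))
        elif method == "radial":
            # Chord midpoint at a uniform distance along a random radius.
            r = rng.uniform(0, 1)
            length = 2 * math.sqrt(1 - r * r)
        elif method == "midpoint":
            # Chord midpoint uniform in the disk (r = sqrt(U) gives that).
            r = math.sqrt(rng.uniform(0, 1))
            length = 2 * math.sqrt(1 - r * r)
        else:
            raise ValueError(method)
        hits += length > side
    return hits / trials

for m in ("endpoints", "radial", "midpoint"):
    print(m, round(chord_longer_than_side(m), 3))
# The three estimates hover near 1/3, 1/2, and 1/4 respectively.
```

All three procedures are perfectly "random"; they simply correspond to different probability measures on the set of chords, which is exactly the ambiguity in the word.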

4. Apr 8, 2014

### Stephen Tashi

In mathematics, a random event is defined in the context of a population of events and an assignment of probabilities to events in this population. When people toss off phrases like "Is the weather random?" or "What is the probability that I go to the kitchen", they aren't asking a specific mathematical question because they haven't defined the population of events.

If you want to discuss randomness in a vague, general sense perhaps you should post in the section of the forum called "General Discussions".

5. Apr 8, 2014

### chogg

OP: are you asking whether the universe is "ultimately" deterministic vs. stochastic?

If so, there is some interesting discussion on a (closed) thread here:
The parts I find most interesting are on Page 2.

I still hold the basic position I explained in that thread; we simply do not know whether the universe is ultimately stochastic or deterministic.

6. Apr 8, 2014

### sina89

Thanks to all of you for replying. I'm new here and didn't know where to post my question. I wanted to know what people mean, exactly, when they use the word RANDOM in physics.

7. Apr 8, 2014

### chogg

The moral of this story: different people mean different things, and if it makes a difference, you should ask them to clarify. :)

8. Apr 12, 2014

### mpresic

I probably should correct my earlier post. The problem giving ambiguous answers for the probability that a random chord is longer than a corresponding side of the inscribed equilateral triangle is known as Bertrand's paradox, not Buffon's needle problem.

9. Apr 12, 2014

### FactChecker

In most applications of "random" probability it is theoretically possible, but impractical, to know enough to predict outcomes. Einstein refused to accept that anything was intrinsically random; he said "God doesn't play dice." But he was proven wrong at the quantum level. Information theory tries to reconcile these facts.

Here are three distinct examples.
1) An experiment that has not been done, and whose outcome is unpredictable with current information is "random".
2) An experiment that has already been done but whose result we do not know (a coin toss whose result is hidden) is clearly already determined. There is nothing "random" about the result, since it is already determined, but there is still uncertainty about what we should guess. This is clearly an information problem: new information can change what we should guess about the result (Bayesian statistics).
3) In quantum theory there are experiments that can quantitatively distinguish between something that is already determined, but we do not know the result, versus something that is really undetermined (still in several states simultaneously). Those experiments have shown that there really are examples that are intrinsically random. It is called a "superposition of states" when something exists but its state is not fixed. (I hope I am not butchering this)
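Case 2 is worth making concrete. Below is a minimal Bayes-rule sketch for a hidden, already-determined coin toss; the "noisy peek" and its 90% accuracy are invented purely for illustration:

```python
def posterior_heads(prior_heads, peek_says_heads, accuracy=0.9):
    """Bayes update for a hidden coin toss: a peek at the coin reports
    the true face with probability `accuracy`. The toss is already
    determined; only our best guess about it changes."""
    # Likelihood of the peek's report under each hypothesis.
    p_report_if_heads = accuracy if peek_says_heads else 1 - accuracy
    p_report_if_tails = (1 - accuracy) if peek_says_heads else accuracy
    numerator = p_report_if_heads * prior_heads
    return numerator / (numerator + p_report_if_tails * (1 - prior_heads))

print(posterior_heads(0.5, peek_says_heads=True))  # 0.9
```

One unreliable peek shifts our guess from 50/50 to 90/10, even though nothing about the coin itself changed: the "randomness" here lives entirely in our information.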

Last edited: Apr 12, 2014
10. Apr 13, 2014

### chogg

I guess it depends what you mean by "fixed". If you mean it's in a well-defined state, I'd say a superposition of states is a fixed state. It's just a state where your experiment's result can't be predicted ahead of time.

For example: say you're doing an experiment to measure $S_z$, the $z$-component of spin, for a spin-$\frac{1}{2}$ particle. If your system is in a state $\frac{1}{\sqrt{2}}\left(|z_+\rangle + |z_-\rangle\right)$, you'll get either outcome with 50% probability. But that doesn't mean it doesn't have a state; in fact, its state is $|x_+\rangle$ (for suitable choice of $x$-axis ;-).

Contrast this with the case of quantum mechanical density matrices, where the formalism itself does not assign a single definite state. Even in that case, I would say that the actual, physical system does have a definite state; we simply use the density matrix to represent our ignorance of that state.

11. Apr 14, 2014

### FactChecker

I believe that this is what I was trying to remember -- an entangled pair, where there isn't a definite state. Isn't this an example of something that is intrinsically random?

12. Apr 14, 2014

### chogg

An entangled pair is neither more nor less random than a single particle. Entanglement doesn't add "intrinsic randomness". In fact, what makes it so interesting is the very non-random correlations between measurements on the entangled particles!

A density matrix is something else entirely. It has to do with uncertainty about what state a quantum system is in. (By contrast, an entangled state is a definite state of the system as a whole.)

Here's an instructive comparison.

In the first case, imagine a particle in the state $\frac{1}{\sqrt{2}}\left( |z_+\rangle + |z_-\rangle \right)$. Suppose this corresponds to $|x_+\rangle$ for the coordinates you've chosen. If you take a measurement of $S_x$, you'll get $+\frac{1}{2}$ with probability 1. This corresponds to a superposition state.

In the second case, imagine a particle in state $|z_+\rangle$ with probability 0.5, and in state $|z_-\rangle$ with probability 0.5. If you measure $S_x$, you'll get $-\frac{1}{2}$ with probability 0.5, and $+\frac{1}{2}$ with probability 0.5. This corresponds to the density matrix case. We're not just uncertain about some measurement outcomes; we're uncertain about the state of the particle itself.
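This contrast can be checked numerically. Here is a small NumPy sketch of the Born rule for both cases (my own construction, with $\hbar$ set to 1 and the measurement taken along $x$):

```python
import numpy as np

z_plus  = np.array([1, 0], dtype=complex)
z_minus = np.array([0, 1], dtype=complex)
x_plus  = (z_plus + z_minus) / np.sqrt(2)  # the superposition from case 1

def projector(ket):
    """Projection operator |ket><ket|."""
    return np.outer(ket, ket.conj())

# Case 1: a definite state |x+> (a superposition in the z-basis).
rho_pure = projector(x_plus)
# Case 2: a 50/50 classical mixture of |z+> and |z-> (a density matrix).
rho_mixed = 0.5 * projector(z_plus) + 0.5 * projector(z_minus)

def prob(rho, outcome_ket):
    """Born rule: P(outcome) = Tr(rho |outcome><outcome|)."""
    return float(np.real(np.trace(rho @ projector(outcome_ket))))

print(round(prob(rho_pure,  x_plus), 6))   # 1.0 -- S_x = +1/2 with certainty
print(round(prob(rho_mixed, x_plus), 6))   # 0.5 -- S_x = +1/2 half the time
```

Both states give 50/50 outcomes for $S_z$, but only the $S_x$ measurement distinguishes the definite superposition from the ignorance-style mixture.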

13. Apr 15, 2014

### FactChecker

@sina89, I hope this is not hijacking your OP, but I think it goes to the heart of your question. There were situations in quantum theory where there was a debate as to whether a system was in an internal "hidden state" that we just could not determine (Einstein believed this), or whether its state was really still undetermined. I have been calling the latter case "intrinsically random", but @chogg has explained that this is a system with a density matrix. The thread he points to is very interesting. But I thought the issue was settled: experiments have been done in which the probabilities of outcomes differ between the two cases (a hidden state versus a still-undetermined state with a density matrix). The results of those experiments showed that the system was not in a "hidden state". It was in no fixed simple state, but rather had a density matrix. I think this indicates that there are truly random processes.

14. Apr 15, 2014

### sina89

I'm happy that, despite my seemingly general question, the conversation went the way I expected. This was actually a point of confusion for me: one of my friends was trying to tell me about quantum physics (my major is computer science, and I have no education in quantum physics beyond some hearsay), and when he talked about these intrinsically random states it confused me, so I felt the need to ask people who specialize in the subject.
So can I summarize your statements as "there is consensus today among physicists about intrinsically random states, and they all (or mostly) believe that Einstein's point of view is PROVEN wrong"? If quantum physics actually did prove this, I won't allow myself to question you about the details of how it was proved.

15. Apr 15, 2014

### sina89

This confusion made me think that maybe I have some misconception in my view of probability, so I posted my question here.

16. Apr 15, 2014

### MathematicalPhysicist

Well, I believe you cannot actually generate a truly random sequence of events, because "random" isn't really definable: the problem of infinite regress pops up, and it is not solvable.

17. Apr 15, 2014

### FactChecker

There is something going on now that may interest you. It seems that some extremely complicated particle-physics calculations (Feynman diagrams) can be done using relatively simple multi-dimensional geometry, via an object called the "amplituhedron". Amplituhedrons are so much simpler that they might represent a profound breakthrough in our understanding. In the amplituhedron picture, space, time, and probability are not fundamental, and "probabilities" do not always add up to 1. So it might change what probability and "random" mean. Maybe nothing is random, as we think of it now.

18. Apr 15, 2014

### chogg

Quite so: local hidden variable theories have been ruled out.

The (main) reason I say randomness is an open question is this: the time-dependent Schrodinger equation is deterministic. The quantum state of your system evolves smoothly: if you know it at one time, you know it at all times.

Traditionally, the "randomness" is thought to come in during a measurement, where the smooth Schrodinger time evolution is said not to apply. If the system is not in an eigenstate, the measurement leaves it in one, but we can't say for sure which one; all we can do is give probabilities. The problem with this view is that the measurement device is itself a quantum mechanical system (and if it isn't, I would love to know where you obtained it!). If we consider the Schrodinger equation for the composite system -- measuring device plus original system -- the time evolution should once again be smooth.

This seems to contradict our experience of taking measurements. Everett showed, however, that the resulting total quantum state is a superposition of states in a basis we would find useful: something like $\frac{1}{\sqrt{2}} \left( |M_+\rangle |z_+\rangle + |M_-\rangle |z_-\rangle \right)$, where $|M_+\rangle$ means the measuring device registered a '$+$'. If you observe spin-up, there is another you who observed a spin-down in the same experiment.

This is commonly known as the (ineptly named) "Many Worlds" interpretation of quantum mechanics. It's not universally accepted, but it explains one way we could have an ultimately deterministic universe with the appearance of randomness.

19. Apr 15, 2014

### Chronos

Statistically, it is impossible to prove the result of any process is truly random. Even a coin that comes up heads 1000 consecutive times still has a much higher statistical possibility of being a 'fair' coin than you might suspect. To achieve a 3 sigma confidence interval on a pass - fail basis, you need nearly 1100 consecutive 'pass' outcomes, and there is still about a 1 in 370 chance it was just blind luck.

20. Apr 15, 2014

### chogg

Could you please clarify what you mean? In particular, I don't understand what you mean by "a 3 sigma confidence interval on a pass - fail basis". I'm also not sure which possibilities you're comparing -- is it only a perfectly fair coin and a perfectly biased coin, or are there other intermediate possibilities?

21. Apr 15, 2014

### FactChecker

Thanks, @chogg. This is fascinating stuff. I wish I understood more about it.

22. Apr 15, 2014

### chogg

You and me both! :-)

23. Apr 15, 2014

### Chronos

3 sigma corresponds to a 99.73% probability that a process output will fall within a certain range of values. For an analog output [measured numerically], it follows a Gaussian distribution; for a discrete output [pass/fail], a binomial distribution. Sigma is the term used to express the probability that an output will fall within certain limiting values. 3 sigma is generally considered a reasonable standard for deciding whether a process output is, or is not, random. In some applications 3 sigma is considered pretty sloppy, and higher confidence levels are demanded [e.g. particle physics]. You can characterize the probability that any particular outcome, or series of outcomes, is or is not random, but never with certainty.

24. Apr 15, 2014

### chogg

I'm sorry, but I still don't think you've defined your terms very well, and I still have a hard time understanding what you're trying to say. It sounds like you're saying a coin showing heads 1100 times in a row only has a 99.73% chance to be non-random. If so, you're right that it's surprising and counter-intuitive, but I think intuition is actually correct here!

Let me explain myself more precisely. (After another couple iterations, I'll probably come to understand what you're saying.)

In my view, we've got two models.
• $\text{fair}$ means that the coin is fair: when flipped, it gives heads (H) with probability 0.5 and tails (T) with probability 0.5.
• $\text{biased}$ means that the coin has some other probability, $p$, to yield H. Since we don't know what that probability is a priori, we'll assign $p$ a uniform distribution from 0 to 1.

We go and observe some data, and then we compute the probability for each model. That's going to depend on our prior probabilities for each model, $P(\text{fair})$ and $P(\text{biased})$. Different people can have different priors, so we'll factor them out for now and focus on the likelihood -- i.e., the probability each model assigns to the data you actually saw. In our case, this data was simply $\text{H}^N$: we saw heads (H) $N$ times in a row.

For the $\text{fair}$ model, the likelihood is simple:
$$P(\text{H}^N | \text{fair}) = \frac{1}{2^N}.$$

For the $\text{biased}$ model, we consider the likelihood for a given parameter value $p$; then we integrate over all possible values of $p$:
\begin{align} P(\text{H}^N | \text{biased}) &= \int\limits_0^1 p^N \,\,dp \\ &= \frac{1}{N + 1} \end{align}

Note that $P(\text{H}^N | \text{biased})$ drops very slowly compared to $P(\text{H}^N|\text{fair})$, which sinks like a stone. This means that our belief shifts towards the $\text{biased}$ model very rapidly, as we keep observing heads-and-only-heads.

Now let's set $N=1100$. The odds that the coin is fair are given by
\begin{align} \text{Odds}(\text{fair} | \text{H}^N) &= \frac{P(\text{fair})}{P(\text{biased})} \frac{P(\text{H}^N | \text{fair})}{P(\text{H}^N | \text{biased})} \\ &= \frac{1 - P(\text{biased})}{P(\text{biased})}\frac{N + 1}{2^N} \end{align}
When the odds are tiny compared to 1, the probability basically equals the odds. I'm going to go out on a limb and assume that $\text{Odds}(\text{fair} | \text{H}^N)$ is indeed tiny.

To be concrete, let's take equal priors, so that $P(\text{biased}) = 0.5$. Then we have $P(\text{fair} | \text{H}^{1100}) \approx \frac{1101}{2^{1100}} \approx 8 \times 10^{-329}$: this is basically zero. Even if you think the coin has only $1:10^{100}$ prior odds of being biased -- a prior which strains credibility beyond the breaking point -- you will still have only $P(\text{fair} | \text{H}^{1100}) \approx 8 \times 10^{-229}$ -- again, basically zero.
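Since $2^{1100}$ overflows any floating-point type, it's easiest to check this in log space. A quick sketch of the odds formula above (the function name and default prior are mine):

```python
from math import log10

def log10_odds_fair(n_heads, prior_odds_fair=1.0):
    """log10 of Odds(fair | H^N) = prior_odds * (N + 1) / 2^N.
    2^N would overflow a float for N = 1100, but its log10
    is just N * log10(2)."""
    return log10(prior_odds_fair) + log10(n_heads + 1) - n_heads * log10(2)

print(log10_odds_fair(1100))         # about -328: odds ~ 10^-328
print(log10_odds_fair(1100, 1e100))  # about -228 even with absurd prior odds
```

Working with log-odds also shows at a glance how each additional head subtracts another $\log_{10} 2 \approx 0.3$ from the (log) plausibility of the fair-coin model.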

This is way beyond 3 sigma.

Have I missed something?

25. Nov 18, 2015

### sina89

"Thus quantum mechanics is a statistical theory. It can make definite predictions about ensembles of identical systems, but it can generally tell us nothing definite about an individual system. Where it differs from other statistical theories, such as statistical mechanics, weather forecasting or economics, is that the chance element is inherent in the nature of the quantum system and not merely imposed by our limited grasp of all the variables that affect the system."
From Paul Davies's introduction to "Physics and Philosophy".

I take chogg's statement that "randomness is an open question" as my answer. I'm quite happy with this, because it's very difficult for me to believe that anyone can assert something definite about the intrinsic randomness of events.