# Does true randomness exist?

## Main Question or Discussion Point

I have seen some controversy around this question and would very much appreciate it if somebody could give a good, straight answer (if possible) with some proof to back it up.
Does true randomness exist in the universe at a subatomic, higher, or perhaps lower level?
This is from the perspective of the universe itself. I don't mean "random" in the sense that we just don't understand all the variables that determine an outcome, but a truly random outcome that, no matter how much knowledge and data are available, is impossible to predict.

EDIT: I think this is the right section for this question, but I'm sorry if it is not.

You can in fact model the behavior of random variables in a general sense, in terms of probability distributions, which assign a probability measure to each outcome. However, a finite sequence of digits that looks very well ordered, say 121212...12, could still be the result of a random process (with a very low probability).

On the other hand, a random looking sequence of digits could be generated by a deterministic function. The extended digit sequence of pi is such a sequence. Sections of its digit sequence have been sampled and used as sequences of "random numbers". This is true of other irrational numbers as well. The fact is, the answer to your question is unknown. Some believe that natural processes are the result of hidden deterministic processes. Others believe that quantum randomness is the true state of nature. This does not prevent such processes from being well modeled by statistical methods.
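The point that a deterministic rule can produce a random-looking digit stream is easy to demonstrate. Here is a minimal sketch using a linear congruential generator with classic textbook constants (chosen purely for illustration): the stream is perfectly reproducible from its seed, yet its digit frequencies come out roughly uniform.

```python
from collections import Counter

def lcg(seed, n):
    """Deterministic linear congruential generator: x -> (a*x + c) mod m.
    Every output is fixed by the seed, yet the stream looks random."""
    m, a, c = 2**32, 1664525, 1013904223  # illustrative textbook constants
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        yield x / m

# Same seed, same stream: the process is fully deterministic.
assert list(lcg(42, 5)) == list(lcg(42, 5))

# Yet the leading decimal digits are close to uniformly distributed.
digits = Counter(int(10 * v) for v in lcg(42, 100_000))
print(sorted(digits.items()))  # each digit appears roughly 10,000 times
```

Passing a frequency check like this plainly does not make the stream random; it only shows that the check cannot tell the difference.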

Last edited:
It's never possible to be sure that the underlying reality involves randomness, but what we know is that we can't see further than that. To say that the universe behaves randomly at a fundamental level is to apply Occam's Razor and to say that whatever we can never possibly perceive cannot be part of our universe or the laws of nature.

For a proof you'll need to look to local hidden variable interpretations and the Bell inequality, but even then you can't preclude a hidden variable interpretation that recovers determinism if it ignores the relativistic treatment of causality, space, and time.

If you want an entertaining history of this argument then search for 'god does not play dice'.

Last edited:
Khashishi
Quantum mechanics is usually regarded as truly random. The wavefunctions evolve deterministically when there is no measurement, but any time a measurement is taken, the wavefunction collapses into a new random state.

• DuckAmuck
bhobba
Mentor
Does true randomness exist in the universe at a subatomic, higher or perhaps lower level?
No one can say right now because the most powerful tests for randomness we currently have are passed by some sophisticated, but deterministic, pseudo random number generators.

It is thought, and I agree as well, that quantum randomness is truly random - but proving it is another matter.

Thanks
Bill
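To illustrate what such tests look like, here is a toy version of the "monobit" frequency test (a simplified sketch in the spirit of the NIST SP 800-22 suite, not the official procedure) applied to a fully deterministic generator:

```python
import math
import random

def monobit_pvalue(bits):
    """p-value for whether the +/-1 running sum looks like a random walk."""
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * len(bits)))

# A fully deterministic source: the Mersenne Twister with a fixed seed.
rng = random.Random(12345)
bits = [rng.getrandbits(1) for _ in range(100_000)]

p = monobit_pvalue(bits)
print(p)  # the deterministic stream typically sails through (p well above 0.01)
```

A generator that fails such a test is certainly not random, but passing it says nothing either way, which is exactly the point above.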

No one can say right now because the most powerful tests for randomness we currently have are passed by some sophisticated, but deterministic, pseudo random number generators.

It is thought, and I agree as well, that quantum randomness is truly random - but proving it is another matter.

Thanks
Bill
I think if we take this line then we can say with certainty that we'll never be able to prove anything is truly random. I don't have a mathematical proof for this, but I'm going to propose that it will always be much easier to improve pseudo-random number generation than to improve the detection method. I don't have much doubt that a proof of this is already known among cryptographers and information theorists.

I think with quantum physics, we'll always be left having to presume that there are random processes taking place until proven otherwise. For the proof to work the other way, we would require near-infinite information.

Last edited:
In quantum mechanics, quantum particles can only be predicted in terms of probabilities.
Maybe that's because they are different from the bigger things we see every day, or maybe because they really are governed by chance and randomness.
But what you need to know is that in quantum mechanics, knowing the position or momentum of a particle is a matter of probability: not because we haven't devised something to measure it precisely, but because this is as precise as we can get, as far as we now know. Quantum particles are fundamentally ruled by probability.
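The limit being described here is the Heisenberg uncertainty relation; for position and momentum it reads

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

and it is a property of the quantum state itself, not of the measuring instrument.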

Jano L.
Gold Member
I have seen some controversy around this question and would very much appreciate it if somebody could give a good, straight answer (if possible) with some proof to back it up.
Does true randomness exist in the universe at a subatomic, higher, or perhaps lower level?
I think the controversy is largely due to unclear meaning of "true randomness".

In my understanding, the most clear use of the word "random" is this:

the process of generation of a sequence of bits is random when the knowledge of previous bits does not help us at all to predict the next bit.

Whether the process is random or not thus depends not only on the process, but also on our knowledge of its inner working.

In this view, what is random is the process of generation, not the sequence of numbers it produces. For example, the random process above can produce a sequence that starts with 1010101010101010, and such a sequence is as likely as any other.

If the process is arithmetic (like a computer random number generator), it is always possible to learn its rule and implement it on a computer to predict the next bit. Before we are able to do this, the process is random; after we do it, the process loses its randomness and becomes predictable.
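As a concrete sketch of this, here an observer recovers the parameters of a small textbook linear congruential generator from three observed values and then predicts the next one exactly. The modulus is assumed known, a common simplifying assumption in such recoveries.

```python
M = 2**31 - 1  # a prime modulus, so differences are invertible mod M

def lcg_step(x, a=48271, c=11):
    """One step of the 'unknown' arithmetic process."""
    return (a * x + c) % M

# Three values observed by someone who does not yet know a and c.
x0 = lcg_step(123456789)
x1 = lcg_step(x0)
x2 = lcg_step(x1)

# Learning the inner working: solve x2 - x1 = a*(x1 - x0) (mod M) for a, then c.
a_hat = ((x2 - x1) * pow((x1 - x0) % M, -1, M)) % M
c_hat = (x1 - a_hat * x0) % M

# The process has now lost its randomness: the next value is predictable.
assert (a_hat * x2 + c_hat) % M == lcg_step(x2)
```

Once the rule is learned, every future "bit" of this process is determined, exactly as described above.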

If the process is physical, e.g. fluctuations in the electrical voltage across a resistor, or flipping a coin, it is also possible to learn more about how it works, and it may be possible to use this knowledge to predict the next values. For example, measuring the fluctuations of the height of the sea level and analyzing them into Fourier modes enables us to predict future values of the sea level for some time ahead. In principle, there is no reason to restrict the ability of scientists to understand and predict natural processes.

However, there are many physical processes for which this kind of forecast is currently not practical, for example when the bits are generated from fluctuations in electrical voltage, Geiger detector counts, or coin flipping.

These last processes of generation are random because we currently cannot use the past values of the physical quantities to predict the next value. As with the height of the sea level, this may change in the future, but it is very likely that there will always be a good many processes that remain random.

If QM isn't really random, wouldn't that require hidden variables (to keep track of the non-randomness), which have already been excluded by the Bell inequality experimental results? I think the idea that radioactive decay might not really be random is positively excluded.

If QM isn't really random, wouldn't that require hidden variables (to keep track of the non-randomness), which have already been excluded by the Bell inequality experimental results? I think the idea that radioactive decay might not really be random is positively excluded.
The Bell experiments only demonstrate that there is no local realism. A hidden variable formulation that doesn't require locality is still viable. I'm not sure why you'd want to, but you could; you'd be violating relativistic causality. Perhaps you could contain this to your hidden world. I don't know.

Last edited:
Jano L.
Gold Member
If QM isn't really random, wouldn't that require hidden variables (to keep track of the non-randomness) which have already been excluded by the Bell's inequality experimental results? I think the idea that radioactive decay might not really be random is completely positively excluded.
Quantum theory is a probabilistic theory, so in the sense I explained above the processes influencing the configuration of the system or other quantities are random. That does not preclude that there may be another theory of these processes in which they are not random. Compare classical mechanics in relation to statistical physics.

People like to stress that due to this or that no-go theorem all non-quantum physics effort is futile.
They often proceed roughly along this:

1) propose some unnaturally restrictive conditions on what the other theory should look like
2) show that it violates Bell type inequality or causality
3) show that nature obeys Bell type inequality and causality
hence
4) no other theory than quantum theory is worth studying.

The problem with this is that 1) is a very wrong thing to do if we are interested in the further development of theoretical physics, 3) is questionable since there are "loopholes" in the experiments, and 4) is a very bad motivation if we are interested in the further development of theoretical physics, and it does not follow from 1, 2, and 3 anyway.

EDIT: radioactive decay may be governed by unknown equations. We know little about what happens in the nucleus.

Last edited:
To ask what the ultimate "true state" of nature is seems to be a philosophical, not a scientific, question. We build models to explain and predict the results of observations and experiments. For a long time, these models were deterministic. Statistical models were introduced to deal with processes that were too subtle or complex to explain with deterministic models, but the thinking was that with enough knowledge, a deterministic model could exist. However, Bohr held that for quantum mechanics no such deterministic substratum exists, and this is probably the majority view among physicists today. Deterministic models are still used: chaos theory is deterministic, and the evolution of the wave function is described by deterministic equations. We use what works.

Last edited:
Consider this:
Atoms don't age. That is, older atoms are as likely to decay as newly created ones.

Consequently, the evolution of atoms cannot be governed by an internally evolving mechanism inside of the nucleus. If it were, older nuclei would be further along their internal evolution and would be more likely to decay. How can one possibly reconcile these facts with any non-random decay theory? I don't see how. Note that I discard out of hand any non-local explanation.
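The "atoms don't age" property of a constant decay rate can be checked numerically. A minimal sketch, with an arbitrary illustrative per-step decay probability:

```python
import random

random.seed(1)
p = 0.1          # per-step decay probability (illustrative value)
N = 200_000      # number of simulated atoms

# Simulate each atom: survive a step with probability 1 - p, else decay.
lifetimes = []
for _ in range(N):
    t = 0
    while random.random() > p:
        t += 1
    lifetimes.append(t)   # the atom decayed on step t

def hazard(k):
    """Fraction of atoms that decay at age k, among those that reached age k."""
    alive = [t for t in lifetimes if t >= k]
    return sum(1 for t in alive if t == k) / len(alive)

print(hazard(0), hazard(10))  # both close to p: old atoms don't decay faster
```

A constant hazard is exactly the memorylessness of exponential decay: the conditional decay probability does not depend on how long the atom has already survived.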

jbriggs444
Homework Helper
2019 Award
If one nails down what is meant by "random" well enough then a mathematical answer exists.

An infinite binary string S is "deterministic" if there is a Turing machine T that produces S as output. Otherwise it is "random".

An infinite binary string S is "n-deterministic" for an integer n if there is a Turing machine T of size less than or equal to n that produces S as output. Otherwise it is "n-random".

For any given string S, is there a Turing machine R that correctly reports whether S is random? Yes, of course. There are only two possible answers: there is a Turing machine that reports "yes" and there is a Turing machine that reports "no".

Is there a single fixed Turing machine R that will, for any string S correctly report in finite time whether S is "random"? No. No matter how many digits of S are sampled, there are Turing machines that hard-code the digits of S up to that point plus one. So there are continuations of S that end up "deterministic" and continuations of S that end up "random".

Is there a single fixed finite Turing machine R that takes as input integer n and infinite string S and, if S is "n-random", always correctly reports this in finite time, but if S is "n-deterministic" will not falsely report that S is "n-random"? No...

Diagonalization quickly yields a contradiction: if such a TM were assumed to exist, it could be modified to produce a fixed finite machine of size n whose output is n-random.
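The smallest-Turing-machine size above is uncomputable, but a general-purpose compressor gives a computable upper bound in the same spirit, and it works on the finite strings physics actually produces: a string looks "deterministic" when it compresses well.

```python
import random
import zlib

structured = ("12" * 5_000).encode()    # the well-ordered 1212...12 example
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10_000))

# A highly ordered string has a very short description...
print(len(zlib.compress(structured)))   # a few dozen bytes
# ...while a noisy string of the same length barely compresses at all.
print(len(zlib.compress(noisy)))        # about 10,000 bytes or more
```

Of course this only bounds description length from above: a pseudo-random stream such as the digits of pi would look incompressible to zlib even though a short program generates it.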

If one nails down what is meant by "random" well enough then a mathematical answer exists.
I don't think anyone is disputing the mathematical foundation for randomness. It's whether randomness is the "ultimate" description of nature at the smallest scales.

Last edited:
Jano L.
Gold Member
the evolution of atoms cannot be governed by an internally evolving mechanism inside of the nucleus. If it were, older nuclei would be further along their internal evolution and would be more likely to decay.
(italic mine)
The implication in your reasoning is not correct. A high age of the atom does not imply that it must have a different probability distribution (on the time axis) of decay than a newly created atom.

It is easy to think of a deterministic process that behaves in the same way as radioactive decay, for example effusion: the escape of helium atoms from a balloon (imagine the atoms as balls and a balloon with scarce holes). No matter how long the atom has spent inside the balloon, at the instant we know it is in the balloon it has exactly the same chance to leave. The scatter of the final leaving times is due to scatter in the initial conditions of the atoms, not to a "really random escape".
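A cartoon of this idea (not the hard-sphere model itself): let the "deterministic dynamics" be the chaotic doubling map on the unit interval, with a small hole through which a particle escapes. The only randomness is in the initial condition; the subsequent motion is fully deterministic, yet the escape times spread out like decay times.

```python
import random

random.seed(7)
h = 0.05                       # hole size: roughly the per-step escape chance
escape_times = []
for _ in range(50_000):
    x = random.random()        # random initial condition...
    t = 0
    while x >= h:              # ...then purely deterministic evolution
        x = (2 * x) % 1.0      # chaotic doubling map
        t += 1
    escape_times.append(t)

mean = sum(escape_times) / len(escape_times)

def survive(k):
    """Fraction of particles still inside after k steps."""
    return sum(1 for t in escape_times if t >= k) / len(escape_times)

# A roughly constant survival ratio per interval: exponential-looking decay
# from deterministic dynamics plus scattered initial conditions.
print(mean, survive(20) / survive(10), survive(30) / survive(20))
```

The two printed ratios come out nearly equal, which is the memoryless, decay-like signature, even though each individual trajectory is completely determined by its starting point.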

I think you are asking whether physics is deterministic: whether everything is determined, e.g., by the past situation, or whether physics indeed makes some really random decisions, as in quantum measurements.

All the fundamental physics we use, from QFT to GRT, is Lagrangian mechanics, which means that the evolution is fully determined by the Euler-Lagrange equations. So fundamentally there is no place for randomness. However, because of, e.g., the imperfections of our measurements, we cannot know the exact conditions required to simulate the evolution, so from our perspective there is always a lot of "true randomness" in microscopic physics.
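For reference, the Euler-Lagrange equations referred to here: for a Lagrangian L(q_i, \dot q_i, t) the evolution obeys

```latex
\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right)
  - \frac{\partial L}{\partial q_i} = 0
```

Given the initial q_i and \dot q_i, these second-order equations fix the subsequent evolution uniquely (under the usual regularity assumptions), which is the determinism being appealed to.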

If fundamental physics is deterministic, the natural question is: so what about wavefunction collapses, as in quantum measurements?
How does the "practical QM" we use look? We take, e.g., a single electron into consideration and approximate everything around it, and it is not surprising that in such an approximated model we are losing information in these "wavefunction collapses".
More formally, there is always some exterior: an environment we did not take into consideration in our "practical QM", or perhaps took into account only through thermodynamic averaging. Random interactions with this environment are seen as the cause of the wavefunction collapse.
So what if we extended our "practical QM" into an exact quantum description of larger and larger systems, finally getting to the "wavefunction of the Universe"? In this finally "fundamental QM" picture there is no longer an interacting external environment, so there are no longer wavefunction collapses: it would just evolve in a unitary, deterministic way.

(italic mine)
The implication in your reasoning is not correct. A high age of the atom does not imply that it must have a different probability distribution (on the time axis) of decay than a newly created atom.
That's my point, isn't it?
It is easy to think of a deterministic process that behaves in the same way as radioactive decay, for example effusion: the escape of helium atoms from a balloon (imagine the atoms as balls and a balloon with scarce holes). No matter how long the atom has spent inside the balloon, at the instant we know it is in the balloon it has exactly the same chance to leave. The scatter of the final leaving times is due to scatter in the initial conditions of the atoms, not to a "really random escape".
The analogy doesn't apply. If the system is really deterministic, then an old atom in a box will be more likely to exit through the small hole than a new one. Of course that doesn't apply to the real effusion process, because real effusion processes are not classical and deterministic; they are quantum and random. But if you actually solve the problem of a deterministic particle in a box (no QM), the probability of any individual particle finding the hole in the next second evolves over time.

Jano L.
Gold Member
An infinite binary string S is "deterministic" if there is a Turing machine T that produces S as output. Otherwise it is "random".

An infinite binary string S is "n-deterministic" for an integer n if there is a Turing machine T of size less than or equal to n that produces S as output. Otherwise it is "n-random".

For any given string S, is there a Turing machine R that correctly reports whether S is random? Yes, of course. There are only two possible answers: there is a Turing machine that reports "yes" and there is a Turing machine that reports "no".
Such a meaning of randomness works only with hypothetical infinite sequences that are not the result of any algorithm. It seems of little use for the original question: the sequence of measured values in physics is always finite.

Jano L.
Gold Member
If the system is really deterministic, then an old atom in a box will be more likely to exit through the small hole than a new one.
As far as I know, in a deterministic model of an atomic gas (for example, hard spheres) the atoms leave independently, and the statistics of the leaving times is that of a Poisson process, the same as for radioactive decay.

Could you please explain what your definition of "the system is really deterministic" is, and how you came to your conclusion?

kith
Quantum theory is a probabilistic theory, so in the sense I explained above the processes influencing the configuration of the system or other quantities are random. That does not preclude that there may be another theory of these processes in which they are not random.
Such theories exist already (de Broglie-Bohm), but as long as we cannot observe the "true", non-random elements of the theory in experiments, they are not of much use in explaining the randomness. I suspect that no future theory more fundamental than QM will explain the randomness by observable deterministic elements.

The randomness in QM seems to arise from the very fact that an experiment is performed which means that the observer and the observed get entangled if we take QM literally.

Quantum theory is a probabilistic theory (...)
"Practical QM" we use are indeed probabilistic theories - because there is always e.g. exterior - environment we ignore in our considerations - interactions with this environment disturb our simplified model in looking randomly way, leading to wavefunction collapse.
But if we would extend our quantum consideration to the whole Universe, in such finally "fundamental QM" there would be no longer exterior leading to wavefunction collapse - evolution of wavefunction of the Universe would be purely unitary: deterministic.

Last edited: