In information theory, entropy is defined as the unpredictability of information content, and as such the entropy of the output of so-called pseudo-random number generators (PRNGs) is often measured as a test of their "randomness". An interesting paradox arises with this definition...

Start with a suitable seed, [itex]S_n[/itex], consisting of an array of [itex]n[/itex] previously generated pseudo-random numbers from the function [itex]PRNG(seed)[/itex]. To maximize the randomness of [itex]PRNG[/itex], we want [itex]PRNG(S_n)[/itex] to return a result such that the entropy of [itex]S_{n+1}[/itex] is also maximized. For a given seed, such a result can be uniquely determined. But in striving to maximize the randomness of [itex]PRNG[/itex], we have thereby created a process that is deterministic and completely predictable, which directly contradicts the proclaimed unpredictability of information content through high entropy.
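The tension can be illustrated with a short sketch. This uses Python's `random` module as a stand-in PRNG and a plain Shannon estimate over symbol frequencies (both are illustrative choices, not anything prescribed above): the output's measured entropy is near the maximum, yet the whole sequence is exactly reproducible from the seed.

```python
import math
import random
from collections import Counter

def shannon_entropy(data):
    """Estimate Shannon entropy in bits per symbol from observed frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A deterministic PRNG with a fixed seed (hypothetical choice of seed value).
rng = random.Random(12345)
sample = [rng.randrange(256) for _ in range(100_000)]

h = shannon_entropy(sample)
# For a good PRNG over 256 symbols, h approaches the 8-bit maximum,
# even though the sequence is completely predictable given the seed.
print(f"{h:.3f} bits/symbol")
```

Re-seeding with the same value reproduces the sample bit-for-bit, which is exactly the predictability the paradox turns on.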

**Physics Forums - The Fusion of Science and Community**


# "Randomness Through Entropy" Paradox
