# The theory of everything help

## Main Question or Discussion Point

This is from an article on the "Quantum Frontiers" blog.

The fundamental concept here is Kolmogorov complexity and its connection to randomness/predictability. A sequence of data bits like:

10011010101101001110100001011010011101010111010100011010110111011110

has higher complexity (and hence looks more random/less predictable) than the sequence:

10101010101010101010101010101010101010101010101010101010101010101010

So is everyone agreed that this is the proper definition of complexity, i.e., the ability to condense raw data into a simple code?
Why isn't the ability of a function to integrate other functions a measure of complexity? A car has many different parts and many different systems, as does a human being. Why aren't these systems regarded as complex? Is there a distinction to be drawn between randomness and complexity?
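Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough, computable upper bound on it. Here is a minimal sketch (not from the thread) using Python's `zlib` to compare the two bit strings above; the exact compressed sizes depend on the zlib build, so only the relative ordering matters:

```python
import zlib

# The two 68-character bit strings quoted in the post above.
irregular = "10011010101101001110100001011010011101010111010100011010110111011110"
regular   = "10101010101010101010101010101010101010101010101010101010101010101010"

# Compressed length is a crude stand-in for Kolmogorov complexity:
# a string with a short description compresses well.
c_irregular = len(zlib.compress(irregular.encode()))
c_regular = len(zlib.compress(regular.encode()))

# The repeating "10" pattern compresses far better than the irregular string.
print(c_regular, c_irregular)
```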

bapowell
Who says these things aren't complex? Generally, randomness implies maximum complexity -- a truly random sequence by definition is one that cannot be compressed by any algorithm, and so is maximally complex according to the Kolmogorov-Chaitin conception of complexity.

Thanks for answering. It seems there were many views and very few replies. But is randomness complexity? Just because you can't reduce data to a shorter computer program, why does that mean it is complex? It may just mean it is gibberish or noise. This as opposed to incorporating many functions to achieve a larger function.

Which, when you think about it, is exactly what the computer program is trying to do.

bapowell
> Thanks for answering. It seems there were many views and very few replies. But is randomness complexity? Just because you can't reduce data to a shorter computer program, why does that mean it is complex? It may just mean it is gibberish or noise. This as opposed to incorporating many functions to achieve a larger function.
You will find that the formal definitions of complexity and information content don't jibe well with our colloquial understanding of these terms. For example, the Shannon information content of a string of gibberish is greater than that of a well-formed sentence in English.

Shannon based his definition on the idea that an informative message should "surprise" us, in the sense that subsequent pieces of the message should be unpredictable (i.e. any given character of the message has a low prior probability). This does make some sense -- if you know what the message is going to say before you receive it, in what sense is it informative?
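Shannon's idea can be made concrete with a small sketch (illustrative, not from the thread): the empirical per-character entropy $H = -\sum_c p(c)\log_2 p(c)$, estimated from character frequencies, is zero for a perfectly predictable text and maximal when every character is equally likely, which is why gibberish scores higher than structured English:

```python
import math
from collections import Counter

def entropy_per_char(text: str) -> float:
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(entropy_per_char("aaaaaaaa"))  # 0.0 -- one symbol, totally predictable
print(entropy_per_char("abcdefgh"))  # 3.0 -- eight equally likely symbols
print(entropy_per_char("the quick brown fox jumps over the lazy dog"))
```

Note this estimates only single-character statistics; Shannon's full measure also accounts for correlations between characters, which lowers the entropy of real English further.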

You can eliminate randomness through sequence or pattern constraint within a given system. It is safe to say that randomness is "all possible complexities" constrained within a unitary system. It is a set of non-repeating infinite functions.

No. A computer program's output, no matter how random or non-repeating/noisy it may seem, will end up as a pattern. Pi is the only mental construction that involves a sequence that will run forever.

bapowell
What about e, the base of the natural log? Or any irrational number for that matter? What's so special about $\pi$?
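This is the crux: a sequence can run forever without repeating and still have low Kolmogorov complexity, provided a short program generates it. As an illustration (not from the thread), Gibbons' unbounded spigot algorithm emits the decimal digits of $\pi$ from a few lines of integer arithmetic:

```python
def pi_digits(count: int) -> list[int]:
    """First `count` decimal digits of pi, via Gibbons' spigot algorithm."""
    digits = []
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while len(digits) < count:
        if 4 * q + r - t < n * t:
            digits.append(n)  # the next digit is settled; emit it
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)
    return digits

print(pi_digits(10))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

The digit string never repeats, yet its Kolmogorov complexity is tiny: this short program is a complete description of it. The same holds for $e$ or $\sqrt{2}$, which is exactly why "non-repeating" and "complex" come apart.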

D H
Staff Emeritus
> What about e, the base of the natural log? Or any irrational number for that matter? What's so special about $\pi$?