What is the connection between Kolmogorov complexity and randomness?

AI Thread Summary
Kolmogorov complexity measures the complexity of data based on the length of the shortest program needed to reproduce it, linking randomness to maximum complexity since truly random sequences cannot be compressed. The discussion raises questions about the distinction between complexity and randomness, suggesting that randomness may not equate to meaningful complexity but rather to noise or gibberish. Shannon's information theory adds another layer, where surprising or unpredictable messages are deemed more informative, complicating the relationship between randomness and complexity. The conversation also touches on the nature of functions and patterns, emphasizing that complexity arises from the interplay of randomness and structure. Ultimately, the relationship between complexity, randomness, and information remains nuanced and subject to interpretation.
Duhoc
This is from an article on the "Quantum Frontiers" blog.

The fundamental concept here is Kolmogorov complexity and its connection to randomness/predictability. A sequence of data bits like:

10011010101101001110100001011010011101010111010100011010110111011110

has higher complexity (and hence looks more random/less predictable) than the sequence:

10101010101010101010101010101010101010101010101010101010101010101010

So is everyone agreed that this is the proper definition of complexity, i.e., that complexity is measured by how far raw data resists being condensed into a short code?
Why isn't the ability of a function to integrate other functions a measure of complexity? A car has many different parts and many different systems, as does a human being. Why aren't these systems regarded as complex? Is there a distinction to be drawn between randomness and complexity?
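
A quick way to make the comparison of the two sequences above concrete: an off-the-shelf compressor such as zlib gives a crude, computable stand-in for Kolmogorov complexity (which is itself uncomputable). A minimal Python sketch:

Code:
import zlib

# The two bit sequences quoted above, as ASCII strings of '0' and '1'.
irregular = "10011010101101001110100001011010011101010111010100011010110111011110"
periodic = "10101010101010101010101010101010101010101010101010101010101010101010"

for name, s in [("irregular", irregular), ("periodic", periodic)]:
    packed = zlib.compress(s.encode(), 9)
    print(f"{name}: {len(s)} chars -> {len(packed)} bytes compressed")

# Typically the periodic string compresses to noticeably fewer bytes:
# it has a short description ("repeat '10'"), while the irregular one
# does not.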
 
Who says these things aren't complex? Generally, randomness implies maximum complexity: a truly random sequence is, by definition, one that cannot be compressed by any algorithm, and so is maximally complex according to the Kolmogorov-Chaitin conception of complexity.
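
The same point can be checked empirically, at least in rough form. Practical compressors like zlib only approximate the idea (true incompressibility quantifies over all algorithms), but genuinely random bytes already defeat them:

Code:
import os
import zlib

# Two inputs of equal length: one structured, one from the OS's
# cryptographic random source.
patterned = bytes(range(256)) * 400        # obvious repeating structure
random_bytes = os.urandom(len(patterned))  # high-quality random data

print(len(patterned), "->", len(zlib.compress(patterned)))       # shrinks a lot
print(len(random_bytes), "->", len(zlib.compress(random_bytes))) # slightly grows

# Random data is essentially incompressible, so zlib's output is a bit
# larger than its input once format overhead is included.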
 
Thanks for answering. It seems there were many views and very few replies. But is randomness complexity? Just because you can't reduce data to a shorter computer program, why does that mean it is complex? It may just mean it is gibberish or noise, as opposed to something that incorporates many functions to achieve a larger function.
 
Which, when you think about it, is exactly what the computer program is trying to do.
 
Duhoc said:
Thanks for answering. It seems there were many views and very few replies. But is randomness complexity? Just because you can't reduce data to a shorter computer program, why does that mean it is complex? It may just mean it is gibberish or noise, as opposed to something that incorporates many functions to achieve a larger function.
You will find that the formal definitions of complexity and information content don't jibe well with our colloquial understanding of these terms. For example, the Shannon information content of a string of gibberish is greater than that of a well-formed sentence in English.

Shannon based his definition on the idea that an informative message should "surprise" us, in the sense that subsequent pieces of the message should be unpredictable (i.e. any given character of the message has a low prior probability). This does make some sense -- if you know what the message is going to say before you receive it, in what sense is it informative?
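
A small illustration of the Shannon side (a sketch; the two strings are made up, and per-character frequencies capture only part of English's redundancy, so even this crude estimate understates how predictable English really is):

Code:
import math
from collections import Counter

def empirical_entropy(text: str) -> float:
    """Per-character Shannon entropy (in bits) of the string's own
    character-frequency distribution."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

english = "it was the best of times it was the worst of times"
gibberish = "qj9#zx@w7k!mv$2p&rf^b4u(td)6hs%ca*y8_neo-5gi+l0"

print(f"English:   {empirical_entropy(english):.2f} bits/char")
print(f"Gibberish: {empirical_entropy(gibberish):.2f} bits/char")

# The gibberish scores higher: its characters are closer to uniformly
# distributed, hence each one is more "surprising".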
 
You can eliminate randomness through sequence or pattern constraints within a given system. It is safe to say that randomness is "all possible complexities" constrained within a unitary system: a set of non-repeating, infinite functions.

No. A computer program's output, no matter how random or non-repeating/noisy it may seem, will end up as a pattern. Pi is the only mental construction that involves a sequence that will run forever.
 
What about e, the base of the natural log? Or any irrational number, for that matter? What's so special about π?
 
Duhoc, you are making too much out of the word "complexity". Kolmogorov complexity is a measure of the length of the shortest program needed to replicate some string. There's a lot in common between Kolmogorov complexity and Shannon entropy, and there's a lot in common between Shannon entropy and the concept of entropy in physics and chemistry.

BTW, entropy is another one of those words that garners a lot of confusion.
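
For reference, the standard textbook definitions being compared (nothing specific to this thread): the Kolmogorov complexity of a string x relative to a universal machine U is

$$K_U(x) = \min\{\, |p| : U(p) = x \,\}$$

the length of the shortest program p that makes U print x, while the Shannon entropy of a random variable X with distribution p is

$$H(X) = -\sum_{x} p(x) \log_2 p(x)$$

The standard bridge between the two: for strings drawn from a computable distribution, the expected Kolmogorov complexity equals the entropy up to an additive constant depending on the distribution, which is one precise sense in which the two notions "have a lot in common."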
 
bapowell said:
What about e, the base of the natural log? Or any irrational number, for that matter? What's so special about π?

OT: yes, it should be irrational numbers in general. If something is a "useful" and integral part of an approximation, then it is special to someone. Anyhow, there is a certain asymmetry between repeating and non-repeating functions. It tells you something about the nature of dynamics in terms of mathematical functions, and similarly in nature. There are more non-repeating, non-terminating decimals than there are terminating decimals and non-terminating repeating decimals combined.

Irrational numbers in a system are the only dynamic functions that can run infinitely and undirected, and that can contain all possible numerical backdrops for any constraint imaginable or unimaginable. That's how I view reality in terms of relational aspects in math. You can't have diversity of structures (complexity) without randomness. You can't have structures without pattern, and you can't have temporal constraint (change) without asymmetry. Compare repeating carbon bonds that form a temporally less complex structure, such as diamond, with a more complex system of bonds like us. (Sorry for being philosophical.)
 