What is the connection between Kolmogorov complexity and randomness?


Discussion Overview

The discussion centers on the relationship between Kolmogorov complexity and randomness, exploring definitions of complexity, the nature of randomness, and how these concepts relate to information theory and mathematical functions. Participants examine whether randomness can be equated with complexity and the implications of these definitions in various contexts.

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Exploratory

Main Points Raised

  • Some participants propose that Kolmogorov complexity is defined by the ability to condense data into a simpler form, suggesting that higher complexity corresponds to less predictability.
  • Others argue that randomness implies maximum complexity, as a truly random sequence cannot be compressed, aligning with the Kolmogorov-Chaitin conception of complexity.
  • A participant questions whether randomness should be considered complexity, suggesting that incompressibility might indicate gibberish rather than true complexity.
  • Another viewpoint emphasizes that the definitions of complexity and information content differ from colloquial understandings, citing Shannon's definition of information content based on unpredictability.
  • One participant asserts that randomness can be eliminated through patterns within a system, proposing that randomness encompasses all possible complexities constrained within a unitary system.
  • There is a discussion about the nature of irrational numbers, particularly \(\pi\) and \(e\), and their relevance to the concepts of complexity and randomness.
  • A participant notes the similarities between Kolmogorov complexity and Shannon entropy, while also highlighting the confusion surrounding the term "entropy" in various fields.
  • Another participant philosophically connects randomness and complexity, suggesting that diversity in structures requires randomness and that patterns are essential for complexity.

Areas of Agreement / Disagreement

Participants express differing views on the definitions and implications of complexity and randomness, with no consensus reached on whether randomness equates to complexity or how these concepts should be interpreted in various contexts.

Contextual Notes

Participants highlight the limitations of definitions and the potential for confusion surrounding terms like complexity and entropy, indicating that the discussion relies on varying interpretations and assumptions.

Duhoc
This is from an article on "Quantum Frontiers."

The fundamental concept here is Kolmogorov complexity and its connection to randomness/predictability. A sequence of data bits like:

10011010101101001110100001011010011101010111010100011010110111011110

has higher complexity (and hence looks more random/less predictable) than the sequence:

10101010101010101010101010101010101010101010101010101010101010101010
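
A crude way to make this concrete is to run an off-the-shelf compressor over both strings and compare output sizes. The compressed length is only an upper bound on the (uncomputable) Kolmogorov complexity, but it already separates the two cases; a minimal Python sketch, using the strings above:

import zlib

# The two 68-character bit strings from the post, kept as ASCII bytes.
irregular = b"10011010101101001110100001011010011101010111010100011010110111011110"
regular   = b"10101010101010101010101010101010101010101010101010101010101010101010"

for name, s in [("irregular", irregular), ("regular", regular)]:
    compressed = zlib.compress(s, 9)
    # zlib's output is only a rough proxy for description length, but the
    # regular string comes out markedly shorter than the irregular one.
    print(name, len(s), "->", len(compressed))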

So is everyone agreed that this is the proper definition of complexity, i.e. the ability to condense raw data into a simpler code?
Why isn't the ability of a function to integrate other functions a measure of complexity? A car has many different parts, many different systems, as does a human being. Why aren't these systems regarded as complex? Is there a distinction to be drawn between randomness and complexity?
 
Who says these things aren't complex? Generally, randomness implies maximum complexity -- a truly random sequence by definition is one that cannot be compressed by any algorithm, and so is maximally complex according to the Kolmogorov-Chaitin conception of complexity.
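
The counting argument behind "cannot be compressed by any algorithm" is worth spelling out (a standard textbook point rather than anything specific to this thread). There are \(2^n\) binary strings of length \(n\), but the number of binary descriptions shorter than \(n\) bits is at most
\[
\sum_{k=0}^{n-1} 2^k = 2^n - 1 < 2^n ,
\]
so for every \(n\) at least one string has no description shorter than itself; more generally, fewer than a \(2^{-c}\) fraction of the length-\(n\) strings have a description shorter than \(n - c\) bits.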
 
Thanks for answering. It seems there were many views and very few replies. But is randomness complexity? Just because you can't reduce data to a shorter computer string, why does that mean it is complex? It may just mean it is gibberish or noise, as opposed to something that incorporates many functions to achieve a larger function.
 
Which, when you think about it, is exactly what the computer program is trying to do.
 
Duhoc said:
Thanks for answering. It seems there were many views and very few replies. But is randomness complexity? Just because you can't reduce data to a shorter computer string, why does that mean it is complex? It may just mean it is gibberish or noise, as opposed to something that incorporates many functions to achieve a larger function.
You will find that the formal definitions of complexity and information content don't jibe well with our colloquial understanding of these terms. For example, the Shannon information content of a string of gibberish is greater than that of a well-formed sentence in English.

Shannon based his definition on the idea that an informative message should "surprise" us, in the sense that subsequent pieces of the message should be unpredictable (i.e. any given character of the message has a low prior probability). This does make some sense -- if you know what the message is going to say before you receive it, in what sense is it informative?
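
As a rough illustration of that point, one can compare the zeroth-order (per-character) empirical entropy \(-\sum_i p_i \log_2 p_i\) of a gibberish string and an English sentence. The two example strings below are made up, and character frequencies ignore all the contextual predictability that makes English even more compressible in practice:

import math
from collections import Counter

def entropy_per_char(s: str) -> float:
    """Empirical Shannon entropy of s, in bits per character."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

english   = "the quick brown fox jumps over the lazy dog"
gibberish = "xq7#vz!kp3mw9&rj2ty8bn4cd6fh1gl5so0aeiu@^%)("

# The gibberish spreads its probability mass almost uniformly over many
# distinct symbols, so its per-character entropy comes out higher than
# the English sentence's, matching the "surprise" intuition above.
print(entropy_per_char(english))
print(entropy_per_char(gibberish))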
 
You can eliminate randomness through sequence or pattern constraint within a given system. It is safe to say that randomness is "all possible complexities" constrained within a unitary system. It is a set of non-repeating, infinite functions.

No. A computer program, no matter how random or non-repeating/noisy its output may seem, will end up as a pattern. Pi is the only mental construction that involves a sequence that will run forever.
 
What about e, the base of the natural log? Or any irrational number for that matter? What's so special about \(\pi\)?
 
Duhoc, you are making too much out of the word "complexity". Kolmogorov complexity is the length of the shortest program needed to replicate some string. There's a lot in common between Kolmogorov complexity and Shannon entropy, and there's a lot in common between Shannon entropy and the concept of entropy in physics and chemistry.
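
To make "length of the shortest program" concrete with the two strings from the opening post (a sketch only; the true minimum over all possible programs is uncomputable, so these are merely short programs, not provably shortest):

# A tiny Python program that reproduces the regular 68-character string:
print("10" * 34)

# For the irregular string, with no structure in sight, the shortest obvious
# program is essentially the string itself, quoted verbatim:
print("10011010101101001110100001011010011101010111010100011010110111011110")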

BTW, entropy is another one of those words that garners a lot of confusion.
 
bapowell said:
What about e, the base of the natural log? Or any irrational number for that matter? What's so special about \(\pi\)?

OT: yes, it should be irrational numbers in general. If a number is "useful" as an integral part of an approximation, then it is special to someone. Anyhow, there is an asymmetry between repeating and non-repeating expansions, and it tells you something about the nature of dynamics in terms of mathematical functions, and similarly in nature. There are more non-repeating, non-terminating decimals than all of the terminating decimals and non-terminating repeating decimals put together. An irrational number in a system is the only dynamic function that can run infinitely and undirected, containing all possible numerical backdrops for any constraint, imaginable or not. That's how I view reality in terms of the relational aspects of math. You can't have diversity of structures (complexity) without randomness; you can't have structures without pattern; and you can't have temporal constraint (change) without asymmetry. Compare repeating carbon bonds that form a temporally less complex structure, such as diamond, with a more complex system of bonds like us. (Sorry for being philosophical.)
 
