The theory of everything help

  • Thread starter Duhoc
  • #1

Main Question or Discussion Point

This is from an article from "Quantum Frontiers."

The fundamental concept here is Kolmogorov complexity and its connection to randomness/predictability. A sequence of data bits like:

10011010101101001110100001011010011101010111010100011010110111011110

has higher complexity (and hence looks more random/less predictable) than the sequence:

10101010101010101010101010101010101010101010101010101010101010101010

So is everyone agreed that this is the proper definition of complexity, i.e., the degree to which raw data can be condensed into a simple code?
Why isn't the ability of a function to integrate other functions a measure of complexity? A car has many different parts and many different systems, as does a human being. Why aren't these systems regarded as complex? Is there a distinction to be drawn between randomness and complexity?
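The contrast between the two bit strings above can be made concrete. Kolmogorov complexity itself is uncomputable, but the length of a compressed encoding gives a rough, computable upper bound. A minimal sketch (mine, not from the thread) using Python's standard zlib compressor:

```python
import zlib

# The two 68-bit sequences from the post, stored as ASCII strings.
irregular = "10011010101101001110100001011010011101010111010100011010110111011110"
patterned = "10101010101010101010101010101010101010101010101010101010101010101010"

def compressed_len(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a crude upper bound
    on Kolmogorov complexity -- more structure means a shorter encoding."""
    return len(zlib.compress(s.encode(), 9))

# The patterned sequence compresses to fewer bytes than the irregular one,
# matching the intuition that it is "simpler" / less random.
print(compressed_len(patterned), compressed_len(irregular))
```

Note that a general-purpose compressor only upper-bounds the true Kolmogorov complexity; no algorithm can compute it exactly.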
 

Answers and Replies

  • #2
bapowell
Science Advisor
Insights Author
Who says these things aren't complex? Generally, randomness implies maximum complexity -- a truly random sequence by definition is one that cannot be compressed by any algorithm, and so is maximally complex according to the Kolmogorov-Chaitin conception of complexity.
 
  • #3
Duhoc
Thanks for answering. It seems there were many views but very few replies. But is randomness complexity? Just because data can't be reduced to a shorter computer string, why does that mean it is complex? It may just mean it is gibberish or noise -- as opposed to incorporating many functions to achieve a larger function.
 
  • #4
Duhoc
Which, when you think about it, is exactly what the computer program is trying to do.
 
  • #5
bapowell
Duhoc said:
But is randomness complexity? Just because you can't reduce data to a computer string, why does that mean it is complex? It may just mean it is gibberish or noise.
You will find that the formal definitions of complexity and information content don't jibe well with our colloquial understanding of these terms. For example, the Shannon information content of a string of gibberish is greater than that of a well-formed sentence in English.

Shannon based his definition on the idea that an informative message should "surprise" us, in the sense that subsequent pieces of the message should be unpredictable (i.e. any given character of the message has a low prior probability). This does make some sense -- if you know what the message is going to say before you receive it, in what sense is it informative?
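As an illustration of this point (my sketch, not from the thread), one can compare the empirical per-character entropy of an English sentence with that of gibberish. Under a simple character-frequency model, the gibberish carries more Shannon information per symbol precisely because each character is less predictable:

```python
import math
from collections import Counter

def entropy_per_char(s: str) -> float:
    """Empirical Shannon entropy in bits per character, using the
    string's own character frequencies as the probability model."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

english = "information should surprise us or it tells us nothing new"
gibberish = "xq7vzp9rk2mw8jd4hf6bn1ty0gc5ls3aeiouXQZWKVJ"

# English re-uses a few letters heavily, so each character is more
# predictable (lower entropy) than in the gibberish string, where
# nearly every character is distinct.
print(entropy_per_char(english))
print(entropy_per_char(gibberish))
```

This is only a first-order model (single-character frequencies); real English is even more predictable once longer-range structure is taken into account.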
 
  • #6
You can eliminate randomness through sequence or pattern constraints within a given system. It is safe to say that randomness is "all possible complexities" constrained within a unitary system: a set of non-repeating infinite functions.

No. A computer program, no matter how random or non-repeating/noisy its output may seem, will end up as a pattern. Pi is the only mental construction that involves a sequence that will run forever.
 
  • #7
bapowell
What about e, the base of the natural log? Or any irrational number for that matter? What's so special about [itex]\pi[/itex]?
 
  • #8
D H
Staff Emeritus
Science Advisor
Insights Author
Duhoc, you are making too much out of the word "complexity". Kolmogorov complexity is a measure of the length of the shortest program needed to replicate some string. There's a lot in common between Kolmogorov complexity and Shannon entropy, and there's a lot in common between Shannon entropy and the concept of entropy in physics and chemistry.

BTW, entropy is another one of those words that garners a lot of confusion.
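To make the "shortest program" idea concrete (my sketch, not D H's), note that the patterned sequence from post #1 can be generated by a program far shorter than the sequence itself, whereas no obviously shorter description of the irregular sequence is apparent:

```python
# Eight characters of program text ("10" * 34) reproduce a 68-character
# patterned sequence, so its Kolmogorov complexity is small.
patterned = "10" * 34

# For the irregular sequence no short generator is apparent, so the best
# known "program" is roughly the literal data itself.
irregular = "10011010101101001110100001011010011101010111010100011010110111011110"

print(len(patterned))  # 68
print(len(irregular))
```

The caveat is that "no short generator is apparent" is not a proof: Kolmogorov complexity is uncomputable, so one can never certify that a given string has no shorter description.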
 
  • #9
328
28
bapowell said:
What about e, the base of the natural log? Or any irrational number for that matter? What's so special about [itex]\pi[/itex]?

OT: yep, it should be irrational numbers. If it's useful and an integral part of approximation, then it is special to someone. Anyhow, there is a certain asymmetry between repeating and non-repeating functions. It tells you something about the nature of dynamics in terms of mathematical functions, and similarly in nature. There are more non-terminating, non-repeating decimals than any multitude of terminating decimals and non-terminating repeating decimals. Irrational numbers are the only dynamic functions in a system that can run infinitely and undirected, and that can contain all possible numerical backdrops for any constraint imaginable or unimaginable. That's how I view reality in terms of the relational aspects of math. You can't have diversity of structures (complexity) without randomness. You can't have structures without pattern, and you can't have temporal constraint (change) without asymmetry. Compare repeating carbon bonds that form a less complex structure, such as diamond, with a more complex system of bonds, like us. (Sorry for being philosophical.)
 
