## Main Question or Discussion Point

This is from an article from "Quantum Frontiers."

The fundamental concept here is Kolmogorov complexity and its connection to randomness/predictability. A sequence of data bits like:

10011010101101001110100001011010011101010111010100011010110111011110

has higher complexity (and hence looks more random/less predictable) than the sequence:

10101010101010101010101010101010101010101010101010101010101010101010
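Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a rough, runnable proxy: the harder a string is to compress, the higher the complexity estimate. A minimal sketch using Python's standard `zlib` (the variable names and the use of zlib as the stand-in compressor are illustrative choices, not part of the original article):

```python
import zlib

# Compressed length in bytes: a crude upper-bound stand-in for
# Kolmogorov complexity. (True Kolmogorov complexity is uncomputable;
# any real compressor can only over-estimate it.)
def proxy_complexity(s: str) -> int:
    return len(zlib.compress(s.encode()))

irregular = "10011010101101001110100001011010011101010111010100011010110111011110"
regular = "10" * 34  # 1010...10, same length (68 bits)

print(proxy_complexity(irregular))  # larger: little structure to exploit
print(proxy_complexity(regular))    # smaller: "repeat '10' 34 times"
```

The periodic string compresses to far fewer bytes because its whole description is "repeat `10` 34 times," which is exactly the intuition behind calling it low-complexity.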

So is everyone agreed that this is the proper definition of complexity, i.e., that complexity is measured by how far raw data resists being condensed into a short code?
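For reference, the standard formal definition (assuming some fixed universal machine $U$; the notation is standard, not from the article itself): the Kolmogorov complexity of a string $x$ is the length of the shortest program that prints it,

$$K_U(x) = \min \{\, |p| : U(p) = x \,\}.$$

A string is called (algorithmically) random when $K_U(x)$ is close to $|x|$, i.e., when it admits no description shorter than itself.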

Why isn't the ability of a system to integrate many other functions a measure of complexity? A car has many different parts and many different subsystems, as does a human being. Why aren't such systems regarded as complex? Is there a distinction to be drawn between randomness and complexity?