# Unified Theory of Chaos?

I've been doing a lot of reading on chaos lately. Things like the butterfly effect, the logistic equation, the Lorenz attractor, etc. It's all great and wonderful but I'm finding things very un-unified.

I understand that certain nonlinear systems can sometimes exhibit chaotic behavior, but without getting into the mathematical details, it seems that no one can really explain why. The chaotic functions themselves all seem arbitrary and very different from one another. They produce different patterns, with different properties, and sometimes fail to produce chaos at all. It appears to me (the uninitiated observer) that all the chaotic functions we know of were stumbled upon by accident, or derived from complex mathematical formulas.

I suppose what I'm asking is, does there exist a generalized formula for chaos? One that includes in it the abilities of all the others? Or at least a uniting principle in the construction of chaotic functions? One that I could use to devise my own functions for my own needs?

I'm sorry if I'm being vague. It's hard for me to explain what I'm asking...

atyy
I don't think there is a general theory. "Chaos" can be seen in low and high dimensional attractors. There are also things like "transient chaos" and "stable chaos" which have nothing to do with attractors, but are properties of transients.

One place where the theory is organized is Hamiltonian systems, which can be divided into integrable and non-integrable systems; the major result there is the Kolmogorov–Arnold–Moser theorem.

Another is the classification of the "roads to turbulence" found on p. 647 of http://users-phys.au.dk/fogedby/chaos/Eckmann81.pdf .

A good general reference is http://chaosbook.org/ .

Pythagorean
Gold Member
Chaos is not a theory, it's the name of a behavior. And it can be quantified by the Lyapunov exponent (a measure of how sensitive the system is to its starting conditions).

I wouldn't say no one can explain why. All physical systems are different, with different initial conditions.

The case with chaos is that the systems are so sensitive to initial conditions that humans have to be careful with their linear-based assumptions about "significance" when talking about the contribution of a behavior to a phenomenon.

Just out of curiosity, what do you mean by "un-unified"? My own complaint would be that many textbooks on chaos have a less than satisfactory definition of what it is in the first place. Three old criteria for chaos in an iterative mapping f:X --> X are the following,

1) there exists a dense set of periodic orbits.
2) the mapping is "topologically mixing".
3) the mapping displays "sensitivity to initial conditions."

where the quoted terms are defined relatively rigorously depending on what kind of space you are looking at. "Sensitivity to initial conditions" basically means that for any point in the domain, there exists another point arbitrarily close to it whose orbit will (after sufficient time, or enough iterations of the mapping) end up farther than some fixed distance "b" from the orbit of the first point.
You can look up the definitions of "topologically mixing" and "dense" on Wikipedia (they are more abstract definitions and it wouldn't be worth repeating them here; intuitively, a mapping that is topologically mixing will send points from any non-empty open set all over the domain of the mapping in some non-trivial distribution).
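To make "sensitivity to initial conditions" concrete, here is a minimal sketch (my own illustration, not part of the definitions above) using the logistic map x → 4x(1−x):

```python
# Two trajectories of the logistic map x -> 4x(1-x), started 1e-10 apart.
def f(x):
    return 4.0 * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10
gaps = []
for n in range(60):
    x, y = f(x), f(y)
    gaps.append(abs(x - y))

# The separation grows roughly exponentially until it saturates at the
# size of the domain itself, despite the 1e-10 starting difference.
print(max(gaps))
```

After only a few dozen iterations the two trajectories disagree completely, which is the "finite distance b" of the definition showing up in practice.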

The motivation for studying chaotic systems comes in part from the study of differential equations. For instance, in a continuous chaotic system you cannot "succinctly" label the solutions the way you can when solving linear differential equations. More complexity is apparent since some of the governing differential equations may be locally non-invertible for all approximation time intervals (i.e. two close but distinct points in the past get mapped to the same point, usually on some negligible set; however, there are many negligible sets that are dense in the sets that contain them: the rational numbers, for instance, have negligible "volume" with respect to the Lebesgue measure and are dense in the reals).

Last edited:
Just out of curiosity, what do you mean by "un-unified"?

Sure, the definition of chaos is one example of what I'm talking about. Dense periodic orbits, topological mixing, and sensitivity to initial conditions are all "symptoms" of chaos, but they do little to explain what it actually is. I'd prefer a more holistic definition that gets to the fundamental roots of chaos.

Whenever I read an article or paper on chaos, they always talk about specific manifestations of chaos, such as the logistic function, the Lorenz attractor, etc. These are indeed functions which exhibit chaos, but they do not represent chaos itself. Either I am, or everyone else is, missing the big picture. Hence "un-unified".

To draw a comparison:

A tree has life, but not all life is trees.
Humans are intelligent, but not all intelligence is human.
The logistic function is chaotic, but not all chaos is the logistic function.

Does there exist such a thing as a universal equation for chaos? One from which all other chaotic functions can be derived?

You may get a "universal equation" for chaos if a GUT (Grand Unified Theory) is ever identified.

I think of chaos as what happens when God is not paying attention.

Pythagorean
Gold Member
Does there exist such a thing as a universal equation for chaos? One from which all other chaotic functions can be derived?

The first question sounds silly; "universal" is a suspicious term in the sciences. We might be able to get somewhere with your second question. You want to design a chaotic system?

A system is chaotic if it has a positive maximal Lyapunov exponent. The maximal Lyapunov exponent λ can be found by (formula from Wikipedia):

λ = lim (t → ∞) lim (|δZ₀| → 0) (1/t) ln( |δZ(t)| / |δZ₀| )

You can test this numerically (approximately) by perturbing the system by the tiny amount δZ₀. That is, you take two copies of the system that start in the exact same location, then move one of them a very tiny distance away from the starting condition of the other. Some time later, you measure the distance between the trajectories of the two systems through phase space. If, under these tiny perturbations, the trajectories keep on diverging exponentially, the system has a positive Lyapunov exponent. In a high-dimensional case, the systems can diverge wildly, leading to rampant bifurcation and a qualitative change in results from these small perturbations.
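That perturb-and-measure recipe can be sketched numerically for the logistic map at r = 4 (my own sketch, not from the post; the renormalization step keeps the perturbation small so the growth stays in the linear regime):

```python
import math

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def max_lyapunov(x0=0.2, d0=1e-9, n_steps=5000, transient=100):
    """Estimate the maximal Lyapunov exponent by tracking a tiny
    perturbation d0 and renormalizing it after every iteration."""
    x = x0
    for _ in range(transient):          # discard the transient
        x = logistic(x)
    y = x + d0                          # perturbed twin trajectory
    total = 0.0
    for _ in range(n_steps):
        x, y = logistic(x), logistic(y)
        d = abs(y - x)
        total += math.log(d / d0)       # local stretching this step
        y = x + d0 * (y - x) / d        # rescale the separation back to d0
    return total / n_steps

print(max_lyapunov())  # converges toward ln 2 ~ 0.693 for r = 4
```

A positive result (here roughly ln 2, the known value for the fully chaotic r = 4 logistic map) is exactly the signature of chaos described above.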

So a mathematician would probably be able to design and derive an equation satisfying the conditions above quite easily.

Or... You can choose to understand phase space as a geometric object and map a bifurcation diagram that you think will yield the results you want, then design the nullclines from there, and finally find some equations that will match the nullclines you choose.
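The bifurcation-diagram route can likewise be sketched by sampling the map's long-run behavior across parameter values (my illustration using the logistic map; a real diagram would plot these points against r):

```python
def attractor_sample(r, x0=0.5, transient=500, keep=100):
    """Iterate the logistic map past its transient, then collect
    points on (an approximation of) its attractor."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    pts = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        pts.append(x)
    return pts

# A 2-cycle at r = 3.2 versus a chaotic band at r = 3.9:
print(len({round(p, 6) for p in attractor_sample(3.2)}))  # 2
print(len({round(p, 6) for p in attractor_sample(3.9)}))  # many distinct values
```

Counting the distinct long-run values at each r is the raw data behind the familiar period-doubling picture.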

I'm getting the feeling this has gone above my head.

Alright, I understand the concept and function of the Lyapunov exponent and I understand that it can be used as a measure of chaos. However, like you said, it would require a mathematician to derive a chaotic equation. If chaotic equations are supposed to be simple, and they are supposed to exist everywhere nonlinear functions are found, then why is it so difficult to derive a new chaotic equation? It just seems like a lot of guesswork to me.

Pythagorean
Gold Member

It is a lot of guesswork the way you ask the question. When we try to actually model real systems in nature and they turn out to be chaotic, it is not generally intentional that they are chaotic. When Hodgkin and Huxley derived the equations describing the currents of the squid axon, I doubt they had chaos in mind, but people interested in chaos in nature have picked up such equations and played with them. When there are a large number of such chaotic systems interacting (i.e. a "network"), you really have just one giant, high-dimensional, chaotic system; there are a lot of parameter regimes you can spend a lot of time investigating, and still not find them all. You can connect the elements of the network in different ways, populate them with different distributions of parameters, add more of them, etc., and find the different behaviors they exhibit.

But as a scientist, I never had to guess the equations; "chaotic" is just a word to classify a class of equations that I use and, to some extent, the complexity of their behavior.

Does there exist such a thing as a universal equation for chaos? One from which all other chaotic functions can be derived?

Well, the way I see it is this: according to the (relatively concise) definition above, a mapping is chaotic if and only if it satisfies those three criteria (actually, only two specific criteria need to be satisfied in general-- the third is implied). The only way to "test" whether a function is chaotic is to prove that it either satisfies all of the properties or fails one of them.

I'm not sure if I know what you mean by a "universal equation," but it sounds like you're trying to find a way of "indexing" all existing chaotic functions according to some continuous transformation rule (i.e. a chaotic function g(x) is equal to M(k,f(x)), where k is some parameter that can be adjusted to act on f(x)). If it were possible to classify all chaotic functions so simply, however, they wouldn't be nearly as interesting as they are.
Usually we can make statements about "classes" of functions (for instance, the logistic map can be thought of as a simple transformation of the tent map, so they are part of the same class).
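That tent-map relationship can be checked directly. The standard conjugacy (not stated in the post, but well known) is h(y) = sin²(πy/2), which turns the tent map T into the r = 4 logistic map L, i.e. L(h(y)) = h(T(y)):

```python
import math

def L(x):            # logistic map at r = 4
    return 4.0 * x * (1.0 - x)

def T(y):            # tent map with slope 2
    return 2.0 * y if y <= 0.5 else 2.0 - 2.0 * y

def h(y):            # the conjugacy between them
    return math.sin(math.pi * y / 2.0) ** 2

for y in (0.1, 0.37, 0.5, 0.82):
    assert abs(L(h(y)) - h(T(y))) < 1e-12

print("L(h(y)) == h(T(y)) at the sampled points")
```

Because the two maps are conjugate, the easily analyzed tent map transfers its chaotic properties to the logistic map; that is exactly the kind of "class" statement made above.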