low entropy humans?
Hi, Flux,
Flux said:
It would seem to me that the universe is made up of certain Low Entropies here and there that could potentially turn into allot more Low Entropy like Humans for instance becoming more intelligent and organizing the universe.
Whoa! Back up a minute.
Humans are hardly "organizing the universe". And a low entropy human (in the sense most commonly used in physics) is probably more like a frozen corpse than a functional cosmologist...
Lurking in this I sense the idea that "higher entropy states exhibit, in some sense, greater disorganization", which, one might think, implies that conversely "more organized states should have lower entropy". But to try to make sense of this, you need to know what you mean by "state", "entropy of a state", and "organization of a state". And once you try to become precise, it all gets a lot more complicated--- and a lot more interesting!
In fact, there are many possible definitions of "entropy", and not all are equivalent. This is particularly true when you start mixing up biology with physics.
Some quite different looking definitions of "entropy" do turn out to have close relationships under various circumstances. For example, Shannon entropy
H({\mathcal A}) = -\sum_{j=1}^r \, \mu(A_j) \, \log \mu(A_j)
can be formulated (following Kolmogorov) in terms of a "finite measurable partition" {\mathcal A}, i.e. X = \uplus_{j=1}^r \, A_j where the A_j are measurable subsets of a probability space (X,\mu). Another, Boltzmann entropy, is formulated in terms of a finite partition of a set, namely as the log of the obvious multinomial coefficient, i.e. the size of the orbit under the natural action of the symmetric group of the underlying set. Yet these turn out to be closely related quantities. Indeed, as von Neumann pointed out to Shannon, by a strange historical accident Shannon entropy originally arose in statistical physics as an approximation to Boltzmann entropy, even though most now agree that if history were logical, information theory could and should have predated twentieth century physics. Also falling into this group of close relatives is another important entropy from dynamical systems theory, the topological entropy.
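To make the Boltzmann/Shannon relationship concrete, here is a minimal numerical sketch in Python (the occupation numbers are hypothetical, chosen purely for illustration): by Stirling's approximation, the log of the multinomial coefficient, divided by the number of elements, approaches the Shannon entropy of the corresponding frequencies.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_j p_j log p_j (natural log) of a finite partition's measures."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy(counts):
    """Log of the multinomial coefficient n! / (n_1! ... n_r!), computed via log-gamma."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(k + 1) for k in counts)

# Hypothetical occupation numbers: n = 10000 elements distributed over r = 4 cells.
counts = [5000, 2500, 1500, 1000]
n = sum(counts)
probs = [c / n for c in counts]

# By Stirling's approximation, (1/n) * (Boltzmann entropy) -> Shannon entropy as n grows.
print(shannon_entropy(probs))         # about 1.208
print(boltzmann_entropy(counts) / n)  # close to the same value
```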
But some similar looking definitions turn out to capture rather different intuitive notions; for example, compare the notion of Shannon entropy--- I swear I'll scream if anyone calls this "Shannon-Wiener entropy", or even worse, "Shannon-Weaver entropy"--- with the notion of "Kullback-Leibler divergence" (aka "cross-entropy", aka "discrimination", etc.)
D({\mathcal A}, \mu | \nu) = \sum_{j=1}^r \, \mu(A_j) \, \log \left( \mu(A_j)/\nu(A_j) \right)
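For contrast, a quick sketch (Python again; the two measures below are made up) shows how the divergence differs in character from the entropy: it compares two measures on the same partition rather than quantifying the "spread" of a single one, and it vanishes only when the two measures agree.

```python
import math

def kl_divergence(mu, nu):
    """D(mu || nu) = sum_j mu_j log(mu_j / nu_j), natural log."""
    return sum(p * math.log(p / q) for p, q in zip(mu, nu) if p > 0)

mu = [0.5, 0.25, 0.15, 0.10]   # made-up measure on a 4-cell partition
nu = [0.25, 0.25, 0.25, 0.25]  # uniform comparison measure on the same partition

print(kl_divergence(mu, nu))   # strictly positive, since mu != nu
print(kl_divergence(mu, mu))   # 0.0: the divergence vanishes only when the measures agree
```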
Some definitions have few if any known mathematical relations, but appear to be trying to capture somewhat related intuitive ideas. And some appear to have little relation to each other.
(Similar remarks hold for "state" and "organization".)
Let me try to elaborate a bit on my claim that biological notions of "complexity" might not be related in any simple way to the notions from dynamical systems theory/information theory which I mentioned above.
There are many different definitions of entropy used in statistical mechanics. These certainly cannot all define "the same quantity", if for no other reason than that they are not defined on the same domain; in addition, these quantities are often numerically different even when more than one is defined, hence they are distinct.
These entropies belong to the group clustering around Shannon entropy which I very roughly described above, and they do to some extent conform to the slogan "higher entropy states exhibit, in some sense, greater disorganization". As others have already pointed out, however, this should be taken to refer to a "closed system", and the Earth is not a closed system; rather, we have an energy flux Sun -> Earth -> deep space. But the point I am trying to get at here is that the intended sense of "organization" is probably different from what you had in mind when you spoke of human activity allegedly "lowering entropy".
Now think about this: how much information does it take to define a bacterium? A redwood tree? A human? More than a decade ago I used to argue with biologists that the then common assumption that the complexity of an organism is simply something like "the Shannon entropy of its genome" is highly questionable. From what I've already said you can probably see that this isn't even well-defined as stated, but there are reasonable ways to fix it. The real problem is: is this Shannon entropy an appropriate measure of "biocomplexity"?
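To make that concern concrete: one common way to make "the Shannon entropy of a genome" well-defined is to take the entropy of the empirical symbol frequencies. A minimal Python sketch along those lines (the toy "genome" and the choice of a single-symbol model are my own illustrative assumptions, not anything from the literature) already shows the trouble: a string of uniformly random symbols sits at or near the maximum of 2 bits per symbol, so this number does not distinguish a genome from meaningless noise, which is hardly what anyone means by biocomplexity.

```python
import math
import random
from collections import Counter

def empirical_entropy(seq):
    """Shannon entropy (bits per symbol) of the empirical single-symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A made-up toy "genome" versus a uniformly random string of the same length.
toy_genome = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
random.seed(0)
random_string = "".join(random.choice("ACGT") for _ in range(len(toy_genome)))

print(empirical_entropy(toy_genome))     # close to the 2 bit/symbol maximum
print(empirical_entropy(random_string))  # also close to 2: random noise is not distinguished
```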
I struggled to explain my expectation that Shannon entropies are inadequate to capture biological intuition about the kind of "complexity" which often interests, let us say, evolutionary biologists. My point then as now was that depending upon context, there are many things one might mean by "biocomplexity" or "biotic organization" and there is no reason to expect that these notions must all be measured by the same mathematical quantity. Quite the opposite--- one should expect quite different theories to emerge once one has found appropriate definitions.
For example, perhaps without consciously realizing it, many people think of complexity as superadditive, which means simply that "the complexity of the whole is greater than the sum of the complexities of its parts". But Shannon entropy (and Boltzmann entropy) is subadditive: "the entropy of the whole is less than the sum of the entropies of its parts". (Roughly speaking.) This is a feature which these entropies share with classical Galois theory (the lemma we need is a triviality concerning indices of subgroups, which is sometimes attributed to none other than Henri Poincaré), and this is not a coincidence.
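If it helps to see the contrast numerically, here is a minimal sketch of subadditivity for Shannon entropy (Python; the joint distribution below is made up purely for illustration): for any joint distribution, the entropy of the whole is at most the sum of the entropies of the marginals, with equality exactly when the parts are statistically independent.

```python
import math

def H(probs):
    """Shannon entropy (natural log) of a probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Made-up joint distribution p(x, y) on a 2 x 2 grid (rows: values of x, columns: values of y).
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_x = [sum(row) for row in joint]        # marginal distribution of x
p_y = [sum(col) for col in zip(*joint)]  # marginal distribution of y
h_whole = H([p for row in joint for p in row])

print(h_whole)          # entropy of the "whole" (about 1.193)
print(H(p_x) + H(p_y))  # sum over the "parts" (about 1.386) -- never smaller than the whole
```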
At the level of a single organism, I also pointed out that biologically speaking, a genome by itself does not seem to define a typical modern organism (not even a virus), because rather complicated "cellular machinery" is required to transcribe and translate the DNA (or RNA) into protein. If we admit that our mathematical theory should not presume to accomplish anything unnatural, it follows that our theory should not "define" the biocomplexity of an organism in terms of the genome alone. Presumably one must also take account of the "overhead" associated with having a working instance of all that complex cellular machinery before you can even start transcribing, i.e. "living" (at the level of a cell).
And as you have probably noticed, defining the complexity of a biosphere is probably a rather different enterprise from defining the complexity of a single organism!
Flux said:
Right now we view the universe as mainly high Entropy and moving toward that but how does intelligent life factor in. Life that seemingly gets more and more organized without and t logical explanation since Entropy tends toward High Entropy.
I also used to caution biologists against assuming that under natural selection a typical biosphere should become more and more "complex". For one thing, this doesn't mean much if one hasn't offered a well-motivated mathematical theory with a notion of complexity which can be applied to construct mathematical models of evolving biospheres. For another, there is really no reason to expect a monotonic increase of "biotic complexity". Much earlier, the noted biologist George C. Williams expressed some similar caveats.
BTW, Claude Shannon's Ph.D. thesis applied abstract algebra to population genetics! (In his highly original master's thesis, he had previously applied mathematical logic to found the theory of switching circuits.)
Flux said:
The Big Bang theory says that the universe is going toward High Entropy from Low Entropy,
Again, it's not nearly that simple. In fact, I have often said that I know of no subject more vexed in modern science. Even worse, with the rise of political movements masquerading as fringe science, such as "intelligent design", this already vexed area has been further burdened with unwanted (and entirely spurious) political baggage.
Flux said:
but there are allot of clues that there is allot of low Entropy that evolves toward lower Entropy like on Earth with animals and humans that could possibly overtake the mainly high entropy through the universe..
A good place to begin reading might be an old and often-cited essay by Freeman Dyson on the notion of "heat death" and its malign implications for thought processes rather generally defined. Some of the specifics have been overtaken by subsequent revolutions in cosmology, but this still makes excellent and thought-provoking reading. Much recent work traces its roots back to this essay, or even earlier. See http://prola.aps.org/abstract/RMP/v51/i3/p447_1
Curious readers may also consult Peter Walters, An Introduction to Ergodic Theory, Springer, 1982, or Karl Petersen, Ergodic Theory, Cambridge University Press, 1983, for the ergodic-theory formulation of Shannon entropy used above and its relationship to topological entropy. Compare these with Cover and Thomas, Elements of Information Theory, Wiley, 1991. (There are many excellent books on ergodic theory and on classical information theory, but these are perhaps the best for the purpose at hand.)