atyy said:
It's a very interesting question - how can Maxwell's demon be exorcised?
Nowadays, I hear that Landauer's principle has done it in. Thermodynamics only applies to systems with very large numbers of particles. So a demon with a finite amount of memory must eventually erase some memory, in order to keep on cleaning his room. In erasing the memory, the demon will cause entropy to increase.
Interestingly, "[URL [/URL] lists controversy about how far the principle can be generalized. These articles by http://arxiv.org/abs/quant-ph/0203017" look like good reads.
Yes, I find the limits of the second law interesting myself. I keep seeing the state of a system conflated with knowledge of the state of a system in subtle ways, especially when entropy is involved, but sometimes in other situations as well.
Consider the limits imposed by the 3rd law alone. It implies a minimum entropy, such that at some point the total entropy can be reduced no further; instead, one subsystem's entropy increases to the gain of another's. Maximal entropy would then be defined as a state in which those losses and gains follow a Poisson distribution.
In terms of computer registers: suppose that, instead of running out of registers, the particular register overwritten by a new piece of information becomes unpredictable, but correlated with that new piece of information. That throws a bit of a loop into J. Bub's acquisition cost in the linked paper, because acquisition and erasure become consonant. So if you start from a blank state, the system can gain information up to the point where information acquisition and erasure balance out, as in the sketch below. To gain information beyond that point, it must be acquired through lucky encounters with a second system of slightly lower entropy.
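Here is a minimal toy sketch of that saturation point (my own illustration, not from Bub's paper; all parameter values are arbitrary assumptions): a finite bank of registers where each newly acquired datum overwrites a uniformly random register. The stored information climbs quickly at first, then plateaus once each acquisition erases, on average, one older datum.

```python
import random

def saturating_memory(num_registers=64, steps=500, seed=0):
    """Toy model: each step acquires one new datum and writes it to an
    unpredictable (random) register, erasing whatever was there before.
    The count of filled registers saturates once acquisition and erasure
    balance out, at num_registers."""
    rng = random.Random(seed)
    registers = [None] * num_registers
    filled = []
    for t in range(steps):
        registers[rng.randrange(num_registers)] = t  # overwrite = erase + acquire
        filled.append(sum(r is not None for r in registers))
    return filled

history = saturating_memory()
print(history[0], history[99], history[-1])  # fast early gains, then a plateau near 64
```

Past the plateau, holding any more information than that would require correlations with a second system, which is the "lucky encounters" point above.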
So long as there is some minimal entropy below which the total entropy can be reduced no further, there is a class of algorithm that allows the entropy of subsystems to decrease fractally, yet gives no one system any specially advantaged information for locally reducing its own entropy. In biology this is evolution. In physics it is framed by the large number hypothesis. Consider Big Bang nucleosynthesis, which entails superhot conditions for rapid particle production. Given an enormous enough period of time, the same thing could in principle occur at lower average temperatures, requiring only very localized, highly unusual events to produce the hot conditions needed. I am not making an argument against the Big Bang; essentially it is the same process, resulting in particle production that indefinitely locks in low-entropy subsystems once they occur.
Except that, by sheer weight of numbers, the opportunity to further decrease entropy at the individual-system level is no better than what the usual thermodynamic laws define. But with enough subsystems and a minimum entropy there will be lottery winners, so long as a finite band gap (Planck's constant > 0) exists for those lottery-winning subsystems to hang onto their gains.
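A minimal sketch of that "lottery winner" picture (the numbers below are arbitrary assumptions of mine): give each of many subsystems a tiny per-step chance of a rare low-entropy fluctuation, and let winners lock in their gain, standing in for a finite band gap that keeps the win from immediately re-thermalizing. Individually the odds are hopeless, but across enough subsystems winners accumulate steadily.

```python
import numpy as np

rng = np.random.default_rng(1)
num_subsystems = 100_000   # assumed population of subsystems
steps = 1_000              # assumed number of time steps
p_win = 1e-5               # assumed per-step chance of a rare fluctuation

locked = 0                 # subsystems that have locked in a low-entropy win
for _ in range(steps):
    # how many of the still-unlocked subsystems win this step
    wins = rng.binomial(num_subsystems - locked, p_win)
    locked += wins         # winners hang on to their gains

# Expectation: num_subsystems * (1 - (1 - p_win)**steps), roughly 995 winners here
print(locked)
```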
When we talk about information and what energy we can reliably derive from a low-entropy source, this game with the large number hypothesis becomes worthless, because we cannot wait on a confluence of conditions to gain once out of countless events; the failures eat our gains. So we, as individual systems, can only talk about the usefulness of entropy in terms of the information we have about a system, not the information contained in that system. Our physical laws are also geared around available information in such a way that the actual information contained in a system cannot even be ascribed an ontological reality. This can make entropy look like a moving target whenever we switch from one method of coding information to another.
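To put a hedged number on "the failures eat our gains" (the figures below are illustrative assumptions, not from the thread): suppose each probe of a subsystem forces the eventual erasure of at least one bit of memory, at Landauer's minimum cost of kT ln 2, while a rare win with probability p returns work W. Unless p*W exceeds that per-probe cost, the expected net work is negative.

```python
import math

k_B = 1.380649e-23                   # Boltzmann constant, J/K
T = 300.0                            # assumed temperature, K
erase_cost = k_B * T * math.log(2)   # Landauer bound per erased bit, ~2.9e-21 J

p_win = 1e-9                         # assumed probability of a useful fluctuation per probe
W_win = 1e-15                        # assumed work recovered on a win, J

net_per_probe = p_win * W_win - erase_cost
print(f"expected net work per probe: {net_per_probe:.2e} J")  # negative with these numbers
```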
That is how I conceptualize the situation anyway: entropy is always perfectly valid in terms of defining what I, or any particular system, can obtain to decrease local entropy, while doing so increases the total efficiency of the entropy production that brings the lottery winners back into thermal equilibrium. It grows in a manner similar to a food web in ecology. However, I have seen pet theories roughly based on this kind of thinking that were patently absurd. I see it as merely a product of the large number hypothesis, of finite fundamental band gaps (Planck) or limits on the total entropy, and of increases in the efficiency of returning the system to equilibrium at all scales.