AdrianMay
I'm tending to think that the 2nd law honestly doesn't apply to every conceivable system. Why should it? It says that we expect systems to go from less likely states to more likely ones, but it was all worked out in the context of gases, which exhibit vast numbers of microstates and blindly wandering molecules. That means we can get away with using very naive probability theory. Why should that be generalisable to computers with very deliberate wiring and small numbers of states? Not micro-, not macro-, just states.
A computer like that doesn't have anything we can *relevantly* identify as temperature, and why should it? We can have TdS+pdV systems that don't need FdL, or TdS+FdL without pdV, so why not pdV+FdL without any mention of temperature or entropy? A computer at 0 Kelvin would probably work rather well. If the program is basically cyclic, there is no accumulation of entropy in its logical state. If you want to claim that it's oozing entropy, you'd have to mean that it dissipates heat somewhere in the universe, but is there really any prospect of finding a *mechanical* reason for putting a numerical limit on how much heat it has to dissipate? Maybe it runs on neutrinos or something.
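For scale, the usual candidate for such a numerical limit is Landauer's bound: at least kT·ln 2 of heat per erased bit. A quick back-of-envelope sketch (my framing, not anything argued above) shows both how tiny it is at room temperature and that it vanishes as T goes to zero, which is what makes the 0 Kelvin case interesting:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def landauer_limit_joules(temp_kelvin: float, bits_erased: float) -> float:
    """Minimum heat dissipated by erasing `bits_erased` bits at temperature T,
    per Landauer's principle: Q >= k*T*ln(2) per bit."""
    return K_B * temp_kelvin * math.log(2) * bits_erased

# Erasing one bit at room temperature (300 K):
q = landauer_limit_joules(300, 1)
print(q)  # about 2.87e-21 J -- and the bound goes to zero as T -> 0
```

Note this bound only bites on *erasure*; a computation that never discards information is not charged anything by it, which is exactly the loophole the cyclic-program argument is poking at.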
What does "likely" mean anyway? In thermodynamics it means microstates per macrostate, but we seem to have agreed that macrostates are in the eye of the beholder. There's been no discussion of conditional or dependent probabilities. The whole treatment of probability has been restricted to what dumb gases need.
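To pin down what "microstates per macrostate" buys you in the dumb-gas case, here's a minimal toy of my own construction: N two-state particles, where the macrostate is the count on the left and a microstate is the full arrangement. "Likely" is then nothing deeper than a binomial coefficient:

```python
from math import comb

N = 20  # toy "gas" of 20 two-state particles
# Macrostate k = number of particles on the left side.
# Number of microstates realising it = C(N, k).
for k in (0, 5, 10):
    print(k, comb(N, k))
# The 50/50 macrostate (k=10) owns C(20,10) = 184756 of the 2**20 microstates,
# while k=0 owns exactly 1 -- that asymmetry is all "likely" means here.
```

Notice the counting only works because every microstate is taken as equiprobable and independent; the moment states are wired up deliberately, as in a computer, that assumption is exactly what's in question.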
There have been some suggestions that Turing machines can run backwards just as easily as forwards, but I don't see this either. I just need a 2-bit counter and I've got a one-way system. I'll probably need to feed it some energy, but the cannonball-gas argument already showed that this energy can be made negligible compared with the energy being switched around by the demon, so we concluded that it's not about the energy anyway. But if it's not about the energy, I can have my 2-bit counter, it can be arbitrarily efficient, and the entropy I need to generate elsewhere to keep it running can be made arbitrarily small.
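The 2-bit counter I have in mind, written out as a transition map (a sketch of my own; how to realise it mechanically is the real question): the logical state just marches around a fixed 4-cycle, so it never accumulates logical entropy, yet it always steps the same way:

```python
# 2-bit counter: deterministic forward step, 00 -> 01 -> 10 -> 11 -> 00.
def step(state: int) -> int:
    return (state + 1) & 0b11  # increment modulo 4

s = 0
history = [s]
for _ in range(4):
    s = step(s)
    history.append(s)
print(history)  # [0, 1, 2, 3, 0] -- back to the start after one full cycle
```

The map is a permutation of the four states, so it has a well-defined inverse (a decrementer); the one-way claim rests on the physical drive that makes it step forwards rather than backwards, not on the logic alone.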
With that cannonball gas we can actually shift a hell of a lot of entropy per decision. To see that, let the balls be ever so slightly inelastic, so that over a long, long time they dissipate their kinetic energy by warming themselves up. The demon was supposedly struggling on a limited entropy budget, but with big cannonballs he can afford to drink a bit of electricity. This has been my bottom-line worry all along: we can hand-wave about this stuff, but can we write equations for it in units that match up?
I really think it's on the cards that computers could walk all over the 2nd law, but what would be the damage? Would we have to bin all the books? Nope. Gases would still behave just the same, and computers would still be ten orders of magnitude away from detecting any violation. I can't think of an area of physics that would totally implode if computers could break the 2nd law.