Energy required to cycle a counter through its states.

AI Thread Summary
The discussion centers on the energy required to cycle a 256-bit counter through all its states, referencing the minimum energy needed to change a bit's state, which is kT. There is a debate about whether energy can be recovered when a bit returns to its original state, suggesting that the total energy investment could be lower than previously thought. However, it is clarified that the Landauer principle indicates energy cannot be fully recovered due to the irreversible nature of information erasure. The conversation also touches on the potential of reversible computing, which may allow for lower energy requirements, but this comes with complexities in maintaining information. Ultimately, while reversible gates could theoretically reduce energy costs, practical limitations in computing design remain a significant factor.
Adrian B
Greetings,

While attempting to learn something about cryptography, I have repeatedly encountered a commonly quoted argument about the minimum energy required to cycle a 256-bit counter through all its states. It says that the absolute minimum energy required to change the state of a bit is kT, and that if you multiply that by the number of bit toggles required to cycle through all the values of a 256-bit counter, you'd need more energy than you could obtain by building a Dyson sphere around a supernova. See Schneier on Security for the argument as he originally stated it.

Here's what I'm wondering: If I build a counter such that it requires energy "E" to change the state from 0 to 1, can't I design the counter such that I recover energy up to E when the state returns from 1 to 0? If so, wouldn't that mean that the maximum energy invested in an n-bit counter would have a lower bound of nE, with the greatest energy investment occurring when the counter reaches its final state with all bits equal to 1? If minimum E is kT, then the minimum investment would be nkT, no? For a 256-bit counter, that's a lot less than the energy expended by a supernova.

Thanks in advance for any comments.
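To make the two figures concrete, here is a back-of-the-envelope comparison of Schneier's bound with the nkT figure proposed above. This is only a sketch: the Boltzmann constant and the 3.2 K ambient temperature come from the thread, the ~10^44 J supernova output is an order-of-magnitude assumption, and the flip count uses the fact that counting an n-bit counter through all its states toggles roughly 2^(n+1) bits in total.

```python
# Back-of-the-envelope comparison (assumptions noted in the lead-in).
k = 1.380649e-23  # Boltzmann constant, J/K
T = 3.2           # ambient temperature assumed by Schneier, K

# Counting a 256-bit counter through all 2**256 states: bit i toggles
# about 2**(256 - i) times, and the geometric series sums to roughly 2**257.
flips = 2**257
schneier_energy = flips * k * T  # kT per toggle, lower bound in J

# The optimistic figure from the original post: if every 1->0 transition
# refunds the 0->1 cost, the energy parked in the counter peaks at n*k*T.
poster_energy = 256 * k * T

supernova = 1e44  # assumed rough total energy output of a supernova, J

print(f"Schneier-style bound: {schneier_energy:.3e} J")
print(f"Poster's nkT figure:  {poster_energy:.3e} J")
print(f"Supernova output:     {supernova:.3e} J")
```

The gap is the whole point of the thread: the toggle-counting bound comes out around 10^55 J, eleven orders of magnitude above the supernova, while nkT is a vanishingly small number.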
 
I'm too tired right now to really think about your idea, but I think that you are misunderstanding a part of the Landauer principle, which has to do with the erasure of information that happens when a bit is changed. Since the energy is lost as heat when information is erased, you cannot get it back. Take a quick look at it and see if that answers your question. Others may have better insight, or I may have misunderstood what you're getting at.
 
Adrian B said:
Greetings,

If I build a counter such that it requires energy "E" to change the state from 0 to 1, can't I design the counter such that I recover energy up to E when the state returns from 1 to 0? If so, wouldn't that mean that the maximum energy invested in an n-bit counter would have a lower bound of nE, with the greatest energy investment occurring when the counter reaches its final state with all bits equal to 1? If minimum E is kT, then the minimum investment would be nkT, no? For a 256-bit counter, that's a lot less than the energy expended by a supernova.

Thanks in advance for any comments.



A gate does not change its state randomly under thermal noise only if there is some friction, or some kind of lock keeping the gate shut or open. A lock is itself a kind of gate, so there has to be friction somewhere in a counter device made of gates.

Friction may be very small if the temperature is very low. But from that we can conclude:
1) Cooling something with a non-tiny heat capacity to a very low temperature takes a very large amount of energy.
2) A 256-bit counter has some minimum heat capacity.
 
Adrian B said:
Here's what I'm wondering: If I build a counter such that it requires energy "E" to change the state from 0 to 1, can't I design the counter such that I recover energy up to E when the state returns from 1 to 0? If so, wouldn't that mean that the maximum energy invested in an n-bit counter would have a lower bound of nE, with the greatest energy investment occurring when the counter reaches its final state with all bits equal to 1? If minimum E is kT, then the minimum investment would be nkT, no? For a 256-bit counter, that's a lot less than the energy expended by a supernova.

No, for the same reason you can't keep an electric motor running forever by using it to drive a generator that charges the battery connected to the motor. There is a limit to the amount of heat you can recover if you want to do useful work; and if you run through the theory, it turns out that you will always lose at least an energy kT when you flip a bit.

(The actual argument is more subtle and goes all the way back to the physics of Maxwell's demon. As far as I remember, kT (strictly, kT ln 2, the Landauer limit) is actually the amount of energy needed to erase a bit, but the end result is the same.)
 
Thanks for the replies. I'm having trouble reconciling these two assertions:
  • Flipping a bit will always require at least energy kT.
  • It's possible to build a reversible computer that dissipates energy of an arbitrarily small non-zero amount (Reversible Computing FAQ).
Do the two assertions above reconcile as T→0? If so, then I guess:
  • That's why Schneier assumes an ambient temperature of 3.2 K.
  • My original post errs by ignoring the energy that would be required to bring local ambient temp down to some value arbitrarily close to zero.
Am I on the right track?

Thanks again for your help.
 
Adrian B said:
Thanks for the replies. I'm having trouble reconciling these two assertions:
  • Flipping a bit will always require at least energy kT.
  • It's possible to build a reversible computer that dissipates energy of an arbitrarily small non-zero amount (Reversible Computing FAQ).
Do the two assertions above reconcile as T→0? If so, then I guess:


I don't think you need to go to zero kelvin. The reason for kT being the minimum energy is, as I pointed out above, quite subtle (which is why it took something like a hundred years to solve this problem; it is exactly the same problem as Maxwell's demon) and I am not an expert. However, as far as I remember, it turns out that kT is only a limit for irreversible gates (meaning most gates used in a normal computer), the reason being that the energy cost comes when you "forget" information. Hence, if you build your computer using only reversible gates (say Hadamard gates) there is no lower limit on the energy. However, the "penalty" is that you have to retain ALL information, which quite quickly becomes a problem (and you can't use e.g. AND gates, so the logic becomes tricky as well).
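A small illustration of the point about reversible gates and AND. The Hadamard gate mentioned above is a quantum gate; a standard classical example of a universal reversible gate is the Toffoli (controlled-controlled-NOT) gate. The sketch below shows the two properties the reply describes: the gate is its own inverse (no information is forgotten), and it can compute AND only by carrying the inputs along as extra outputs.

```python
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Toffoli (CCNOT) gate on bits: flips c iff a and b are both 1."""
    return a, b, c ^ (a & b)

# Reversibility: applying the gate twice recovers the original inputs,
# so no information is ever erased (hence no Landauer cost in principle).
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)

# The "penalty": with the target bit initialized to 0, the third output
# is AND(a, b), but the inputs a and b must be retained alongside it.
a, b = 1, 1
_, _, and_ab = toffoli(a, b, 0)
print(and_ab)  # 1
```

An ordinary AND gate maps two bits to one and so must discard a bit; embedding it in a three-bit reversible gate is exactly the "retain ALL information" trade-off described above.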
 
f95toli said:
...as far as I remember it turns out that kT is only a limit for irreversible gates ...; the reason being that the cost in energy comes when you "forget" information. Hence, if you build you computer using only reversible gates ... there is no limit to the minimum energy. However, the "penalty" is that you have to retain ALL information which quite quickly becomes a problem ...

Ok, so there's no minimum energy for a reversible computation. So, if I use reversible gates to build a machine that goes from state 0 to state 1 to state 0 in time T, I can use the energy recovered from one run of the machine to power the machine's next run. The only energy I put into the machine is at time 0, to start the process. Since the upper bound on the reversible machine's efficiency is 100%, the lower bound on the initial energy input, E, required to drive the machine through X runs is whatever energy is required to get the machine through one run. No?

I think that a set of n such machines, each with its own period (T, 2T, 4T, ..., 2^(n-1)T), can collectively be considered to be a single n-bit counter, if the individual machines are started simultaneously. The period-T machine will run through the most cycles, so I'm assuming it will be the hungriest -- let's say it requires initial energy E to get through 2^(n-1) cycles. If so, the entire set of n machines will require collective input energy no greater than nE to run through all 2^n states of an n-bit counter, once.

If the machines are efficient enough, E is small. If I'm building a 256-bit counter, n is small. nE doesn't turn out to be an astronomical amount of energy. No?
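The ensemble idea can be checked in a toy simulation. This is only a sketch under my own conventions (bit i of the counter is the machine whose state toggles every 2^i ticks, which matches the doubling periods above up to a factor of two in what counts as a full 0→1→0 cycle):

```python
# n two-state machines with doubling toggle periods, read out together,
# enumerate all 2**n states of an n-bit counter exactly once.
n = 8

def state_at(t: int) -> tuple[int, ...]:
    # Machine i has toggled t // 2**i times by tick t, so its state is
    # bit i of t -- the ensemble readout IS the binary count.
    return tuple((t >> i) & 1 for i in range(n))

states = [state_at(t) for t in range(2**n)]
assert len(set(states)) == 2**n  # every state visited exactly once

# The fastest machine (bit 0) is indeed the hungriest: it toggles on
# every tick, 2**n times, while machine i toggles only 2**(n-i) times.
toggles = [2**n // 2**i for i in range(n)]
print(toggles[:4])  # [256, 128, 64, 32]
```

This confirms the bookkeeping in the post: the ensemble does visit all 2^n states, and the total toggle count is dominated by the fastest machine. What the simulation cannot settle is the physics question, namely whether each of those toggles can really be made to cost arbitrarily little.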
 