Energy required to cycle a counter through its states.

In summary: according to the Reversible Computing FAQ, a reversible computer can dissipate an arbitrarily small non-zero amount of energy.
  • #1
Adrian B
Greetings,

While attempting to learn something about cryptography, I have repeatedly encountered a commonly quoted argument about the minimum energy required to cycle a 256-bit counter through all its states. It says that the absolute minimum energy required to change the state of a bit is kT, and that if you multiply that by the number of bit toggles required to cycle through all the values of a 256-bit counter, you'd need more energy than you could obtain by building a Dyson sphere around a supernova. See Schneier on Security for the argument as he originally stated it.
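In rough numbers (a quick Python sketch; the 3.2 K ambient temperature and the ~10^44 J supernova output are just the commonly quoted round figures, not exact values):

```python
# Back-of-the-envelope version of the Schneier-style argument (illustrative numbers only).
k = 1.380649e-23          # Boltzmann constant, J/K
T = 3.2                   # K, the ambient temperature usually assumed in this argument
energy_per_bit = k * T    # ~4.4e-23 J per bit change

# Counting from 0 to 2**256 - 1 involves roughly 2**256 bit changes
# (the low-order bit alone toggles on every increment).
bit_changes = 2 ** 256

total_energy = energy_per_bit * bit_changes
supernova_energy = 1e44   # J, the order-of-magnitude figure often quoted for a supernova

print(f"Energy per bit change: {energy_per_bit:.2e} J")
print(f"Total energy:          {total_energy:.2e} J")
print(f"Supernovae equivalent: {total_energy / supernova_energy:.2e}")
```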

Here's what I'm wondering: If I build a counter such that it requires energy "E" to flip a bit from 0 to 1, can't I design the counter so that I recover up to E when the bit returns from 1 to 0? If so, wouldn't that mean that the net energy invested in an n-bit counter never exceeds nE, with the greatest investment occurring when the counter reaches its final state with all bits equal to 1? If the minimum E is kT, then the minimum peak investment would be nkT, no? For a 256-bit counter, that's far less than the energy released by a supernova.
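Under that (possibly flawed) assumption, the peak investment works out to something tiny:

```python
# Peak energy investment if the energy E stored when a bit goes 0 -> 1
# is fully recovered when it goes 1 -> 0 (the assumption being questioned here).
k = 1.380649e-23   # Boltzmann constant, J/K
T = 3.2            # K, same assumed ambient temperature as above
n = 256            # counter width in bits

peak_investment = n * k * T   # all 256 bits set at once
print(f"Peak investment: {peak_investment:.2e} J")   # ~1.1e-20 J
```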

Thanks in advance for any comments.
 
  • #2
I'm too tired right now to really think about your idea, but I think you are misunderstanding a part of Landauer's principle, which has to do with the erasure of information that happens when a bit is changed. Since the energy is lost as heat when the information is erased, you cannot get it back. Take a quick look at it and see if that answers your question. Others may have better insight, or I may have misunderstood what you're getting at.
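For reference, Landauer's bound is usually quoted as kT·ln 2 of heat per erased bit; in numbers:

```python
import math

# Landauer's bound: erasing one bit of information dissipates at least
# k*T*ln(2) of heat into an environment at temperature T.
k = 1.380649e-23   # Boltzmann constant, J/K

for T in (300.0, 3.2):   # room temperature and the 3.2 K used elsewhere in this thread
    print(f"T = {T:5.1f} K -> at least {k * T * math.log(2):.2e} J per erased bit")
```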
 
  • #3
Adrian B said:
Greetings,

If I build a counter such that it requires energy "E" to flip a bit from 0 to 1, can't I design the counter so that I recover up to E when the bit returns from 1 to 0? If so, wouldn't that mean that the net energy invested in an n-bit counter never exceeds nE, with the greatest investment occurring when the counter reaches its final state with all bits equal to 1? If the minimum E is kT, then the minimum peak investment would be nkT, no? For a 256-bit counter, that's far less than the energy released by a supernova.

Thanks in advance for any comments.



A gate does not change its state randomly from thermal noise if there is some friction, or if there is some kind of lock keeping the gate shut or open. But a lock is itself a kind of gate, so there has to be friction somewhere in a counter device made of gates.

Friction can be made very small if the temperature is very low. But note two things:
1) Cooling anything with a non-negligible heat capacity down to a very low temperature takes a large amount of energy.
2) A 256-bit counter has some minimum heat capacity.
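As a very rough illustration of point 1, here is an ideal-refrigerator (Carnot-limit) estimate with assumed round numbers for the mass and a constant specific heat; a real cryocooler is far less efficient, and holding the temperature against heat leaks costs continuous power on top of this:

```python
import math

# Idealized lower bound on the work to refrigerate the counter hardware.
# The mass and the constant specific heat are assumed round numbers; real
# specific heats drop sharply at low temperature.
m = 1e-3        # kg, assumed mass of the hardware (1 gram)
c = 700.0       # J/(kg*K), assumed constant, silicon-like specific heat
T_hot = 300.0   # K, environment temperature
T_cold = 3.2    # K, target temperature

# An ideal refrigerator removing heat dQ = m*c*dT at temperature T needs
# work dW = dQ * (T_hot - T) / T.  Integrating from T_cold up to T_hot:
work = m * c * (T_hot * math.log(T_hot / T_cold) - (T_hot - T_cold))
print(f"Carnot-limit cooling work: about {work:.0f} J")   # ~7e2 J for these numbers
```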
 
  • #4
Adrian B said:
Here's what I'm wondering: If I build a counter such that it requires energy "E" to flip a bit from 0 to 1, can't I design the counter so that I recover up to E when the bit returns from 1 to 0? If so, wouldn't that mean that the net energy invested in an n-bit counter never exceeds nE, with the greatest investment occurring when the counter reaches its final state with all bits equal to 1? If the minimum E is kT, then the minimum peak investment would be nkT, no? For a 256-bit counter, that's far less than the energy released by a supernova.

No, for the same reason you can't keep an electric motor running forever by using it to drive a generator that charges the battery that is connected to the motor. There is a limit to the amount of heat you can recover if you want to do useful work; and if you run through the theory, it turns out that you will always lose at least an energy of kT when you flip a bit.

(The actual argument is more subtle and goes all the way back to the physics of Maxwell's demon; as far as I remember, kT is actually the amount of energy needed to erase a bit, but the end result is the same.)
 
  • #5
Thanks for the replies. I'm having trouble reconciling these two assertions:
  • Flipping a bit will always require at least energy kT.
  • It's possible to build a reversible computer that dissipates energy of an arbitrarily small non-zero amount (Reversible Computing FAQ).
Do the two assertions above reconcile as T→0? If so, then I guess:
  • That's why Schneier assumes an ambient temperature of 3.2 K.
  • My original post errs by ignoring the energy that would be required to bring the local ambient temperature down to some value arbitrarily close to zero.
Am I on the right track?

Thanks again for your help.
 
  • #6
Adrian B said:
Thanks for the replies. I'm having trouble reconciling these two assertions:
  • Flipping a bit will always require at least energy kT.
  • It's possible to build a reversible computer that dissipates energy of an arbitrarily small non-zero amount (Reversible Computing FAQ).
Do the two assertions above reconcile as T→0? If so, then I guess:


I don't think you need to go to zero kelvin. The reason kT is the minimum energy is, as I pointed out above, quite subtle (which is why it took something like a hundred years to solve this problem; it is exactly the same problem as Maxwell's demon), and I am not an expert. However, as far as I remember, kT is only a limit for irreversible gates (meaning most gates used in a normal computer), the reason being that the cost in energy comes when you "forget" information. Hence, if you build your computer using only reversible gates (say, Toffoli gates) there is no lower limit to the energy. However, the "penalty" is that you have to retain ALL information, which quite quickly becomes a problem (and you can't use e.g. ordinary AND gates, so the logic becomes tricky as well).
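To make the "retain all information" point concrete, here is a small sketch using the Toffoli (controlled-controlled-NOT) gate, the textbook classical reversible gate: it is a bijection on its inputs, and it can stand in for AND only if you keep the inputs around as extra outputs.

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli / CCNOT gate: flips c iff a and b are both 1. Reversible."""
    return a, b, c ^ (a & b)

# Reversibility: the map over all 8 inputs is a permutation (no two inputs
# collide), so nothing is "forgotten" and no Landauer cost is forced.
outputs = {toffoli(*bits) for bits in product((0, 1), repeat=3)}
assert len(outputs) == 8          # bijection: all 8 outputs are distinct
assert all(toffoli(*toffoli(a, b, c)) == (a, b, c)
           for a, b, c in product((0, 1), repeat=3))   # it is its own inverse

# Emulating AND: feed an ancilla 0 as the third input.  The price is that
# the gate must also return a and b, i.e. you retain ALL the information.
for a, b in product((0, 1), repeat=2):
    assert toffoli(a, b, 0) == (a, b, a & b)

print("Toffoli gate is reversible and computes AND with an ancilla bit.")
```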
 
  • #7
f95toli said:
...as far as I remember it turns out that kT is only a limit for irreversible gates ...; the reason being that the cost in energy comes when you "forget" information. Hence, if you build you computer using only reversible gates ... there is no limit to the minimum energy. However, the "penalty" is that you have to retain ALL information which quite quickly becomes a problem ...

Ok, so there's no minimum energy for a reversible computation. So, if I use reversible gates to build a machine that goes from state 0 to state 1 to state 0 in time T, I can use the energy recovered from one run of the machine to power the machine's next run. The only energy I put into the machine is at time 0, to start the process. Since the upper bound on the reversible machine's efficiency is 100%, the lower bound on the initial energy input, E, required to drive the machine through X runs is whatever energy is required to get the machine through one run. No?

I think that a set of n such machines, each with its own period (T, 2T, 4T, ..., 2^(n-1)T), can collectively be considered to be a single n-bit counter, if the individual machines are started simultaneously. The period-T machine will run through the most cycles, so I'm assuming it will be the hungriest -- let's say it requires initial energy E to get through 2^(n-1) cycles. If so, the entire set of n machines will require collective input energy no greater than nE to run through all 2^n states of an n-bit counter, once.

If the machines are efficient enough, E is small. If I'm building a 256-bit counter, n is small. nE doesn't turn out to be an astronomical amount of energy. No?
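Here is a tiny simulation of that scheme for a small n (it only checks the state sequence, not the energy bookkeeping):

```python
# Toy check: n independent "machines" that each toggle 0 -> 1 -> 0 with
# periods T, 2T, 4T, ..., read out together, reproduce every state of an
# n-bit counter when sampled every half-period of the fastest machine.
n = 8                      # small n so the check runs quickly
ticks = 2 ** n             # one counter "tick" = T/2, half the fastest period

for t in range(ticks):
    # machine i toggles every 2**i ticks, so its state at tick t is:
    machines = [(t >> i) & 1 for i in range(n)]
    # read the machines as the bits of a binary number (machine 0 = LSB)
    readout = sum(bit << i for i, bit in enumerate(machines))
    assert readout == t    # the ensemble visits every counter state in order

print(f"All {ticks} states of the {n}-bit counter reproduced.")
```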
 

1. How is the energy required to cycle a counter through its states calculated?

The energy required to cycle a counter through its states is calculated by multiplying the power consumption of the counter by the time it takes to complete one full cycle. This can be expressed as: energy = power x time.
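For example, with made-up numbers:

```python
# Hypothetical figures, just to illustrate the formula energy = power x time.
power_watts = 0.5          # assumed average power draw of the counter
cycle_time_seconds = 2.0   # assumed time for one full cycle through its states
energy_joules = power_watts * cycle_time_seconds
print(f"{energy_joules} J per full cycle")   # 1.0 J
```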

2. What factors affect the energy required to cycle a counter through its states?

The energy required to cycle a counter through its states is affected by several factors, including the type of counter (e.g. mechanical, electronic), the number of states it has, the frequency of cycling, and the efficiency of the counter.

3. Can the energy required to cycle a counter through its states be reduced?

Yes, the energy required to cycle a counter through its states can be reduced by using more efficient counters, reducing the number of states, or decreasing the frequency of cycling. Additionally, using renewable energy sources can also help reduce the overall energy consumption.

4. How does the energy required to cycle a counter through its states affect its lifespan?

The energy required to cycle a counter through its states can have a direct impact on its lifespan. The more energy it consumes during each cycle, the faster its components may wear out, leading to a shorter lifespan. Therefore, reducing the energy consumption can help prolong the lifespan of the counter.

5. Is there a standard measurement for the energy required to cycle a counter through its states?

There is no standard measurement for the energy required to cycle a counter through its states as it can vary depending on the factors mentioned earlier. However, the SI unit for energy is joules (J) and this can be used to measure the energy consumed by the counter during each cycle.
