Naty1
It's the erasure of a bit that requires energy, not the creation of one...
http://en.wikipedia.org/wiki/Landauer's_principle
Landauer's Principle, first argued in 1961[1] by Rolf Landauer of IBM, holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment". (Bennett 2003)[2].
Specifically, each bit of lost information leads to the release of an amount kT ln 2 of heat, where k is the Boltzmann constant and T is the absolute temperature of the circuit. On the other hand, if no information is erased, computation can in principle be thermodynamically reversible, requiring no release of heat. This has led to considerable interest in the study of reversible computing.
The principle is widely accepted as physical law, but in recent years it has been challenged...
...For a computational operation in which 1 bit of logical information is lost, the amount of entropy generated is at least k ln 2, and so the energy that must eventually be emitted to the environment is E ≥ kT ln 2.
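For concreteness, the bound E ≥ kT ln 2 is easy to evaluate numerically. A minimal sketch in Python (the function name `landauer_limit` is my own, not from the article):

```python
import math

# Boltzmann constant (CODATA 2018 exact value), in J/K
k = 1.380649e-23

def landauer_limit(T):
    """Minimum heat (joules) that must be dissipated to erase one bit
    at absolute temperature T (kelvin), per Landauer's principle."""
    return k * T * math.log(2)

# At room temperature, T = 300 K:
E = landauer_limit(300)
print(E)  # on the order of 3e-21 J, i.e. roughly 0.018 eV
```

This is many orders of magnitude below the switching energy of present-day transistors, which is why the limit is of mostly theoretical interest today.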