Information theory defines entropy in terms of a random variable:
http://en.wikipedia.org/wiki/Shannon_entropy
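As a concrete illustration of that definition, here is a minimal Python sketch (the function name and example distributions are mine, not from the linked article) computing H = -sum(p_i * log2(p_i)) for a discrete random variable:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair bit carries exactly 1 bit of entropy; a biased one carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```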
Landauer (1961, IBM J. Res. Develop.) pointed out the relationship between acquiring, processing, and deleting information and free energy: when a bit is erased at temperature T, at least kT ln(2) of energy is dissipated into the environment.
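To put a number on that bound (room temperature is my assumption here, since the post does not fix T), a quick sketch:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

E_per_bit = k_B * T * math.log(2)    # Landauer limit per erased bit
print(f"{E_per_bit:.2e} J per bit")  # ~2.87e-21 J, about 0.018 eV
```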
Reading a memory state is a sequence of measurements (which may or may not be reversible), and the source of the signal can be regarded as transmitting entropy at a certain rate.
Some people get confused by trying to relate the *change* in entropy associated with reading a bit, which decreases the receiver's uncertainty, with the *absolute* entropy of the message itself. The absolute entropy is given by Kolmogorov (1965, Prob. Inform. Transmission) and corresponds to the *minimum* number of bits required to specify the memory state. Thus, the entropy transmission rate is *not* simply the raw data transfer rate; it reflects the rate at which algorithmic information is transmitted.
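Kolmogorov complexity itself is uncomputable, but the length of a losslessly compressed message gives a crude upper bound, which makes the distinction between raw bits and algorithmic information concrete. A small sketch of that idea (my own illustration, not from the references above):

```python
import os
import zlib

regular = b"abcd" * 250_000        # 1,000,000 bytes of repetitive structure
random_ = os.urandom(1_000_000)    # 1,000,000 (pseudo)random bytes

# Compressed size is an upper bound on the algorithmic information content.
print(len(zlib.compress(regular, 9)))   # a few kB: little algorithmic information
print(len(zlib.compress(random_, 9)))   # ~1 MB: essentially incompressible
```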
But the bottom line is that different messages have different entropies, and thus different energies, and thus different masses. At this point it is helpful to calculate: writing a random string of 3*10^12 bits at room temperature (T ≈ 300 K) requires at least (3*10^12) kT ln(2) ≈ 8.6*10^-9 Joules. When I read the message, my free energy is increased by that amount; if I erase the memory, I must dissipate that amount of free energy. And since I can read the message in a closed box, the transfer of energy is between me and the memory.
8.6 * 10^-9 J / c^2 ≈ 9.6 * 10^-26 kg. Good luck trying to measure that.
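For completeness, the same arithmetic in code (again assuming T ≈ 300 K):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
c = 2.99792458e8      # speed of light, m/s
T = 300.0             # assumed room temperature, K
n_bits = 3e12         # the 3*10^12 bits from the example above

E = n_bits * k_B * T * math.log(2)   # minimum energy to write/erase the string
m = E / c**2                         # mass equivalent via E = mc^2

print(f"E = {E:.2e} J")    # ~8.6e-9 J
print(f"m = {m:.2e} kg")   # ~9.6e-26 kg
```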