Q_Goest
Science Advisor
Hi Andy,
Andy Resnick said:
Parenthetically, I am opposed to the reductionist approach-

Do you deny that the entropy of any physical system can be determined from a simple summation of the entropy of its individual parts? I believe that is what one has to defend in order to claim that entropy can vary with the sequence of the physical states of a system (i.e., with the information entropy). But that's not true; the entropy of a physical system is a simple summation of the entropy of its parts. For example, if the physical state of a 1 has an entropy of 1 J/K and the physical state of a 0 has an entropy of 0.1 J/K, then regardless of how the 1's and 0's are arranged, two systems with the same number of 0's and 1's at the same uniform temperature have the same entropy. In particular, there is no difference in the entropy of the following two systems:
1111100000
1010010011
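
Just to make that arithmetic explicit (the 1 J/K and 0.1 J/K figures are, again, made-up illustrative values), a few lines of Python return the same total for any arrangement of five 1's and five 0's:

Code:
# Hypothetical per-state entropy values from the example above (J/K).
S_ONE = 1.0
S_ZERO = 0.1

def total_entropy(bits):
    """Sum the assumed entropy of each switch state independently."""
    return sum(S_ONE if b == "1" else S_ZERO for b in bits)

print(total_entropy("1111100000"))  # 5.5 J/K
print(total_entropy("1010010011"))  # 5.5 J/K -- same total, different ordering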
It may be difficult or impossible to pin down entropy values for the physical states of switches, so this may not seem obvious at first. However, the storage of information doesn't depend on the use of switches or flash drives. We could equally well store information in the form of a pressurized gas: small containers with pressure switches attached, where all we'd need to do is pressurize or depressurize a container so that its switch changes state. The entropy of the gas in any given container is dependent only on the gas in that container, not on the entropy of the gas in any other container. Similarly, I'm going to assume the entropy of any pressure switch corresponds to one of its two positions. Any remaining wiring likewise has the same entropy in either state when there is no current flowing through it (i.e., when the system is "sitting on a table").
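
If it helps, here is a rough sketch of that bookkeeping for the gas containers, treating each one as a fixed amount of ideal gas at a common temperature that is either compressed to high pressure ("1") or left at ambient pressure ("0"). All of the numbers (nitrogen-like gas properties, 5 atm vs 1 atm, 0.01 mol per container) are just placeholders:

Code:
import math

R = 8.314                                        # J/(mol*K)
CP = 29.1                                        # J/(mol*K), assumed constant
T_REF, P_REF, S_REF = 298.15, 101325.0, 191.6    # illustrative reference state

def molar_entropy(T, P):
    """Ideal-gas molar entropy, s = s_ref + cp*ln(T/T_ref) - R*ln(P/P_ref)."""
    return S_REF + CP * math.log(T / T_REF) - R * math.log(P / P_REF)

def system_entropy(states, T, n_per_container=0.01):
    """Total entropy of a row of containers; '1' = pressurized, '0' = vented."""
    P_HIGH, P_LOW = 5 * 101325.0, 101325.0
    return sum(n_per_container * molar_entropy(T, P_HIGH if s == "1" else P_LOW)
               for s in states)

print(system_entropy("1111100000", 298.15))
print(system_entropy("1010010011", 298.15))  # identical: the sum ignores the ordering

Whatever values get plugged in, the total is just a sum over the individual containers, which is the point.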
Note also that I'm assuming, for this particular thought experiment, that the system has come to thermal equilibrium with its environment and is at a uniform temperature. If one contests that the temperature must change when the information is read, I would agree*. When this system is 'read', we pass electric current through the wires and the wires must heat up; and if we change the state of any of the pressurized gas containers, there is also a change in the temperature of those containers. But once the system comes back to equilibrium with the environment (assuming an environment with infinite thermal mass), the total entropy is still a simple summation of the entropy of the individual parts.
Also, if information entropy correlates to physical entropy, I'd still like to know what that correlation is. I've gone through Shannon's paper and don't see it there. It seems to me that information entropy is a useful analogy, but it doesn't correspond in any way to real entropy. I'd expect a mathematical correlation of the same kind that links dynamic and kinematic viscosity (i.e., kinematic viscosity = dynamic viscosity / density, ν = μ/ρ), but I don't see any such relation.
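
To be clear about the kind of relation I mean: ν = μ/ρ is a direct formula tying one quantity to the other, whereas Shannon's H is computed purely from symbol probabilities. A quick illustration of both calculations (my own numbers; water at roughly 20 °C for the viscosity part):

Code:
import math

def shannon_entropy(s):
    """Shannon entropy per symbol (bits), from the symbol frequencies in s."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * math.log2(p) for p in probs)

print(shannon_entropy("1111100000"))  # 1.0 bit/symbol
print(shannon_entropy("1010010011"))  # 1.0 bit/symbol -- frequencies, not order

# The viscosity relation used above as the contrast: kinematic = dynamic / density.
mu, rho = 1.002e-3, 998.2    # Pa*s and kg/m^3 for water near 20 C
nu = mu / rho                # about 1.0e-6 m^2/s
print(nu)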
*I think we could expand on the concept of how energy is required to 'read' information, but that's probably out of scope for now at least.