Wait, what? Thermodynamic entropy is the logarithm of the number of microstates of a system's molecules in phase space (position and momentum in 3D), multiplied by the Boltzmann constant. Information entropy, dealing with probability, seems to me to start from a very different concept: the number of 1s and 0s required to convey an intelligible message.
1s and 0s ARE microstates. The map of all possible trajectories in the system's phase space IS the information about the system. If you only have one bit, you only have two microstates (yes, they're called 1 and 0, but the names aren't important; we could just as well call them state a and state b).
A system that is in steady state has 0 bits of entropy: it has no other states to change to, so you don't even need a 1 or a 0 to express its state.
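The correspondence above can be made concrete with a few lines. This is a minimal sketch, not anyone's official derivation: for a uniform distribution over W microstates, Shannon entropy is log2(W) bits and Boltzmann entropy is S = k_B ln W, so the two differ only by a constant factor of k_B ln 2 per bit.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def shannon_bits(num_microstates):
    """Shannon entropy in bits for a uniform distribution over microstates."""
    return math.log2(num_microstates)

def boltzmann_entropy(num_microstates):
    """Thermodynamic entropy S = k_B * ln(W), in J/K."""
    return k_B * math.log(num_microstates)

# One bit <-> two equally likely microstates (call them "a" and "b"):
print(shannon_bits(2))  # 1.0
# A system frozen in a single state needs no bits at all:
print(shannon_bits(1))  # 0.0
# The two entropies differ only by the conversion factor k_B * ln(2) per bit:
W = 1024
print(math.isclose(boltzmann_entropy(W), k_B * math.log(2) * shannon_bits(W)))  # True
```

The `isclose` check at the end is the whole point of the analogy: counting microstates in bits or in joules per kelvin is the same count in different units.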
"intelligible" has nothing to do with it. The "message" could be a billiard ball impacting another billiard ball. It's the energy, the wave, the propagation of a disturbance, not the matter itself.
Maxwell's demon requires information to do work.
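Landauer's principle puts a number on that link between information and work: erasing one bit of information dissipates at least k_B · T · ln 2 of energy as heat. A quick back-of-the-envelope calculation at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # room temperature, K

# Landauer limit: minimum energy dissipated to erase one bit at temperature T.
landauer_limit = k_B * T * math.log(2)
print(f"{landauer_limit:.3e} J per bit erased")  # ~2.871e-21 J
```

Tiny per bit, but nonzero, and it's exactly this cost of resetting the demon's memory that saves the second law from Maxwell's demon.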
What do you think of what Kolmogorov has done with information theory (metric entropy, Kolmogorov complexity)?

If you can direct me to some authority who says that there is a real and substantial connection between the two concepts, I'll be happy to reconsider.
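For what it's worth, Kolmogorov complexity (the length of the shortest program producing a string) is uncomputable, but compressed length gives a computable upper bound on it, which makes the idea easy to poke at. A rough sketch of that proxy, using zlib purely for illustration:

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Compressed length: a computable upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500  # generated by a tiny program, so low complexity
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(1000))  # pseudo-random, nearly incompressible

# The regular string admits a far shorter description than the noisy one:
print(description_length(regular) < description_length(noisy))  # True
```

A low-entropy (highly ordered) state has a short description; a high-entropy state doesn't. That's one concrete bridge between the statistical-mechanics picture and the algorithmic-information one.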