revo74
The idea that entropy is the same as disorder is false. This is a misconception many people, myself included, have held.
A person wrote this in response to the statement "Entropy does NOT mean disorder."
Is this correct? Please elaborate on this. Additionally, isn't thermodynamic entropy different from information entropy?
“Technically you are right. The entropy S is:
S = – Σ P(i) log P(i)
where P(i) is the probability of finding a particle in state i, and Σ denotes the sum over states. It was Boltzmann who advocated the idea that entropy was related to disorder. In Boltzmann's mind, the more ways a system could move internally, the more disorderly the system was. A system in "perfect order" was one in which all the molecules were locked in a perfect array without any freedom of movement whatsoever, hence S would be at a minimum. An example of low entropy (high order) would be ice, while water vapor, with its high disorder, would have high entropy.
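The formula above can be checked numerically. The sketch below (a toy illustration, not from the original post; the two distributions are made-up examples) evaluates S = −Σ P(i) log P(i) for a "locked" system with all probability in one state versus a system spread uniformly over several states:

```python
import math

def shannon_entropy(probs):
    """S = -sum_i P(i) * log(P(i)), using the natural log.

    Zero-probability states contribute nothing, so they are skipped
    (the limit p*log(p) -> 0 as p -> 0).
    """
    return sum(-p * math.log(p) for p in probs if p > 0)

# Toy distributions (hypothetical): "locked" puts all probability in one
# state (perfect order), "free" spreads it uniformly over four states.
locked = [1.0, 0.0, 0.0, 0.0]
free = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(locked))  # 0.0 -- minimum entropy
print(shannon_entropy(free))    # ln 4, about 1.386 -- higher entropy
```

The uniform distribution maximizes S for a fixed number of states, matching the intuition that the "freest" system is the most disordered.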
That was in the 19th century, and this concept prevailed until the mid-20th century. It was Shannon in the 1940s who revolutionized the understanding of entropy by defining it in terms of information. If my knowledge of the system is high, entropy is low; if my lack of information is high, entropy is high. This is the definition currently accepted. It's the one that Susskind used to derive his groundbreaking concept of the holographic principle. Instead of order/disorder, you have entropy as a measure of missing information. The bigger the system, that is, the greater the number of degrees of freedom it has, the greater my lack of information about which state each particle is in. A gas is an example of a system with high entropy: its molecules occupy a great number of states with a wide range of velocities, so my lack of information is very high, hence entropy is high. In Boltzmann's terms, "a system in 'perfect order' was one in which all the molecules were locked in perfect array without any freedom of movement"; in that case I know where the particles are, my knowledge is high, my lack of information is low, hence entropy is low. So you can see that order/disorder and amount of information are equivalent descriptions. They don't contradict each other.
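The claim that more accessible states means more missing information can also be made concrete. For a uniform distribution over W equally likely states, the sum collapses to S = ln W, so entropy grows with the number of states. A minimal sketch (my own illustration, assuming the natural-log convention used above):

```python
import math

def entropy_uniform(num_states):
    """Entropy of a uniform distribution over num_states states.

    Each state has probability p = 1/W, so
    S = -sum_{i=1}^{W} p*log(p) = ln W.
    """
    p = 1.0 / num_states
    return sum(-p * math.log(p) for _ in range(num_states))

# More accessible states -> less information about which state
# the system is in -> higher entropy; the value agrees with ln W.
for w in (2, 10, 1000):
    print(w, entropy_uniform(w), math.log(w))
```

This is the information-theoretic reading of the gas example: a gas with many accessible states has large W and hence large entropy, while the perfectly locked array has W = 1 and S = ln 1 = 0.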
Now take the case of the Big Bang. Initially, the universe occupied a tiny volume, on the order of the Planck length. My knowledge of where all the particles are is high (low entropy). As the universe expands and occupies more and more volume, it becomes harder to keep track of each particle; my lack of knowledge grows, so entropy increases. I can still look at this scenario in terms of order/disorder: initially the universe has low entropy (high order), and as it expands it becomes more disordered (entropy increases). In either description, there is no contradiction.
Also see: Entropy_(information_theory)”