Entropy: why measure disorder?

In summary: Rob asks why we measure the amount of disorder (entropy) rather than the amount of order. The replies point out that entropy is a measure of the number of different microstates that yield the same macrostate, and that "order" itself is difficult to define precisely.
  • #1
Robert Webb
Something I've always wondered: why do we measure the amount of disorder (entropy) rather than the amount of order?

We don't measure brightness by the amount of "dark". Surely order is the thing of interest, so why don't we measure that rather than measuring the absence of it?

And in conversation it always feels wrong! I end up just talking about the amount of order rather than the amount of entropy, which always feels counter-intuitive.

Thanks,
Rob.
 
  • #2
I often wondered the same thing. It sounds illogical. But then I thought of this case.

Consider gas particles in a room. The more knowledge we have about their positions and velocities, the lower the entropy. The higher the entropy (disorder), the less our knowledge, for the same number of particles. But now introduce one more particle with unknown position. The entropy increases by a definite amount, but our knowledge is unchanged. That definite amount is useful in calculations.
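
One way to make that "definite amount" concrete, as a rough sketch under an assumption that is mine (each particle's position is coarse-grained into ##M## equally likely cells): adding one such particle multiplies the number of microstates ##\Omega## by ##M##, so

$$\Delta S = k\ln(M\,\Omega) - k\ln\Omega = k\ln M .$$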
 
  • #3
You asked for an I-level answer, so I am going to give you one, even though I suspect you are looking for a B-level answer.

Robert Webb said:
Surely order is the thing of interest

Why would you think that? Entropy is, up to a logarithm and a constant, the number of different microscopic states that give you the same macroscopic state. Why is that unimportant? And how would you even mathematically define "order"?
 
  • #4
Robert Webb said:
Something I've always wondered: why do we measure the amount of disorder (entropy) rather than the amount of order?
Entropy is defined to be ##S = k\ln\Omega##. Without a similar definition of "order", we have nothing to measure.
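
As a purely illustrative worked case (the value of ##\Omega## is made up for the sake of the arithmetic): a system with ##\Omega = 10^{20}## equally likely microstates has

$$S = k\ln\Omega = k\ln 10^{20} = 20\,k\ln 10 \approx 6.4\times 10^{-22}\ \text{J/K}.$$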
 
  • #5
Robert Webb said:
Something I've always wondered: why do we measure the amount of disorder (entropy) rather than the amount of order?
Here are a couple of questions for you: how do we measure entropy? How would you measure order?
 
  • #7
Robert Webb said:
Something I've always wondered: why do we measure the amount of disorder (entropy) rather than the amount of order?

That's a good question.

When Shannon was searching for a name for his new uncertainty function, von Neumann is reported to have said: "You should call it entropy; for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."

I think 'order' and 'disorder' are actually really quite difficult to define. Take the sequences of digits 35897932384 and 1234512345. We'd want to say that the second sequence is more 'ordered' because we recognize a pattern - yet the first sequence is just consecutive digits of ##\pi##, so there's some structure there too. Take the 4 Cartesian points (1,1), (1,-1), (-1,-1) and (-1,1) and plot them. We'd want to say that there's 'order' there because they're the vertices of a square - our eye is drawn to the pattern - but is there really more 'order' there than in the points (1,0), (17, -1/2), (-2,3) and (1/3, 7/5)? I don't properly know how to answer that question. Is it just because we've evolved to be very good at seeing patterns and drawn to symmetries, or is there really more 'order' in the first collection of points?

That's why I think the approach in terms of uncertainty is more satisfactory. If there are a large number of microstates that yield the same macrostate - then, given that macrostate, there is a lot of uncertainty about which particular microstate the system is actually in. The entropy is really then a measure of the number of available states subject to some constraints on the 'overall' properties (like energy, etc).

Consider flipping a fair coin 10 times - there are lots of ways of getting 5 heads and 5 tails - HHHHHTTTTT, HTHTHTHTHT, THTTTTHHHH, and so on - there are ##\binom{10}{5} = 252## of them in all, so it would take a while to list them. But there's only 1 way to get all heads. So in any 10 flips we're not likely to get all heads - it's possible, but it occurs with a low probability compared to getting exactly 5 heads and 5 tails. So the entropy, subject to the constraint of getting 5 heads, is higher than the entropy subject to the constraint of getting 10 heads - there are many ways to get 5 heads from 10 flips, but only one way to get 10 heads from 10 flips.

Another way to say exactly the same thing would be to attempt to answer the question "given that I got ##n## heads in 10 flips, what was the actual sequence of heads and tails I got?" If ##n = 10## then we can answer that question precisely and there's no uncertainty at all. If, however, ##n = 5##, then we can't specify the actual sequence - we can rule out a lot of them, but we're still left with a fair degree of uncertainty about which particular sequence was obtained. Entropy is a measure of that uncertainty.
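
To make the counting concrete, here is a minimal Python sketch (the variable names and the choice to report entropy in bits, i.e. taking ##k = 1## and a base-2 logarithm, are my own illustrative choices, not anything from the posts above):

```python
from math import comb, log2

flips = 10
for heads in (10, 5):
    # Number of distinct head/tail sequences (microstates) with this many heads
    omega = comb(flips, heads)
    # Entropy of that macrostate in bits: log2 of the microstate count
    entropy_bits = log2(omega)
    print(f"{heads} heads in {flips} flips: {omega} sequences, entropy = {entropy_bits:.2f} bits")
```

Running this gives 1 sequence (0 bits) for 10 heads and 252 sequences (about 7.98 bits) for 5 heads, matching the intuition that the 5-heads macrostate leaves far more uncertainty about the underlying sequence.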
 

1. What is entropy?

Entropy is a measure of the amount of disorder or randomness in a system. It is a concept used in physics, chemistry, and information theory to describe the level of chaos or unpredictability in a system.

2. Why is it important to measure entropy?

Measuring entropy allows us to understand the amount of disorder in a system and how it changes over time. It helps us to predict the behavior of a system and make decisions about how to manipulate it. In information theory, entropy is used to measure the amount of information in a system and determine the efficiency of communication channels.

3. How is entropy measured?

In thermodynamics, entropy is measured in units of joules per kelvin (J/K). In information theory, it is measured in bits. The exact method of measurement depends on the specific system being studied, but generally involves calculating the probability of different states or arrangements within the system.
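
As a hedged illustration of the information-theoretic side (the function name and the example distributions are hypothetical, chosen only for this sketch), Shannon entropy in bits can be computed directly from a probability distribution:

```python
from math import log2

def shannon_entropy_bits(probabilities):
    """H = -sum p*log2(p), in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(shannon_entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy_bits([0.9, 0.1]))   # biased coin: about 0.47 bits
print(shannon_entropy_bits([0.25] * 4))   # four equally likely outcomes: 2.0 bits
```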

4. What is the relationship between entropy and disorder?

Entropy is often described as a measure of disorder because it represents the amount of randomness or chaos in a system. As entropy increases, the level of disorder also increases. However, it is important to note that entropy and disorder are not the same thing and are not directly interchangeable.

5. Can entropy be decreased?

In an isolated system, entropy tends to increase over time. However, entropy can be decreased in a local region, for example by doing work on it or ordering its particles, but only at the cost of creating at least as much entropy elsewhere, so the total entropy of the system plus its surroundings still increases.
