What is entropy, as simply as possible?

  • #1
FeDeX_LaTeX
Gold Member

Main Question or Discussion Point

Hello;

Can someone explain to me what entropy is, as simply as possible? I've googled it and found loads of pages I can't understand (with things about thermodynamics). I think it has something to do with the second law of thermodynamics, but I still don't know what it actually is.

Thanks
 

Answers and Replies

  • #2


Entropy is basically how messed up everything is. If a system is orderly, it is in a low state of entropy; if it is disorderly and messed up, it has high entropy. You can also think of entropy in terms of heat: the hotter something is, the higher its entropy, because the particles move more and don't form orderly patterns.

A good example is a desk. In a state of low entropy, all the pens and pencils are neatly arranged. In a state of high entropy, they are randomly spread out. At maximum entropy they are randomly spread out, and the distribution across the desk is even.

The second law of thermodynamics states that in any spontaneous process (i.e. no outside interference once it gets going) the entropy of the Universe always increases. This applies to all reactions, even endothermic ones: the surroundings get colder, but the reacting system gains more entropy than the surroundings lose. That is why two liquids cannot combine to form a solid precipitate in an endothermic reaction.
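As a rough numerical sketch of that bookkeeping (the enthalpy and entropy values below are invented purely for illustration):

[code]
# Hedged sketch: second-law bookkeeping for a hypothetical endothermic process.
# All numbers are invented for illustration only.

delta_H = 25_000.0   # J/mol of heat absorbed from the surroundings (endothermic)
T = 298.0            # K, temperature of the surroundings

# The surroundings lose that heat, so their entropy drops:
delta_S_surroundings = -delta_H / T      # about -84 J/(mol K)

# For the process to be spontaneous, the system must gain at least
# that much entropy:
delta_S_system = 110.0                   # J/(mol K), assumed value

delta_S_universe = delta_S_system + delta_S_surroundings
print(delta_S_universe > 0)              # True -> allowed by the second law
[/code]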

Hope that helped
 
  • #3


Entropy is a measure of how likely the system is to stay "similar" over time.

For example, if all the gas molecules are in one corner, they will surely spread out. The system doesn't stay in the same state for long, so the entropy is low.

If all gas molecules are spread out in a room, they are going to bounce off each other, but most likely the overall look stays the same (i.e. molecules randomly, but evenly spread out). In this case the entropy is high.

But note that entropy can be defined for just about anything that has many configurations whose populations change with time.
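A minimal sketch of this in Python (a toy count, assuming each molecule independently sits in the left or right half of the room): the evenly spread arrangement corresponds to vastly more configurations than the all-in-one-half arrangement, which is why the spread-out look persists.

[code]
# Toy microstate count: N molecules, each in the left or right half of a room.
# The number of ways to have k molecules on the left is the binomial
# coefficient C(N, k).

from math import comb

N = 100  # a small toy number of molecules

ways_all_in_one_half = comb(N, 0)        # exactly 1 arrangement
ways_even_split      = comb(N, N // 2)   # ~1.0e29 arrangements

print(ways_all_in_one_half)              # 1
print(ways_even_split)                   # about 1.0e29
print(ways_even_split / 2**N)            # ~0.08 -> the single most likely split
[/code]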
 
  • #4
Mapes
Science Advisor
Homework Helper
Gold Member


Entropy is a measure of how many microstates (atomic arrangements and motions) are compatible with a system's macrostate (its pressure, volume, energy, etc., all the variables that we can observe). Although analogies like order/disorder can be useful, what I've given is the fundamental definition (quantified as [itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates).

For example, consider two counter-rotating wheels at a very low temperature (close to absolute zero) with total energy U due to the rotational kinetic energy. There's essentially only one microstate possible: each atom in the wheels rotating around the center axis. Since the temperature is so low, there's essentially no thermal energy. This is a low-entropy configuration.

Now imagine that the wheels are placed in contact so that they slow to a stop due to friction. The total energy is still U, but it is now entirely in the form of thermal energy. Since thermal energy involves random atomic motion, there are many, many, many possible atomic motions that would produce the thermal energy and finite temperature [itex]T>0\,\mathrm{K}[/itex] that we now measure. This is a high-entropy configuration.

The Second Law of Thermodynamics is merely the reasonable observation that if a system could either be in a high-entropy state or a low-entropy state, it will tend to be observed in a high-entropy state, simply because there are so many more microstates in the high-entropy state that the system can explore. A common analogy is a pair of dice: there are many ways to roll a total of six, but only one way to roll a total of twelve. So it's natural to conclude that we'll roll a total of six more often than twelve.
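A quick numerical check of the dice analogy (a minimal sketch counting ordered pairs of dice):

[code]
# Count the microstates (ordered pairs of dice) behind each macrostate
# (the total shown on the two dice).

from collections import Counter

totals = Counter(a + b for a in range(1, 7) for b in range(1, 7))

print(totals[6])    # 5 ways: (1,5), (2,4), (3,3), (4,2), (5,1)
print(totals[12])   # 1 way:  (6,6)
print(totals[7])    # 7 is the most likely total, with 6 ways
[/code]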

In gerenuk's gas example above, there are vastly more ways to arrange the molecules to fill the whole room compared to arranging them to fill half a room. In other words, there are many more microstates and therefore higher entropy corresponding to the first configuration. That's why we never, ever see the second configuration in reality for practical enclosure sizes.
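To connect this back to [itex]S\propto\ln\Omega[/itex], here is a small worked sketch. If each of N molecules can sit in either half of the room, letting the gas fill the whole room multiplies the number of microstates by [itex]2^N[/itex]; taking the proportionality constant to be Boltzmann's constant (Boltzmann's formula [itex]S=k_B\ln\Omega[/itex]), the entropy increase is [itex]\Delta S = N k_B \ln 2[/itex].

[code]
# Sketch: entropy gain when a gas confined to half a room spreads to fill
# the whole room. Each of N molecules gains a factor of 2 in available
# positions, so Omega grows by 2**N and S = k_B * ln(Omega) grows by
# N * k_B * ln(2).

from math import log

k_B = 1.380649e-23   # J/K, Boltzmann constant
N = 6.022e23         # roughly one mole of molecules

delta_S = N * k_B * log(2)
print(delta_S)       # ~5.76 J/K (equal to R * ln 2 for one mole)
[/code]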

Does this make sense?
 
  • #5
chiro
Science Advisor


Also, to add an information-theory point of view to what Mapes said: if we want to find the number of binary bits needed to represent M equally likely states, we end up with ln M/ln 2. The reason is that we are solving 2^x = M, and taking logs gives the appropriate expression x = ln M/ln 2.

In essence, entropy is a measure of how much "information" you need to represent the state of something. Above, I showed how to represent the state of a system in binary (as a number of bits). Typically we assign each state a probability of occurrence, weight the information content of each state by its probability, and calculate what is known as an expectation of that "random variable".

If we assume that each individual state is equally likely, we get a uniform distribution. Otherwise we get a non-uniform distribution, which means that some states are more likely than others. When the distribution is highly non-uniform (i.e. some states occur a lot more than others), the entropy decreases, because the outcome is less "random" than it would be with a uniform distribution.
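A minimal sketch of that last point in Python, using the Shannon entropy [itex]H=-\sum_i p_i\log_2 p_i[/itex] (in bits): it is largest for a uniform distribution and shrinks as the distribution becomes lopsided.

[code]
# Shannon entropy (in bits) of a uniform vs. a highly non-uniform
# distribution over four states.

from math import log2

def entropy(probs):
    """H = -sum(p * log2(p)), skipping zero-probability states."""
    return -sum(p * log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.97, 0.01, 0.01, 0.01]

print(entropy(uniform))   # 2.0 bits (= log2(4), the maximum for 4 states)
print(entropy(skewed))    # ~0.24 bits, much less "random"
[/code]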

Hope some of that helps
 
  • #6
FeDeX_LaTeX
Gold Member


Also, to add an information-theory point of view to what Mapes said: if we want to find the number of binary bits needed to represent M equally likely states, we end up with ln M/ln 2.
What does M represent? Can you show me an example calculation?

Although analogies like order/disorder can be useful, what I've given is the fundamental definition (quantified as [itex]S\propto\ln\Omega[/itex], where [itex]\Omega[/itex] is the number of microstates).
Can you show me an example calculation of how I would use this formula?
 
  • #7
chiro
Science Advisor


What does M represent? Can you show me an example calculation?



Can you show me an example calculation of how I would use this formula?
Let's think of this in the following way:

We have M possible microstates, each representing a different state, where each state is "atomic" with respect to every other (i.e. disjoint and indivisible).

Now let's say we represent the state in binary. To find out how many bits we need, given M microstates, we solve the equation

2^x = M.

Let's think about how many states are represented when x is an integer from 0 to 2.

If there is only 1 state, then we don't need any bits, since the probability of that state S is P(S) = 1. Because we know what the state is with 100% certainty, we need no information to represent it.

Let's say we have one bit. The number of states is 2, corresponding to {0,1}. Assuming a uniform distribution, P(X = 0) = P(X = 1) = 1/2.

Now let's consider the state space {00,01,10,11}. Here we have two bits, which give us 4 microstates, corresponding to 2^2 = 4.

Now, using this pattern, we can solve for the number of bits by reversing the equation 2^x = M. Taking logs of both sides in base 2, we get x = log_2(M) = log M/log 2.
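A one-line check of that formula (log_2(M) is the same as log M/log 2):

[code]
# Number of bits needed to label M equally likely microstates.

from math import ceil, log2

for M in (1, 2, 4, 8, 1000):
    x = log2(M)              # exact solution of 2**x = M
    print(M, x, ceil(x))     # ceil(x) whole bits if M is not a power of 2
[/code]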
 
  • #8


Almost two centuries and we're still clarifying what entropy is, so don't worry if you don't get it fast! :) People here have given you the "information theory" definition of entropy. But it is more than that. It is *also* (I mean, at the same time, no contradiction) a measure of how available the energy in your system is. If you have energy but it is highly entropic, it is good for nothing. If entropy is low, you can get work out of it. Check my blog entry https://www.physicsforums.com/blog.php?b=1864
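One standard way to make "how available the energy is" quantitative (not spelled out in the post above, so take it as a hedged sketch) is a free energy such as the Helmholtz free energy [itex]F = U - TS[/itex]: at fixed temperature only F is available as work, and the TS part is not. With invented numbers:

[code]
# Sketch: Helmholtz free energy F = U - T*S as the part of the energy U
# that is available for work at temperature T. Numbers are invented
# purely for illustration.

U = 1000.0   # J, total internal energy
T = 300.0    # K
S = 2.0      # J/K, entropy

F = U - T * S
print(F)     # 400.0 J available; the other 600 J is "tied up" by entropy
[/code]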
 
  • #9
Andy Resnick
Science Advisor
Education Advisor
Insights Author


I like seeing all the different answers; they illustrate that entropy is indeed not completely understood.

I like to think of entropy simply as 'energy unavailable to perform useful work'. Extracting work (whether mechanical, chemical, informational, etc.) from a system changes the system somehow; specifically, it moves the system towards equilibrium. Once a system is in equilibrium, no further work can be extracted.

Heat engines operate between 2 temperatures and extract work based on the temperature *difference*. Chemical reactions proceed due to the difference in concentration between reactant(s) and product(s). Memory devices store information and can be erased.
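For the heat-engine case, the standard Carnot bound (not stated explicitly above, so treat this as a hedged aside) makes the role of the temperature difference concrete: no engine running between [itex]T_{hot}[/itex] and [itex]T_{cold}[/itex] can convert more than a fraction [itex]1 - T_{cold}/T_{hot}[/itex] of the absorbed heat into work, and that fraction vanishes at equilibrium.

[code]
# Carnot limit on the fraction of heat an engine can turn into work when
# operating between a hot and a cold reservoir. Example temperatures only.

T_hot  = 500.0   # K
T_cold = 300.0   # K

efficiency = 1.0 - T_cold / T_hot
print(efficiency)            # 0.4 -> at most 40% of the heat becomes work

# As the two temperatures approach each other (equilibrium), the
# extractable work goes to zero:
print(1.0 - 300.0 / 300.0)   # 0.0
[/code]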

The most general statement about the production of entropy during an arbitrary process is the Clausius-Duhem inequality, and it shows the intimate relationship between entropy, heat, temperature and dissipation. The Clausius-Duhem inequality

http://en.wikipedia.org/wiki/Clausius–Duhem_inequality

is one of the most fundamental mathematical statements regarding allowable physical processes.
 
