What is entropy, at the most fundamental level?

In summary, entropy is a measure of the randomness or lack of organization of a system. The concept becomes useful only once we make a key assumption: that every allowable state of the entire interacting system is equally likely.
  • #1
CraigH
On the atomic level, what is entropy? How can I visualise it?

Thanks
 
  • #2
Entropy does not exist at a "fundamental level". It requires two levels - a "fundamental" or "microscopic" level, and a "coarse-grained" or "macroscopic" level. In an ideal gas, a state at the fundamental level is specified by stating the position and momentum of all the individual particles, while a state at the coarse-grained level is specified by stating pressure, temperature, volume as measured by a barometer, thermometer etc. Many different fundamental states correspond to the same coarse-grained state. The number of different fundamental states that correspond to a coarse-grained state is related to the entropy.
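
To make the counting concrete, here is a minimal Python sketch (my own toy example, not something from the reply above): N particles each sit in the left or right half of a box, the coarse-grained state is just how many are on the left, and the entropy of that coarse-grained state is taken as the log of the number of fundamental states compatible with it.

```python
# Toy illustration (assumed setup, not from the post): count how many
# fundamental (microscopic) states correspond to each coarse-grained
# (macroscopic) state for N distinguishable particles, each of which can
# sit in the left or right half of a box. Entropy is taken as S = ln(W),
# i.e. Boltzmann's S = k ln W with k set to 1.
from math import comb, log

N = 10  # number of particles (arbitrary toy value)

for n_left in range(N + 1):
    W = comb(N, n_left)   # fundamental states with n_left particles on the left
    S = log(W)            # entropy of this coarse-grained state
    print(f"{n_left:2d} on the left: W = {W:4d}, S = {S:.3f}")
```

The even 50/50 split has by far the most fundamental states, which is why it is the coarse-grained state you actually observe.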
 
  • #3
The reply above is also described here:

http://en.wikipedia.org/wiki/Entropy#Thermodynamical_and_statistical_descriptions

I still remember the first time I heard about entropy in college: I became suspicious when the professor gave exactly the same material in class as was in the book. Either he did not want to confuse us with other perspectives, or he did not understand entropy himself. [Richard Feynman has commented that HIS professor did NOT understand entropy, and Feynman later discovered some errors HE was taught. He has said he initially did not understand it either, but was 'best in his class at it'.] I came out of that one class session thinking "Either the prof is dopey on this subject or it's me"...well, I was wrong about that; I think we were both 'dopey' about it.

If you can 'visualize' entropy you are better than most. A usual perspective is putting gas in a container: entropy increases as the gas becomes uniformly distributed throughout the container. It's more random. That's increasing entropy. But the universe started out nearly uniform, yet is 'clumping' into planets, stars, galaxies, etc., so why is THAT also entropy increasing? Well, it turns out that WITH gravitational effects, increasing entropy corresponds to a 'clumpy', not uniform, distribution. So, for example, a hot gas spread over a galactic-scale region and under the influence of gravity will CLUMP...and entropy increases.

When Claude Shannon first developed information theory at Bell Telephone Labs, he discussed his theory with a famous colleague...I can't think of the name...anyway his colleague told him "Call it 'entropy'...nobody understands that and you'll be in a much stronger position to defend your findings!", or something very close to that! Entropy can be viewed as a subset of information theory: when one goes up the other goes down. If a gas is uniformly distributed in a container, it's kind of 'uninteresting'...it doesn't hold much information...one part is pretty much the same as all the others. But if all the gas molecules were randomly 'stuck in a corner', now THAT would be interesting!
 
  • #4
Okay, I seem to get the impression that the best way to describe entropy is the randomness of the particles (be the particles atoms, molecules, or lumps of matter). So in a system, if the particles are organised there is less entropy compared to if they are randomly distributed.
So what are the implications and consequences of this? Why does this matter? Why was this concept of entropy invented?
In particular, what are the implications in IC engines, gas turbines, and steam generators?
Are there any formulas where entropy is involved?
These are the three case studies that I am studying, and I have an exam in three days, so it would really help if someone could help me out with this!
 
  • #5
Do you find the concept of randomness any clearer than that of entropy?
Randomness is also a much misunderstood concept.
And how about the concept of information or the nature of 'states'?

I did like atyy's summary. It fitted a great deal into few words.

I would seriously advise getting a good hold of the thermodynamic version of entropy first as it is much easier to understand. This corresponds to atyy's macroscopic or coarse grained entropy.
 
  • #6
The most crucial thing to understand about the entropy concept is that it becomes useful only when you make a key assumption-- that every allowable state (that is, every state that satisfies all the known constraints you have about that system) of the entire interacting system (not a piece of the system, the whole enchilada) is equally likely. This is critical, because it means if you want to know the relative probability of a given class of states, all you have to do is compare the number of states in that class to the number of states in the other classes. The entropy is essentially the natural log of the number of states in the class, so it is the natural log of the "relative probability" of that coarse-grained class.

The reason we want the natural log, not the probability itself, is that we want entropies to add-- we want the entropy of two individual systems to add up to the entropy of the combined system. If we were using probabilities, we'd have to multiply them, because that's how probabilities work-- the probability of both classes being observed is the product of both probabilities. By taking the natural log, now the natural log of the product of probabilities is the sum of the natural log of each probability, so we gain additivity by using that definition of entropy.

So entropy gives us two things-- a way to compare probabilities of various classes of states, and an ability to add entropies of the components of a larger system to get the entropy of the larger system. That's what gives the concept its power. For example, we then get the second law of thermodynamics, which says that very large systems, whose classes of states involve ghastly many individual states, will never be seen to reduce their entropy, because they will not evolve from a more likely set of states to a less likely set of states (note this is not true of parts of the system, only of the whole system). This follows simply from the fact that when the number of states involved is huge, the different classes of states will generally contain vastly different numbers of states, and so will have vastly different probabilities of happening, and so the most likely will always be the one that reliably happens. Pieces of the system can evolve to less populated classes of states, but only if in so doing, the rest of the system gains access to even more heavily populated classes of states, such that the sum of the entropies does not decrease.
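
As a small numerical illustration of the additivity point (my own sketch, with made-up microstate counts): if subsystem A has W_a equally likely states and subsystem B has W_b, the combined system has W_a*W_b states, so defining entropy as the natural log of the count makes the entropies add.

```python
# Sketch with arbitrary numbers: additivity of entropy under S = ln(W).
from math import log, isclose

W_a, W_b = 1_000, 50_000          # assumed microstate counts for two subsystems
S_a, S_b = log(W_a), log(W_b)     # individual entropies (k = 1)
S_total = log(W_a * W_b)          # entropy of the combined system

assert isclose(S_total, S_a + S_b)   # ln(W_a * W_b) = ln(W_a) + ln(W_b)
print(S_a, S_b, S_total)
```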
 
  • #7
Naty1 said:
Entropy can be viewed as a subset of information theory: when one goes up the other goes down. If a gas is uniformly distributed in a container, it's kind of 'uninteresting'...it doesn't hold much information...one part is pretty much the same as all the others.
Believe it or not, information content is the same thing as Shannon entropy, so they both increase together. The idea that the more ordered state contains more information seems intuitive, but it's contrary to the formal meaning of information. One way to think of the "information" is the number of yes/no questions you would need to answer in order to specify the complete state of the system. If the particles are cloistered in one corner, that reduces the number of questions you need to ask to locate them all. Same with a series of binary digits-- a series that repeats contains less information, because once you see the pattern, you don't need to ask any more questions about it, but a random series of n digits requires the maximum, n, number of questions to specify.

So how does that make sense? If we are listening to Morse code, a monkey will send a series of random dots and dashes, but a person who knows Morse code will send a message; it will be less random. Surely the message contains more information? Actually, no, because the reason the message means something to us is precisely because it contains less information. Meaning doesn't come from information, it comes from redundancy-- if you repeat SOS over and over, I know that there is some meaning I am supposed to be extracting from that, at the expense of all the possible things you could have meant. So you are sacrificing information, which is a potentiality of meaning, in favor of redundancy, which is actual meaning. The less information you convey, the clearer I am about what meaning you are trying to convey, because I can use the redundancy to whittle down the potential meanings into the actual meaning. So meaning and information are not only very different, they are actually complementary concepts. When entropy is large, information content is large, even though meaning is minimal. Something that is highly ordered conveys meaning, by conveying far less information than would actually be possible with that number of characters or bits.
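
To put a number on the "yes/no questions" picture, here is a rough Python sketch (my own illustration, with an arbitrary 16-cell container): the Shannon entropy in bits is the average number of yes/no questions needed to locate a particle, and cloistering it in a corner lowers that number.

```python
# Assumed toy model: one particle in a container divided into 16 cells.
from math import log2

def shannon_entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), skipping zero-probability cells."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

cells = 16
uniform  = [1 / cells] * cells                 # spread through the whole container
cornered = [1 / 4] * 4 + [0.0] * (cells - 4)   # confined to a 4-cell corner

print(shannon_entropy_bits(uniform))   # 4.0 bits: four yes/no questions to locate it
print(shannon_entropy_bits(cornered))  # 2.0 bits: only two questions needed
```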
 
  • #8
Studiot said:
Do you find the concept of randomness any clearer than that of entropy?
Randomness is also a much misunderstood concept.
And how about the concept of information or the nature of 'states'?

I did like atyy's summary. It fitted a great deal into few words.

I would seriously advise getting a good hold of the thermodynamic version of entropy first as it is much easier to understand. This corresponds to atyy's macroscopic or coarse grained entropy.

I find randomness to be a very clear concept, at least when I think about it in terms of Kolmogorov complexity. I wonder if Clausius, Boltzmann, Gibbs, and others would have formulated entropy differently if this mathematical tool had existed a century earlier.

KC turns out to be another way to talk about information and entropy (both the entropy we get from physics, S, and the one we get from information theory, H), though in that context it's usually called algorithmic information theory, not KC.

That said, I think Studiot brings up a good point, as the relationship between information theory, randomness, and statistical mechanics/thermodynamics is an active area of research. (There's some pretty exciting stuff out there, starting with Landauer's Principle.) Hell, the relationship between information theory, randomness, and all of physics still isn't well understood.

In short, I'm with Studiot, you'd be better off learning about entropy using the traditional approach. But take it with a grain of salt, because nobody fully understands entropy anyways.
 
  • #9
Thanks everyone, all great answers. I think I've got a good enough grasp of entropy to pass my exam now :)
 
  • #10
CraigH said:
On the atomic level, what is entropy? How can I visualise it?

Thanks

These are two entirely different questions. The precise definition of entropy on the atomic level involves probability and statistics. The mathematics of probability is often complex and confusing. In many practical problems, this is not a good way to visualize it.
However, scientists were successfully using the concept of entropy long before Boltzmann figured out the relationship between entropy and probability. They had a good way to visualize entropy on a macroscopic level that has nothing to do with probability. Let me present it.
Entropy is an indestructible fluid. Temperature is the pressure that entropy exerts on itself.
In fact, lots of scientists thought of it that way. For example, Lavoisier listed heat (i.e., entropy) as one of the elements. Carnot thought of caloric (i.e., entropy) as a fluid.
For a long time, scientists were divided over whether heat is a fluid or a state of motion. Actually, heat is both. This may be the first weird dichotomy in physics. The dichotomy between heat as a fluid and heat as a state of motion ranks right up there with wave-particle duality (in my opinion).
On the atomic level, heat is a form of motion. However, chemistry is not always done on an atomic level.
Note that all four laws of thermodynamics can be derived with these hypotheses. Note that entropy can be created in an irreversible reaction. However, it can never be destroyed.
This "fluid" visualization has one unintuitive hypothesis. In a reversible reaction, not entropy is created. In a reversible reaction, entropy is only moved.

This way of visualizing entropy has worked very well for me. I don't have links. However, here are citations to the two articles where I got this picture of entropy.
1. Hans U. Fuchs, "Entropy in the Teaching of Thermodynamics," American Journal of Physics 55(3), 215-219 (March 1987).
2. Hans U. Fuchs, "A Surrealist Tale of Electricity," 54(10), 907-909 (October 1987).

The second reference is more of a parody than an article; the parody was meant to motivate the historical presentation in the first one. The first article is mostly historical rather than mathematical. I don't know if there are any free links out there for these articles. I hope there are.

I checked some of the historical facts with some older articles. Again, I don't have any links. However, I did read English translations of:

3) Sadi Carnot, "Reflections on the Motive Power of Fire" (1824).
4) E. Clapeyron, "Memoir on the Motive Power of Heat" (1834).

Dover has an inexpensive book with English translations of the last two. Again, I can only hope there is a free link out there.
 
  • #11
Entropy does not exist. Here are two references:
Shufeng Zhang, "Entropy: A concept that is not a physical quantity," Physics Essays 25, 2 (2012). Abstract: "This study shows that entropy is not a physical quantity, that is, the physical quantity called 'entropy' does not exist."

Entropy Does Not Exist
http://astranaut.org/blog/archives/2005_05_entropy_does_not_exist.php [Broken]
 

1. What is the definition of entropy?

Entropy is a measure of the disorder or randomness of a system. It is a fundamental concept in thermodynamics and statistical mechanics, and it is used to describe the tendency of a system to move towards a state of maximum disorder.

2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time, and it increases in any irreversible process. This means that such systems tend to become more disordered and less organized over time, eventually reaching a state of maximum entropy.

3. What are some examples of entropy in everyday life?

Examples of entropy in everyday life include the melting of ice cubes, the rusting of metal, and the mixing of hot and cold fluids. In these cases, energy is dispersed and the system becomes more disordered, increasing its entropy.

4. How is entropy calculated?

The basic formula for an entropy change is ΔS = Q/T, where ΔS is the change in entropy, Q is the heat transferred reversibly, and T is the absolute temperature in kelvin. This formula expresses the relationship between entropy and energy dispersal.
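
As a hedged worked example of that formula (my own numbers, using the standard latent heat of fusion of water, roughly 334 kJ/kg): melting ice at its melting point is heat transferred at essentially constant temperature, so ΔS = Q/T applies directly.

```python
# Assumed quantities: 100 g of ice melting at 0 °C.
m = 0.1                # kg of ice
L_fusion = 334_000.0   # J/kg, approximate latent heat of fusion of water
T_melt = 273.15        # K, melting point of ice

Q = m * L_fusion       # heat absorbed by the ice
delta_S = Q / T_melt   # entropy gained by the ice
print(f"Q = {Q:.0f} J, ΔS = {delta_S:.1f} J/K")   # about 122 J/K
```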

5. Can entropy be reversed?

While local decreases in entropy can occur, the overall trend is always towards an increase in entropy. This is due to the second law of thermodynamics, which states that isolated systems naturally tend towards a state of maximum entropy. However, energy can be used to decrease the entropy of one part of a system, as in the formation of crystals or the organization of living organisms, at the cost of a larger entropy increase elsewhere.
