Brief explanation of what entropy is?

In summary, entropy is a measure of the disorder or randomness in a system. Its tendency to increase gives rise to the "arrow of time": systems naturally move from more ordered to less ordered states. In thermodynamics, entropy measures the amount of energy in a system that is unavailable for doing work; in information theory, it measures the uncertainty or randomness in a message. The concept is used across physics, chemistry, and biology to describe the degree of disorder in a system and the direction of change, and it plays a fundamental role in understanding the behavior and evolution of systems in the natural world.
  • #1
Benny
Hi, could someone please give me a brief explanation of what entropy is? Due to my limited understanding of what entropy actually is, most of the things I've read about it seem very vague. I've only seen entropy popping up in integrals and sometimes in comments along the lines of entropy being a property of the system.

But is that all there is to it? I mean, surely there must be some kind of 'physical interpretation.' I can relate things like heat transfer and work done to other things I know about, but I just can't grasp what entropy is.

Any help would be good, thanks.
 
  • #2
Entropy is the randomness in a system. For example, in chemistry the randomness goes like this:
gases >> solutions > liquids >> solids

The formula is dS = δQ/T, where S is the entropy, δQ is the amount of heat absorbed in a reversible process, and T is the temperature.
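To see how the formula is used, here is a minimal Python sketch (my own illustration, not from the thread) that applies dS = δQ/T to water heated reversibly at constant pressure, where δQ = m·c·dT integrates to ΔS = m·c·ln(T2/T1):

import math

# Entropy change of water heated reversibly from T1 to T2 at constant pressure.
# Integrating dS = δQ/T with δQ = m*c*dT gives ΔS = m*c*ln(T2/T1).
m = 1.0      # mass of water in kg (assumed value)
c = 4186.0   # specific heat of water in J/(kg·K)
T1 = 293.15  # initial temperature in K (assumed value)
T2 = 353.15  # final temperature in K (assumed value)

delta_S = m * c * math.log(T2 / T1)
print(f"delta S = {delta_S:.1f} J/K")  # ~ 780 J/K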
 
  • #3
Benny said:
Hi, could someone please give me a brief explanation of what entropy is? Due to my limited understanding of what entropy actually is, most of the things I've read about it seem very vague. I've only seen entropy popping up in integrals and sometimes in comments along the lines of entropy being a property of the system.

But is that all there is to it? I mean, surely there must be some kind of 'physical interpretation.' I can relate things like heat transfer and work done to other things I know about, but I just can't grasp what entropy is.

Any help would be good, thanks.
Entropy is not an easy concept to grasp. There are a lot of subtleties. I would suggest you read as much as you can about it, starting with the history (see http://en.wikipedia.org/wiki/Entropy , which is pretty good).

Absolute entropy is not really a very useful concept, so don't worry too much about trying to understand what 'entropy' is physically. Mathematically, a change in entropy is δQ/T for heat transferred reversibly at temperature T. Change in entropy is the useful quantity: it tells us which direction a thermodynamic process will naturally go, and it tells us how much of the heat can be used to perform useful work.
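As a sketch of that last point (my own illustration, not AM's), the entropy balance fixes how much heat can become work: a reversible engine keeps total entropy constant, Q_h/T_h = Q_c/T_c, which gives the Carnot limit W_max = Q_h(1 - T_c/T_h):

def max_work(Q_h: float, T_h: float, T_c: float) -> float:
    # Maximum work (J) from heat Q_h (J) drawn at T_h (K), rejected at T_c (K).
    # Extracting any more work would require the total entropy to decrease.
    return Q_h * (1.0 - T_c / T_h)

print(max_work(1000.0, 600.0, 300.0))  # 500.0 J: the other half is unavailable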

Good luck.

AM
 
  • #4
Thanks for the explanations guys, much appreciated.
 
  • #5
The 2nd Law of Thermodynamics states that the total entropy of the universe always increases with time. As the universe expands, it gets more and more disordered.

It's sort of like the idea that if you leave your room unmanaged, you'll find it quite messy after a week. Similarly, a cup made of china is in an ordered state, but when you break it, entropy has increased since the pieces are less ordered.
 
  • #6
QuantumCrash said:
The 2nd Law of Thermodynamics states that the total entropy of the universe always increases with time. As the universe expands, it gets more and more disordered.

It's sort of like the idea that if you leave your room unmanaged, you'll find it quite messy after a week. Similarly, a cup made of china is in an ordered state, but when you break it, entropy has increased since the pieces are less ordered.
You should be very cautious about equating entropy with disorder. You have to define disorder first. For a thermodynamic system, which by definition is constantly changing, it is not easy to see how 'order' changes from one state to another. It is hard to see "order" in an inherently chaotic system. It is better, in my view, to think of entropy as a measure of how dispersed the energy is. As energy becomes more dispersed, the entropy increases. As a small hot object transfers its heat to a large cooler object, the energy becomes less concentrated and entropy increases.
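A quick numerical sketch of that dispersal argument (my own example, with assumed values): when heat Q flows from a small hot object to a large cool one, the hot side loses Q/T_hot of entropy while the cold side gains the larger amount Q/T_cold, so the total goes up:

Q = 100.0       # heat transferred in J (assumed value)
T_hot = 500.0   # temperature of the small hot object in K (assumed value)
T_cold = 300.0  # temperature of the large cool object in K (assumed value)

dS_hot = -Q / T_hot      # hot object loses entropy: -0.200 J/K
dS_cold = Q / T_cold     # cold object gains entropy: +0.333 J/K
print(dS_hot + dS_cold)  # ~ +0.133 J/K: total entropy increases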

Have a look at the paper http://www.entropysite.com/teaching_entropy.html

AM
 
  • #7
QuantumCrash said:
It's sort of like the idea that if you leave your room unmanaged, you'll find it quite messy after a week. Similarly, a cup made of china is in an ordered state, but when you break it, entropy has increased since the pieces are less ordered.

You appear to be ascribing order from a humanist point of view, which isn't correct. Remember the amount of 'energy' that is required to fix the cup in the analogy you used; this is fundamental to understanding entropy within a physical system.

Somebody's been reading 'A Brief History'.
 
  • #8
I know that, but entropy has quite often been defined as a 'measure of disorder', has it not, though the accuracy of this is apparently questionable. Indeed, I don't think there is any particular quantity that actually measures disorder, but surely you can admit that it comes close.

Probably, it would be better to say that entropy shows what is most probable.

Well, I would have thought a 'humanist's' POV, albeit a less mathematical one than the formal mathematical definition, would be a nicer starting point than jumping straight into the formalism.

However, perhaps someone might explain to me exactly why quite a number of people (including Hawking) attempt to describe it using disorder, then, if it is misleading.
 
  • #9
QuantumCrash said:
I know that, but entropy has quite often been defined as a 'measure of disorder', has it not, though the accuracy of this is apparently questionable. Indeed, I don't think there is any particular quantity that actually measures disorder, but surely you can admit that it comes close.

Probably, it would be better to say that entropy shows what is most probable.

Well, I would have thought a 'humanist's' POV, albeit a less mathematical one than the formal mathematical definition, would be a nicer starting point than jumping straight into the formalism.

However, perhaps someone might explain to me exactly why quite a number of people (including Hawking) attempt to describe it using disorder, then, if it is misleading.
That was the popular way to explain entropy when Hawking was taught thermodynamics. It was not clear then, and it is still not clear now.

Notice that Hawking does not provide a definition of disorder. Indeed some of his examples are misleading. A container of air with all the air in one half is not necessarily more ordered than a container with air evenly distributed throughout the whole volume. There is chaos at the microscopic level in both. But the energy is more concentrated in the smaller volume and more dispersed in the larger volume.
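That volume comparison can be made quantitative (my own sketch, not from the post): for an ideal gas spreading at constant temperature from half the container into the whole volume, the entropy increase depends only on the volume ratio, ΔS = nR ln(V2/V1):

import math

R = 8.314  # gas constant in J/(mol·K)
n = 1.0    # moles of gas (assumed value)

# Free expansion from V to 2V at constant temperature.
delta_S = n * R * math.log(2.0)
print(f"delta S = {delta_S:.2f} J/K")  # ~ 5.76 J/K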

AM
 

1. What is entropy?

Entropy is a measure of disorder or randomness in a system. It is a concept from thermodynamics that describes the tendency of systems to move from a state of order to a state of disorder.

2. How does entropy relate to energy?

Entropy and energy are closely related. As energy is transferred or converted within a system, the level of entropy typically increases. This means that energy tends to spread out and become less concentrated, leading to a more disordered state.

3. Can entropy decrease?

In an isolated system, entropy tends to increase over time due to the second law of thermodynamics. However, in open systems, where energy and matter can enter and leave, localized decreases in entropy can occur. This is sometimes referred to as negative entropy or "negentropy."

4. What are some real-life examples of entropy?

Examples of entropy can be seen all around us, such as a messy room becoming messier over time, a hot cup of coffee cooling down, or an ice cube melting in a glass of water. In all of these cases, the system is moving towards a more disordered state, with entropy increasing.
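As a worked version of one of these (a sketch with assumed values), the entropy gained by an ice cube melting at 0 °C follows directly from ΔS = Q/T with Q = m·L_f, where L_f is the latent heat of fusion:

m = 0.05        # mass of the ice cube in kg (assumed value)
L_f = 334000.0  # latent heat of fusion of water in J/kg
T = 273.15      # melting point in K

delta_S = m * L_f / T  # entropy gained by the melting ice
print(f"delta S = {delta_S:.1f} J/K")  # ~ 61.1 J/K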

5. How is entropy used in other fields besides thermodynamics?

Entropy is a fundamental concept in thermodynamics, but it also has applications in other areas such as information theory and computer science. In these fields, entropy is used to measure the amount of uncertainty or randomness in a system or in a set of data.
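As a concrete illustration of the information-theoretic usage (my own sketch, not from the FAQ), Shannon entropy H = -Σ p_i log2(p_i) measures the average uncertainty of a message in bits:

from collections import Counter
import math

def shannon_entropy(message: str) -> float:
    # H = -sum(p * log2(p)) over the symbol frequencies in the message.
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits: a perfectly predictable message
print(shannon_entropy("abcd"))  # 2.0 bits: four equally likely symbols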
