Solving the Mystery of Entropy

In summary, entropy is a concept that can be difficult to understand and can appear to contradict the conservation of energy. However, it is based on the probability of a given state and, in the view of many, is only meaningful locally. The dissipation of energy as heat does not mean the energy is destroyed; it is simply no longer recoverable as useful work.
  • #1
martix
I've always found entropy a hard concept to grasp. Sometimes I read something that seems to make it clearer; then I read something else that completely disturbs my understanding of the idea.

My current problem with entropy is the apparent violation of the conservation laws, and the fact that genuinely irreversible processes exist.
I read this wiki article (http://en.wikipedia.org/wiki/Irreversibility), which states: "During [state] transformation, there will be a certain amount of heat energy loss or dissipation due to intermolecular friction and collisions; energy that will not be recoverable if the process is reversed."
But that would mean energy loss, and taken on the scale of the whole universe it would mean that energy is destroyed...
Explain please.
 
  • #2
That statement does not mean that energy is "destroyed"; it merely says that some of the energy after the transformation is in the form of heat (which is just "useless" energy).

The total energy of a closed system is always conserved; but whenever we use energy to do something useful, some of that energy (regardless of whether it is electrical, chemical, etc.) will always be converted to heat.
 
  • #3
martix said:
My current problem with entropy is the apparent violation of the conservation laws. And the fact that genuinely irreversible processes exist.

The concept of entropy is consistent with locally reversible processes.

I read this wiki article (http://en.wikipedia.org/wiki/Irreversibility), which states: "During [state] transformation, there will be a certain amount of heat energy loss or dissipation due to intermolecular friction and collisions; energy that will not be recoverable if the process is reversed."
But that would mean energy loss, and taken on the scale of the whole universe it would mean that energy is destroyed...
Explain please.

Entropy is based on the probability that a given state of a system can exist out of n possibilities. So if each state is equally probable, the probability of a given (observed) state is 1/n. This is usually expressed as a logarithmic function of p: S = -k ln(p), where k is a constant; for equally probable states this gives S = -k ln(1/n) = k ln(n). In the thermodynamic case k is usually the Boltzmann constant. (In information theory the constant is usually 1 and the log base is 2.) Entropy only has meaning (in the opinion of many) locally. As far as I know, modern physical theory doesn't attempt to describe the entropy of the whole universe.
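For anyone who wants to see the arithmetic, here is a minimal sketch (my own illustration, not part of the original post) that evaluates S = -k ln(p) for equally probable states, once with the Boltzmann constant and once with the information-theory convention (k = 1, log base 2):

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def entropy_thermo(n_states: int) -> float:
    """Entropy in J/K for n equally probable microstates: S = -k ln(1/n) = k ln(n)."""
    p = 1.0 / n_states
    return -K_BOLTZMANN * math.log(p)

def entropy_info(n_states: int) -> float:
    """Entropy in bits: same formula with k = 1 and log base 2."""
    p = 1.0 / n_states
    return -math.log2(p)

print(entropy_thermo(8))  # 8 microstates -> k * ln(8), roughly 2.87e-23 J/K
print(entropy_info(8))    # 8 equally likely outcomes -> 3 bits
```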

In any case, energy is not destroyed. It's simply dissipated as heat. It may not be recoverable, but it is not destroyed.
 
  • #4
SW VandeCarr did a good job of explaining it - I just wanted to add that if taking the logarithm of the number of accessible states seems mysterious, it is only done because it makes a lot of other math work out nicely. If you wanted to, you could define entropy as the number of accessible states (instead of the logarithm of it), but your math would end up being a lot uglier. Physically, though, the results would be identical.
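To make the "math works out nicely" point concrete, here is a small sketch (my own illustration, with made-up state counts) showing that when two independent subsystems are combined, their state counts multiply while the log-based entropies simply add:

```python
import math

n_A = 1000      # accessible states of subsystem A (made-up number)
n_B = 500       # accessible states of subsystem B (made-up number)

n_total = n_A * n_B                      # combined system: state counts multiply
S_A, S_B = math.log(n_A), math.log(n_B)  # log-based entropy of each part
S_total = math.log(n_total)              # log-based entropy of the whole

# The entropy of the whole is the sum of the parts, which keeps the math tidy.
print(math.isclose(S_total, S_A + S_B))  # True
```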
 

1. What is entropy?

Entropy is a measure of disorder or randomness in a system. It is a concept that originated in thermodynamics, but has since been applied to various fields, including information theory and statistics.

2. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system always increases over time. This means that systems tend to become more disordered and chaotic over time, rather than becoming more organized.
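As a rough illustration (my own addition, not from the original FAQ), this tendency can be seen in a toy two-box model: molecules hop randomly between two boxes, and the entropy computed from the number of microstates for each split drifts upward toward the evenly mixed state.

```python
import math, random

# Toy two-box gas model: N molecules start in the left box; each step a random
# molecule hops to the other box. The number of microstates for a given split
# (n_left, N - n_left) is the binomial coefficient, so S = ln C(N, n_left).
random.seed(0)
N = 100
left = set(range(N))  # all molecules start in the left box

def entropy(n_left: int) -> float:
    return math.log(math.comb(N, n_left))

for step in range(2001):
    mol = random.randrange(N)
    left.symmetric_difference_update({mol})  # toggle which box the molecule is in
    if step % 500 == 0:
        print(step, len(left), round(entropy(len(left)), 2))  # entropy rises toward ~66.8
```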

3. How do scientists measure entropy?

Entropy can be measured in different ways depending on the system being studied. In thermodynamics, it is often measured as the heat transferred reversibly divided by the temperature (ΔS = Q_rev/T). In information theory, it is measured as the amount of uncertainty in a system.
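A quick numeric sketch of the thermodynamic definition, using the textbook value for the latent heat of fusion of ice (an assumption on my part, not stated in the thread):

```python
# At constant temperature, ΔS = Q_rev / T. Melting 1 kg of ice at 0 °C:
latent_heat_fusion = 334_000.0   # J per kg of ice melted (approximate textbook value)
T_melt = 273.15                  # K, melting point of ice at 1 atm

mass = 1.0                       # kg of ice
Q = mass * latent_heat_fusion    # heat absorbed reversibly at T_melt
delta_S = Q / T_melt             # entropy gained by the melting ice

print(f"dS = {delta_S:.0f} J/K")  # roughly 1223 J/K
```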

4. Can entropy be reversed?

The second law of thermodynamics states that the total entropy of an isolated system will always increase. However, in certain cases, entropy can be reduced in a local system by expending energy. For example, a living organism can maintain a low entropy state by expending energy to maintain its structure and order.

5. What practical applications does understanding entropy have?

Understanding entropy has practical applications in fields such as engineering, information theory, and biology. It helps us understand and predict the behavior of complex systems, such as chemical reactions and biological processes. It also plays a crucial role in the design of efficient and sustainable energy systems.
