Exploring Alternative Definitions and Applications of Entropy

In summary, the definitions of entropy in thermodynamics and in statistical mechanics differ, but they assign the same value to a system. Neither definition applies to a system of only two bodies. In the example given, the entropy does not increase and there is no arrow of time. Entropy arguments are valid only for macroscopic systems, and even then they can fail to a small extent. Moreover, for any other (less extreme) process the state-return time is much shorter.
  • #1
fluidistic
After having completed 2 years of a Bachelor's degree, the only definition of entropy I know is [tex]\Delta S = \int_1^2 \frac{dQ}{T}[/tex]. I realize it's the change of entropy rather than the entropy of a system.

My question is: "Is this the only definition of entropy?" I've seen on Wikipedia and in Fundamentals of Physics (Resnick-Halliday) the definition [tex]S=k \ln \Omega[/tex], but I never learned it, nor do I understand what it means.

Consider the case of 2 isolated iron spheres in the Universe, separated by a distance d. As they get closer and closer to each other, does the entropy of the system (the 2 spheres) increase? With the definition I have of entropy, heat is not involved, so the formula is not useful.
Because I realize that the 2 spheres will only get closer and closer, and if I could record a film of the motion, I'd know instantly whether the film is being played in reverse or not.
Assuming that yes, entropy increases in the example... I have another question:

Do you buy that it's IMPOSSIBLE for a sphere not to get closer to the other?

It's different from the case of a gas confined in a 1 m^3 cube, where we want to know whether the film is running in reverse. There is a small probability that, say, 10^24 particles are found in 10 cm^3 rather than spread through the 1 m^3 of the container. Hence, looking at the film, although I would almost always be right in telling its direction, I could still be wrong. (OK, I realize this won't happen within a very long time, even one much greater than the current age of the Universe, but I consider it an improbability, not an impossibility.)
I wrote the last paragraph to show the distinction between what I consider impossible and what I consider improbable but possible. I would like to know if it is possible but improbable that the 2 spheres reduce their acceleration, even for a very short time, while they're getting closer to each other. I believe it's impossible and would violate Newton's laws (I'm pretty sure it would also violate those of Relativity).
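To put a number on just how improbable that spontaneous compression of the gas is, here's a minimal back-of-the-envelope sketch in Python (my own illustration, assuming non-interacting particles, each placed independently and uniformly in the box):

```python
import math

N = 10**24        # number of gas particles
V_box = 1.0       # container volume in m^3
V_small = 10e-6   # 10 cm^3 expressed in m^3

# Each particle sits in the small region with probability V_small/V_box,
# so P(all N inside) = (V_small/V_box)**N. That underflows to 0 in floating
# point, so we work with the base-10 logarithm instead.
log10_p = N * math.log10(V_small / V_box)

print(f"log10 of the probability: {log10_p:.3g}")
```

The result is a probability of about 10^(-5 x 10^24): astronomically improbable, but not zero, which is exactly the improbable-versus-impossible distinction above.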

Thanks for all.
 
  • #2
Hi.
The definitions of entropy in thermodynamics and in statistical mechanics are different, but they assign the same value to a system.
Both thermodynamics and statistical mechanics deal with systems of a great many particles, so neither is applicable to your case of only two bodies. Gravity between the bodies can pull them closer.
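As a concrete check that the two definitions agree (a standard textbook example, the isothermal expansion of an ideal gas of N particles from [itex]V_1[/itex] to [itex]V_2[/itex]): at constant T the gas absorbs heat [itex]Q = NkT\ln(V_2/V_1)[/itex], so thermodynamically

[tex]\Delta S = \int_1^2 \frac{dQ}{T} = \frac{Q}{T} = Nk\ln\frac{V_2}{V_1},[/tex]

while statistically each particle's accessible positions scale with the volume, so [itex]\Omega \propto V^N[/itex] and

[tex]\Delta S = k\ln\frac{\Omega_2}{\Omega_1} = k\ln\left(\frac{V_2}{V_1}\right)^{N} = Nk\ln\frac{V_2}{V_1}.[/tex]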
Regards.
 
  • #3
fluidistic said:
My question is "Is this the only definition of entropy"? I've seen in wikipedia and Fundamentals of Physics (Resnick-Halliday) the definition [tex]S=k \ln \Omega[/tex] but I never learned it nor do I understand what it means.
The second can be used to derive the first, but not vice versa (to my knowledge).
I posted an outline of the proof in
https://www.physicsforums.com/showthread.php?t=353528&page=2

You can also see
https://www.physicsforums.com/showthread.php?t=365823
first, to understand what the number of possible realizations [itex]\Omega[/itex] means.

You are right that the first, thermodynamic definition is useful only for ordinary thermodynamics.
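To get a feel for what [itex]\Omega[/itex] counts, here is a toy Python sketch (my own illustration, not the derivation from the linked thread): N distinguishable particles in a box divided into two halves, where a macrostate is "n_left particles on the left" and [itex]\Omega[/itex] is the number of ways to realize it.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n_left):
    """S = k ln(Omega) for N distinguishable particles split between two
    half-boxes; Omega = number of ways to choose which n_left sit on the left."""
    omega = math.comb(N, n_left)
    return k_B * math.log(omega)

N = 100
for n_left in (0, 25, 50):
    print(n_left, boltzmann_entropy(N, n_left))
```

The 50/50 macrostate has overwhelmingly the most microstates and hence the highest entropy, which is why almost every "movie" of a gas runs toward it.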

fluidistic said:
Consider the case of 2 isolated iron spheres in the Universe, separated by a distance d. As they get closer and closer to each other, does the entropy of the system (the 2 spheres) increase? With the definition I have of entropy, heat is not involved, so the formula is not useful.
In this example the entropy does not increase, nor is there an arrow of time. The iron spheres might just as well start together with outward momentum, which would correspond to the whole movie played in reverse.

Entropy arguments only work for macroscopic systems, and even then they fail to a small extent. See the references given in
https://www.physicsforums.com/showthread.php?t=364225&page=2

And moreover, a bunch of particles released in the corner of an empty room will return there after a very, very long time. Movies may well run in reverse; it just happens that almost all of the possible movies increase entropy.

fluidistic said:
It's different from the case of a gas confined in a 1 m^3 cube, where we want to know whether the film is running in reverse. There is a small probability that, say, 10^24 particles are found in 10 cm^3 rather than spread through the 1 m^3 of the container. Hence, looking at the film, although I would almost always be right in telling its direction, I could still be wrong. (OK, I realize this won't happen within a very long time, even one much greater than the current age of the Universe, but I consider it an improbability, not an impossibility.)
This is correct. For any other (less extreme) process it is even worse, in the sense that the state-return time is much shorter.
 
  • #4
Ok thank you both. It's much clearer in my mind now.
 

Related to Exploring Alternative Definitions and Applications of Entropy

1. What is entropy and why is it important in science?

Entropy is a measure of disorder or randomness in a system. It is important in science because it helps us understand the direction of natural processes, such as chemical reactions and energy transfer. It also plays a crucial role in the second law of thermodynamics, which states that the total entropy of an isolated system never decreases over time.

2. Can entropy ever decrease?

Yes, it is possible for entropy to decrease in a localized subsystem, such as a living organism or a plant growing from a seed, because such a system exchanges energy and matter with its surroundings. The total entropy of the subsystem plus its surroundings, however, still increases.

3. How does entropy relate to information theory?

In information theory, entropy is a measure of the uncertainty or randomness in a system. It is used to quantify the amount of information contained in a message or signal. The more uncertain or random the message, the higher the entropy.
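As a small illustration (my own sketch, not part of the thread), the Shannon entropy of a message, in bits per symbol, can be computed from the symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(message):
    """H = sum over symbols of -p_i * log2(p_i), using observed frequencies."""
    counts = Counter(message)
    total = len(message)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 bits: a completely predictable message
print(shannon_entropy("abab"))  # 1.0 bit per symbol: two equally likely symbols
print(shannon_entropy("abcd"))  # 2.0 bits per symbol: four equally likely symbols
```

The more uncertain the next symbol, the higher the entropy, matching the description above.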

4. Is entropy the same as chaos?

No, entropy and chaos are not the same. Entropy measures the amount of disorder or randomness in a system, while chaos refers to a state of extreme unpredictability or instability. A system with high entropy can still exhibit ordered behavior, while a chaotic system can have low entropy.

5. How can entropy be applied in different fields of science?

Entropy has applications in various fields of science, including thermodynamics, information theory, biology, and chemistry. In thermodynamics, it is used to understand the direction of natural processes. In information theory, it helps measure the amount of information in a message. In biology, it is used to study the organization and behavior of living systems. In chemistry, it is important in understanding chemical reactions and the stability of molecular structures.
