What is the true nature of entropy?

  • Thread starter: mishrashubham
  • Tags: Entropy

Summary
Entropy is a thermodynamic property that quantifies the energy unavailable for useful work in processes like energy conversion. It is often misunderstood as merely a measure of disorder, but this interpretation can be misleading; entropy is better viewed through the lens of statistical mechanics, where it represents the number of ways to arrange a system. The second law of thermodynamics, which states that entropy tends to increase, is a fundamental principle that applies universally, regardless of human perception. While "useful work" refers to energy that can perform tasks, entropy itself is a mathematical tool used to analyze thermodynamic processes. Understanding entropy requires a grasp of its definitions and implications in both classical thermodynamics and statistical mechanics.
  • #31
Science has a lot of concepts in it. Some of them are nearer to our everyday experience than others, and you can sometimes fool yourself that you 'understand' something because it seems to relate directly to some everyday phenomenon or sensation. Not everything is like that, though.
So, for instance, we have a feel for what Mass is, or what a Force is, but Temperature is a bit harder to nail and Special Relativity is harder still.
In the end, the only way to appreciate these things as fully as possible is to use Maths. It's the only way to have a hope of avoiding pitfalls and misconceptions. Yes, it's a bit elitist, but so is professional sport and music; not everyone can or will get into it deep enough to be an 'expert'. That's life. You just can't always expect a two-word answer to what may seem to be a simple question.
 
  • #32
Hello, yuiop.

I am a bit worried about your piston example, perhaps you could clarify a few points?

What is the system in this case?

For the following I assume you mean the system to comprise the two gas chambers and the piston?

Now when the piston is moved sideways work is input to the system by whatever pushes the piston. Does this raise the temperature of either chamber and then what happens to this temperature?

When you release the piston you say 'work is done' Done on what? Does not the work done by the expanding gas in the high pressure chamber equal the work done compressing the gas in the low pressure one? Since they are both part of the system, what do you mean by work is done?

Since entropy is a state variable and you have taken the system round a complete cycle, returning it to its starting point, how does the total entropy differ at the end from the beginning?
 
  • #33
Entropy is the fact that, in a system, there tend to be many more states we call 'uninteresting' than states we call 'interesting'.
 
  • #34
mishrashubham said:
But doesn't that depend on where I want to put the division? Let us suppose I divide the box into 3 parts, left, right and middle. Then one of the four particles lying on the left half of your box may be situated in the middle of my box. If I go on increasing the number of divisions I will increase the number of ways to arrange the system thus reaching infinity.

The example was simplified; I assumed there were only two individual particle states ("on the left" or "on the right"). More generally, counting of microstates (http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)#Counting_of_microstates) is not so easy, although in certain situations you can make a good approximation.

There is an alternative formula for calculating entropy through heat and temperature (dS = dQ/T), which is equivalent to the number-of-ways method (S = k ln W). You can use either formula to calculate the entropy, and in situations where the approximation to the number of ways is good, the two formulas agree.
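The counting in the simplified two-state box above can be sketched in a few lines of Python. This is a minimal toy illustration, not from the thread: it assumes N distinguishable particles, each independently "on the left" or "on the right", so the number of microstates W for a macrostate with n_left particles on the left is the binomial coefficient C(N, n_left).

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n_left):
    """S = k * ln(W), where W = C(N, n_left) is the number of ways
    to place n_left of N distinguishable particles on the left."""
    W = comb(N, n_left)
    return k_B * log(W)

# All 4 particles on the left: only one arrangement, so S = k*ln(1) = 0.
print(boltzmann_entropy(4, 4))
# 2 on each side: W = C(4,2) = 6 arrangements -- the most probable
# split, and the one with the highest entropy, S = k*ln(6).
print(boltzmann_entropy(4, 2))
```

This also shows why the original poster's worry about finer divisions matters: W depends on how you define the individual particle states, which is exactly why counting microstates is "not so easy" in general.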
 
  • #35
rcgldr said:
It never helps when scientists decide to use the same term to describe different things.

Potential energy is normally calculable, but not measurable. To "measure" gravitational potential energy, you'd have to have a device that measured weight and distance as it moved an object from one height to another.

In physics, there is little difference between "measurement" and "calculating from a measurement". This already holds for simple measurements of distances longer than what you can measure with a yardstick.

The distance between the town halls of two cities is measurable, according to common conventions about the meaning of measurement, though one needs to do some calculation (or have the odometer do it for you). The same holds for the distance between the Earth and the Moon: you measure the round-trip time of a laser pulse and calculate the distance from it.
 
  • #36
Mueiz said:
Entropy is defined as the path integral of dQ/T.
This is the precise definition of entropy.
But there is something no less important than the definition if you want to know the meaning of a physical concept: its significance.
The significance of entropy is different in different branches of science:
Thermodynamics: entropy is a property of the system that increases in every process in an isolated system.
Statistical physics: entropy is a measure of disorder.
Mechanical engineering: entropy is a measure of energy not available for useful work.
Philosophy: entropy is an arrow of time.
Biology: entropy is a measure of ageing.
(In PF: entropy is a confusing concept about which many questions are asked and no satisfactory answer is found :smile:)

Haha too many uses for the same term

sophiecentaur said:
Science has a lot of concepts in it. Some of them are nearer to our everyday experience than others, and you can sometimes fool yourself that you 'understand' something because it seems to relate directly to some everyday phenomenon or sensation. Not everything is like that, though.
So, for instance, we have a feel for what Mass is, or what a Force is, but Temperature is a bit harder to nail and Special Relativity is harder still.
In the end, the only way to appreciate these things as fully as possible is to use Maths. It's the only way to have a hope of avoiding pitfalls and misconceptions. Yes, it's a bit elitist, but so is professional sport and music; not everyone can or will get into it deep enough to be an 'expert'. That's life. You just can't always expect a two-word answer to what may seem to be a simple question.

Ah, guess it'll have to wait then, until I study calculus and the rest of the math. One more question, when do you officially study entropy in school?
 
  • #37
mishrashubham said:
guess it'll have to wait then, until I study calculus and the rest of the math. One more question, when do you officially study entropy in school?

I don't think it is officially studied there (but it may depend on where you live).

Chapters 4.1 and 7.6 of my book
Classical and Quantum Mechanics via Lie algebras
http://lanl.arxiv.org/abs/0810.1019
might be already understandable to you if you ignore all details that are over your head.
Feel free to ask for clarification...
 
  • #38
A. Neumaier said:
I don't think it is officially studied there (but it may depend on where you live).

Chapters 4.1 and 7.6 of my book
Classical and Quantum Mechanics via Lie algebras
http://lanl.arxiv.org/abs/0810.1019
might be already understandable to you if you ignore all details that are over your head.
Feel free to ask for clarification...

Thanks... I'll check your book to see if I can understand it.
 
  • #39
Entropy is even harder than the Offside Rule!
 
  • #40
All right sorry for resurrecting this thread, but this time I am back with some calculus and thermodynamics in my head. So after lots of reading, I have started getting a feel of what entropy is.

So currently I understand entropy like this, "It is a measure of the degree to which a system of particles is away from its most probable state." Is this right?
 
  • #41
I understand entropy as "the number of possible microstates in a given macrostate" that don't **** it up. The classic example is an unbroken egg. Low entropy, because if you start rearranging the particulars that constitute the egg, you'll end up with something no longer described as an unbroken egg. But once it has been cracked into a bowl and whisked, it has high entropy, because the arrangement of the molecules throughout the bowl can be shuffled at random and it doesn't **** up its whisked egginess. My problem is this: in this definition, the terms of entropy seem to require a context, and some external value needs to be applied to what constitutes the limits of a given macrostate, and to what it means to "**** it up". Not the most technical language, but you get the idea. How can entropy be part of a fundamental law like the 2nd Law of Thermodynamics if entropy itself is relative in nature?
 
  • #42
mishrashubham said:
All right sorry for resurrecting this thread, but this time I am back with some calculus and thermodynamics in my head. So after lots of reading, I have started getting a feel of what entropy is.

So currently I understand entropy like this, "It is a measure of the degree to which a system of particles is away from its most probable state." Is this right?

No. What is most probable depends on the prior you are using, hence has no objective meaning.

Entropy is the expected number of decisions that would be needed to pin down a particular energy eigenstate, given the distribution of energy in the given state.
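The "expected number of decisions" reading above corresponds to the Gibbs/Shannon formula, H = -Σ p ln p. A minimal sketch (my illustration, not from the post) in bits, where the entropy is the expected number of optimal yes/no questions needed to pin down which state occurred:

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)): the expected number of
    optimal yes/no questions needed to identify the state, given the
    probability distribution over states."""
    return sum(-p * log2(p) for p in probs if p > 0)

# Four equally likely states: exactly 2 yes/no questions are needed.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A certain outcome needs no questions at all.
print(entropy_bits([1.0]))  # 0.0
```

Multiplying by Boltzmann's constant (and switching to natural logarithms) turns this information-theoretic quantity into the thermodynamic entropy, which is why the distribution of energy over eigenstates is what matters in the definition above.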
 
