What is Feynman saying here? (entropy)

  • #1
So I bought myself the Feynman lectures and was reading up on his discussion on Statistical Mechanics and Thermodynamics (in volume I). The following is regarding Thermodynamics.

So he defined entropy (at least the change in it from a to b) as delta S = int(dQ/T) along a reversible path from a to b. He then proved that all reversible paths from a to b give the same result (at least in the framework of gases). However:

"First, suppose that we do irreversible work on an object by friction, generating a heat Q on some object at temperature T. The entropy is increased by Q/T. The heat Q is equal to the work, and thus when we do a certain amount of work by friction against an object whose temperature is T, the entropy of the whole world increases by W/T."

This makes me wonder about two things:
1) I thought that for calculating delta S, you had to find a reversible path from start to end. Is it perhaps because of the following: to calculate the change in entropy for the floor alone, we can say heating it is in itself reversible (floors can cool down), so delta S_floor = Q_floor/T; identically, for the object, delta S_object = Q_object/T. This would give delta S = delta S_floor + delta S_object = Q/T. That does seem like a big jump to leave out (especially since it was his first example). But even if this is how it works, it leads me to question 2:
2) How do you know that it is well-defined then? For example, imagine a model for friction where pulling the object faster or slower results in a different Q (say, because pulling really slowly means continuously fighting the larger static friction, whereas pulling reasonably fast only means overcoming the normal, lower kinetic friction). Following the reasoning from 1, we get different entropy changes for the same change from a to b (a rough numerical sketch of what I mean is below).
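The mass, the friction coefficients, and the split into a "static-dominated" slow pull versus a "kinetic-dominated" fast pull in this sketch are all invented just to make the comparison concrete:

```python
# Hypothetical comparison: same displacement a -> b, two pulling speeds,
# but a friction model in which the effective friction force differs.
# All numbers are made up purely for illustration.

d = 2.0          # displacement from a to b, in metres
m = 5.0          # mass of the object, in kg
g = 9.81         # gravitational acceleration, m/s^2
T = 300.0        # temperature of floor and object, in kelvin

mu_static = 0.6  # "slow pull": continuously fighting something like static friction
mu_kinetic = 0.4 # "fast pull": ordinary kinetic friction

# Work done against friction = friction force x distance; all of it ends up as heat Q
Q_slow = mu_static * m * g * d
Q_fast = mu_kinetic * m * g * d

# Entropy increase of the world, following Feynman's W/T
print(f"slow pull: Q = {Q_slow:.1f} J, delta S = {Q_slow / T:.3f} J/K")
print(f"fast pull: Q = {Q_fast:.1f} J, delta S = {Q_fast / T:.3f} J/K")
```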

All replies welcome,
mr. vodka
 
  • #2
A few thoughts that I hope will be helpful:

1. "Reversible" has a very specific definition in thermodynamics. It doesn't mean merely that a system can be brought to its original state (as in your floor example). It means that no gradients exist and therefore that no entropy is produced. It means that any heat transfer is accomplished by an infinitesimal temperature difference. Reversibility is an idealization that isn't possible in real life, though it can be approached.

2. Entropy is a state variable, so the entropy change from state A to state B is independent of the process that connects them, reversible or not. Once you've calculated it from a well-defined reversible process, you can apply the same value to any irreversible process, like friction, that connects the same two states.

3. It makes a difference how fast one pulls against a frictional force. The faster one pulls, the more work one does. Thus, we should not be surprised to calculate different entropy increases.

Does this help answer your questions?
 
  • #3
Thank you Mapes.

a) Do you mean the way I calculated it in the first post is wrong? If it is, then how can you calculate the entropy change after sliding an object over a floor, generating heat Q at temperature T? How did Feynman get Q/T? What reversible path did he choose?

b) And I understand entropy should be a state variable, but is it obvious from the definition? Applied to this case: well, first I don't understand why pulling an object faster would require more work; I would think less work. (Take into account that the distance from a to b is fixed.) But anyway, the main thing is: going from a to b, you say we can get different entropy increases, yet on the other hand you say S is a state variable. Does this sentence not contradict itself?

EDIT: as you say, if there is no temperature gradient, it is reversible. Is this why Feynman said delta S = Q/T, because there was no temperature gradient? (as he said we held T constant) But... obviously friction is irreversible, what do we think of this?
 
  • #4
mr. vodka said:
a) Do you mean the way I calculated it in the first post is wrong? If it is, then how can you calculate the entropy change after sliding an object over a floor, generating heat Q at temperature T? How did Feynman get Q/T? What reversible path did he choose?

Feynman's reversible path was heat transfer into an object at constant temperature T. It is assumed that the object is large enough that the temperature change from the transfer of energy Q is minimal. It is also assumed that the friction does negligible mechanical work on the object, so that any energy transfer is predominantly in the form of heat. (But note that mechanical work was applied externally to cause movement in the first place.)
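As a quick numerical aside (not from Feynman, just a sanity check of the constant-temperature idealization): for an object of finite heat capacity C heated reversibly from T0 by an amount Q, the exact entropy change is C ln(Tf/T0), and this approaches Q/T0 as C grows. A minimal sketch, with made-up numbers:

```python
import math

# Heat Q flows reversibly into an object of heat capacity C initially at T0.
# The exact entropy change is C*ln(Tf/T0); Feynman's Q/T is the large-C limit.
# Numbers below are invented for illustration.

Q = 100.0      # heat generated by friction, in joules
T0 = 300.0     # initial temperature, in kelvin

for C in [10.0, 100.0, 1000.0, 1e5]:   # heat capacities in J/K
    Tf = T0 + Q / C
    dS_exact = C * math.log(Tf / T0)
    print(f"C = {C:>8.0f} J/K: exact dS = {dS_exact:.5f} J/K, Q/T0 = {Q/T0:.5f} J/K")
```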

mr. vodka said:
b) And I understand entropy should be a state variable, but is it obvious from the definition? Applied to this case: well, first I don't understand why pulling an object faster would require more work; I would think less work. (Take into account that the distance from a to b is fixed.) But anyway, the main thing is: going from a to b, you say we can get different entropy increases, yet on the other hand you say S is a state variable. Does this sentence not contradict itself?

No; if the object travels the same distance but you pull with more or less force, you're doing a different amount of work on the system, and thus the final entropy of the system is different. The physical endpoints a and b are the same, but the thermodynamic state B is different.
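In symbols (with F standing for the friction force actually being overcome along the path, an assumption of this little summary rather than anything Feynman writes):

$$Q = \int_a^b F\,\mathrm{d}x, \qquad U_B = U_A + Q, \qquad \Delta S = \frac{Q}{T},$$

so a different friction force F over the same distance from a to b delivers a different Q, and hence a different internal energy and entropy at B.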

mr. vodka said:
EDIT: as you say, if there is no temperature gradient, it is reversible. Is this why Feynman said delta S = Q/T, because there was no temperature gradient? (as he said we held T constant) But... obviously friction is irreversible, what do we think of this?

The irreversibility is captured in the conversion from work to heat. Once you start dealing with the heat Q, you can consider its transfer to the rest of the object to be reversible. In other words, all of the entropy increase occurs in a localized (and poorly defined) region at the surface of the object through friction. We avoid detailed calculations of what's going on in this region (e.g., material deforming, bonds breaking) by assuming that the energy originally took the form of reversible work (which doesn't carry entropy), was entirely converted to heat (which does carry entropy), and is now propagating through an object at constant temperature (which is reversible and so transfers entropy but does not produce it).
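Condensing that chain into one line of bookkeeping (this is just a restatement of the paragraph above, not an extra assumption):

$$\underbrace{W}_{\text{work, no entropy}} \;\longrightarrow\; \underbrace{Q = W}_{\text{heat at the surface}} \;\Longrightarrow\; \Delta S_{\text{world}} = \frac{Q}{T} = \frac{W}{T} > 0,$$

after which the heat propagates through the object at constant T, transferring this entropy without producing any more.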
 
  • #5
I see. Everything is clear now :) That was quick. (although I still think Feynman could've elaborated on his example)

Thank you very much!
 
  • #6
Feynman definitely moves very fast and loose in those lectures.

Glad to help!
 
  • #7
mr. vodka said:
"First, suppose that we do irreversible work on an object by friction, generating a heat Q on some object at temperature T. The entropy is increased by Q/T. The heat Q is equal to the work, and thus when we do a certain amount of work by friction against an object whose temperature is T, the entropy of the whole world increases by W/T."

There was a little misdirection here. He considers the transfer of energy from one object to another, but then talks about the entropy change "of the whole world".

First- friction is modeled as a 100% dissipative process. That means all work is converted into use*less* heat, regardless of how much work is done, or how long or how fast the process occurs.

So, since we have (deliberately) obscured the details of the transfer of energy, we can't calculate the change of entropy involved in that process for the two objects participating in the process. However, we invoke conservation of energy *for the whole world*, and can then talk about the change of entropy for the whole world- that work energy was completely and irretrievably lost to heat.
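A compact way to write that world-level bookkeeping, assuming both rubbing objects sit at the common temperature T and the heat splits between them as Q_1 and Q_2 (the split itself being the detail we deliberately ignore):

$$Q_1 + Q_2 = W \;\;(\text{conservation of energy}), \qquad \Delta S_{\text{world}} = \frac{Q_1}{T} + \frac{Q_2}{T} = \frac{W}{T},$$

independent of how the heat actually divides between the two objects.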
 
