What Does Entropy Really Mean in Scientific Terms?

  • Context: Graduate 
  • Thread starter: magicfountain

Discussion Overview

The discussion revolves around the meaning and implications of the equation dE = T*dS in the context of thermodynamics and entropy. Participants explore various interpretations of entropy, its definitions, and its relationship to temperature and energy exchange, with a focus on theoretical and conceptual aspects.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants suggest that dE = T*dS is a definition of temperature rather than a law, linking it to the principle of maximum entropy and Lagrange multipliers.
  • There is a question about how to assign a numerical value to entropy (S) and what the symbols in the equation represent.
  • One participant proposes that the change in entropy is defined as reversible heat flow divided by temperature, leading to the equation dQ = TdS.
  • Another participant describes a model involving two objects exchanging energy and how entropy can be viewed as a function of internal randomized energy, suggesting that heat flows toward the object with the larger entropy-per-energy coefficient.
  • Concerns are raised about the characterization of entropy as a measure of "chaos" or "disorder," with one participant arguing that it should be viewed as a measure of ignorance related to physical constraints.

Areas of Agreement / Disagreement

Participants express differing views on the definition and interpretation of entropy, with no consensus reached on its meaning or implications. Some participants agree on certain mathematical relationships, while others contest the characterization of entropy.

Contextual Notes

Participants highlight the need for context when discussing entropy and its definitions, indicating that understanding may depend on specific theoretical frameworks or interpretations.

magicfountain
dE=T*dS
I don't understand this equation. Does anybody care to show me why that holds?
 
The equation is not a law so much as a definition (of temperature). Give me some idea of your context as I could give you chapters of books worth of exposition on the subject.

You can start with the idea of Lagrange multipliers and the optimization of a quantity subject to constraints. Apply the principle of maximum entropy subject to the constraint that the average energy of the system is fixed at some value E, and the method of Lagrange multipliers yields this equation as a definition of temperature. The flow of heat from a higher-temperature body to a lower-temperature body increases entropy, which links the physical idea of temperature to the mathematical definition via Lagrange multipliers.

That is a starting point; to understand the equation fully you have to work through a lengthy exposition. How much understanding do you require?
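The maximum-entropy argument above can be sketched numerically. A minimal illustration, assuming a made-up three-level system and an arbitrary value of the Lagrange multiplier β (physically β = 1/kT): the Boltzmann distribution e^(−βE)/Z attains the largest entropy among all distributions with the same average energy.

```python
import numpy as np

# Toy system: three energy levels (arbitrary units, chosen for illustration).
E = np.array([0.0, 1.0, 2.0])
beta = 1.3                      # Lagrange multiplier; physically beta = 1/(kT)

# Boltzmann distribution: the maximum-entropy distribution at fixed mean energy.
p_boltz = np.exp(-beta * E)
p_boltz /= p_boltz.sum()
U = p_boltz @ E                 # the fixed average energy

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# Distributions on these 3 levels with mean energy U form a one-parameter
# family: choose p2 = t, then the constraints force p1 = U - 2t, p0 = 1 - U + t.
best_t, best_S = None, -np.inf
for t in np.linspace(0.0, U / 2, 2001):
    p = np.array([1 - U + t, U - 2 * t, t])
    if (p >= 0).all():
        S = entropy(p)
        if S > best_S:
            best_t, best_S = t, S

print(best_t, p_boltz[2])       # agree to grid resolution
print(best_S, entropy(p_boltz)) # Boltzmann entropy is the maximum
```

The grid search over the constraint surface finds its entropy maximum at the Boltzmann weights, which is the content of the Lagrange-multiplier construction.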
 
I thought this was the definition of entropy.
How do you assign a numerical value to S (or how do you define it), then?
 
Professor Jambaugh did ask for the context.

I would add a further question.

What do you understand the symbols in your equation to stand for and do you understand what they are?
 
magicfountain said:
dE=T*dS
I don't understand this equation. Does anybody care to show me why that holds?
It comes from the first law of thermodynamics and the definition of entropy.

The change in entropy, dS, is defined as the reversible heat flow, δQ, divided by the temperature, T. So δQ = T dS.

From the first law in differential form, δQ = dU + δW, where δW is the work done by the system. I am not sure what dE is supposed to represent. If dE refers to the incremental change in internal energy (i.e. dU), then dE = T dS applies if δW = 0.
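The defining relation dS = δQ_rev/T can be checked with a quick numerical example. A minimal sketch, assuming an object of constant heat capacity (the values below are illustrative, not from the thread): summing δQ/T over many small reversible heating steps reproduces the closed form ΔS = C ln(T2/T1).

```python
import numpy as np

# Reversibly heat an object of constant heat capacity C from T1 to T2.
C = 4186.0             # J/K, roughly 1 kg of water -- assumed for illustration
T1, T2 = 300.0, 350.0  # kelvin

# Sum dS = dQ_rev / T = C dT / T over many small reversible steps
# (trapezoidal sum over a fine temperature grid).
T = np.linspace(T1, T2, 100_001)
dS_numeric = np.sum(0.5 * (C / T[:-1] + C / T[1:]) * np.diff(T))

# Closed form: integrating C/T from T1 to T2 gives C * ln(T2/T1).
dS_exact = C * np.log(T2 / T1)

print(dS_numeric, dS_exact)   # both ≈ 645.3 J/K
```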

AM
 
OK, try understanding it this way. Firstly take two objects and let them touch. Randomized energy will be exchanged between them. A small change in the energy of one will effect an opposite change in the energy of the other but the total energy will not change.

dE1 + dE2 = 0

Now assume there is an entropy for each object which is a function of the amount of internal randomized energy. S = S(E) and that this entropy is a somewhat additive measure of how randomized a system is. With the small changes in energy there will be a small change in each entropy and we add them:
dS = dS1 + dS2 = B1 dE1 + B2 dE2
where B1 and B2 are the rate coefficients expressing B = dS/dE. These coefficients are again functions of the energy and can change in value as the objects change internal energy.

Substitute in the zero net total energy change and you get:
dS = (B1 - B2)dE1
Assume the 2nd law: the entropy will grow due to randomization until it reaches a maximum value, at which point (assuming entropy is a smooth function of energy) the two coefficients must be equal, B1 = B2.

Note that if B1 > B2, increasing the energy E1 increases entropy, and vice versa. So we see that heat flows toward the object with the bigger B value. A bit more analysis shows that the quantity T = 1/B, such that:
dE = (1/B) dS = T dS
is more useful, in that for most objects T will increase with increasing energy.
So we can restate the 2nd law as: heat between two objects flows toward the larger B, or the smaller T. In short, heat flows from hotter to colder objects.
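The two-object argument above can be simulated directly. A minimal sketch, assuming for each object the toy entropy function S(E) = C ln E (so B = dS/dE = C/E and T = E/C); the heat capacities and initial energies are made-up values. Energy is moved in small steps toward the larger-B object until B1 = B2, and the total entropy is checked to grow at every step:

```python
import numpy as np

# Toy entropy function S(E) = C * ln(E), so B(E) = dS/dE = C/E and T = E/C.
# Heat capacities and initial energies are illustrative assumptions.
C1, C2 = 2.0, 5.0          # "sizes" of the two objects
E1, E2 = 10.0, 10.0        # initial energies: T1 = 5, T2 = 2

def B(C, E):               # entropy gained per unit of energy added
    return C / E

dE = 1e-3                  # size of each small energy exchange
S_prev = C1 * np.log(E1) + C2 * np.log(E2)
while abs(B(C1, E1) - B(C2, E2)) > 1e-3:
    # energy flows toward the object with the larger B (= smaller T)
    step = dE if B(C1, E1) > B(C2, E2) else -dE
    E1 += step
    E2 -= step
    S = C1 * np.log(E1) + C2 * np.log(E2)
    assert S > S_prev      # 2nd law: total entropy grows at every step
    S_prev = S

T1, T2 = E1 / C1, E2 / C2
print(T1, T2)              # both ≈ 20/7 ≈ 2.857: temperatures equalize
```

Heat flows from the initially hotter object (T1 = 5) to the colder one (T2 = 2), the total energy stays fixed, and the loop stops exactly where B1 = B2, i.e. where the temperatures meet.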

[edit: Cut out some further less relevant ramblings.]
 
Thanks very much, jambaugh.
Although the idea of assuming
"there is an entropy for each object which is a function of the amount of internal randomized energy. S = S(E) and that this entropy is a somewhat additive measure of how randomized a system is." seems tough to swallow, your explanation was very helpful and clear, and it gave me a good notion of what entropy is beyond being a measure of "chaos".
 
I just posted a specific description of the meaning of entropy in another thread:
https://www.physicsforums.com/showthread.php?t=645089

I hate texts and professors who say entropy is a measure of "chaos" or of "disorder". It is not! It is a measure of ignorance.

But that sounds like it is observer-dependent. (And yet "disorder" IS in the eye of the beholder!) It isn't, because it is ignorance due to the degree to which the actual system is being physically constrained. The key virtue of science is that "knowledge" has operational meaning in the form of specific measurements made and constraints imposed. We don't "know" because we have a channel to the mystic realm. We know because we physically observe or have physically imposed constraints. Knowledge becomes a physical process. Thus entropy, as an index of relative knowledge (actually ignorance), has a perfectly good physical meaning.
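The "measure of ignorance" reading has a simple quantitative face in information theory. A minimal sketch (the die example here is my own, not from the thread): the Shannon entropy of a fair die quantifies the information we lack about its outcome, and imposing a constraint on the system reduces that ignorance.

```python
import math

def shannon_entropy(probs):
    """Information (in bits) we lack about the outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Unconstrained system: a fair die, six equally likely outcomes.
unconstrained = [1/6] * 6

# Constrained system: we have imposed or observed "the result is even",
# leaving three equally likely outcomes {2, 4, 6}.
constrained = [1/3] * 3

print(shannon_entropy(unconstrained))  # log2(6) ≈ 2.585 bits of ignorance
print(shannon_entropy(constrained))    # log2(3) ≈ 1.585 bits: less ignorance
```

Entropy here depends only on what is physically constrained or measured, not on any mystical observer, which is the point being made above.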
 
