What Does Entropy Really Mean in Scientific Terms?

  • Thread starter: magicfountain
SUMMARY

The discussion centers on the equation dE=T*dS, which defines temperature in thermodynamics through the lens of entropy. Participants explain that this equation emerges from the first law of thermodynamics and the definition of entropy, where dS represents the change in entropy as a function of reversible heat flow divided by temperature. The conversation highlights the importance of Lagrange multipliers in optimizing entropy under energy constraints and clarifies that entropy should be viewed as a measure of ignorance rather than disorder. This perspective emphasizes the operational meaning of knowledge in scientific contexts.

PREREQUISITES
  • Understanding of thermodynamics, specifically the first law of thermodynamics.
  • Familiarity with the concept of entropy and its mathematical representation.
  • Knowledge of Lagrange multipliers and their application in optimization problems.
  • Basic grasp of heat transfer principles and energy conservation.
NEXT STEPS
  • Study the application of Lagrange multipliers in thermodynamic systems.
  • Explore the relationship between entropy and information theory.
  • Learn about the implications of the second law of thermodynamics on energy flow.
  • Investigate the concept of entropy as a measure of ignorance in physical systems.
USEFUL FOR

Students of physics, thermodynamics researchers, and professionals in scientific fields seeking a deeper understanding of entropy and its implications in thermodynamic processes.

magicfountain
dE=T*dS
I don't understand this equation. Does anybody care to show me why that holds?
 
The equation is not a law so much as a definition (of temperature). Give me some idea of your context, as I could give you chapters' worth of exposition on the subject.

You can start with the idea of Lagrange multipliers and the optimization of a quantity subject to constraints. Apply the principle of maximum entropy subject to the constraint that the average energy of the system be fixed at some value E, and the method of Lagrange multipliers yields this equation as a definition of temperature. Heat flowing from a higher-temperature body to a lower-temperature body is behavior that increases entropy, which links the physical idea of temperature to the mathematical definition via Lagrange multipliers.

That is a starting point, and you have to go through a lengthy exposition to get to this equation... that is, if you want to understand it fully. How much understanding do you require?
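As a concrete illustration of the maximum-entropy route jambaugh describes, here is a minimal numerical sketch (not from the thread; the energy levels and target average energy are made up). It maximizes the entropy S = -Σ p ln p of a discrete distribution subject to a fixed average energy; the Lagrange multiplier β enforcing the energy constraint turns out to be the inverse temperature, T = 1/β.

```python
import numpy as np

# Hypothetical discrete energy levels and a target average energy.
# Maximizing S = -sum(p ln p) subject to sum(p) = 1 and
# sum(p * E) = E_avg gives the Gibbs form p_i ~ exp(-beta * E_i),
# where beta is the Lagrange multiplier of the energy constraint
# and T = 1/beta is the temperature it defines.
levels = np.array([0.0, 1.0, 2.0, 3.0])   # energy levels (arbitrary units)
E_avg = 1.2                               # constrained average energy

def average_energy(beta):
    """Average energy of the Gibbs distribution at a given beta."""
    w = np.exp(-beta * levels)
    p = w / w.sum()
    return p @ levels

# Bisection on beta: the average energy decreases monotonically in beta.
lo, hi = -50.0, 50.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if average_energy(mid) > E_avg:
        lo = mid    # energy still too high -> need a larger beta
    else:
        hi = mid
beta = 0.5 * (lo + hi)

w = np.exp(-beta * levels)
p = w / w.sum()
S = -np.sum(p * np.log(p))
print(f"beta = {beta:.4f}  ->  T = 1/beta = {1/beta:.4f}")
print(f"maximized entropy under the energy constraint: S = {S:.4f}")
```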
 
I thought this was the definition of entropy.
How do you assign a numerical value to S (or how do you define it), then?
 
Professor Jambaugh did ask for the context.

I would add a further question.

What do you understand the symbols in your equation to stand for, and do you know what physical quantities they represent?
 
magicfountain said:
dE=T*dS
I don't understand this equation. Does anybody care to show me why that holds?
It comes from the first law of thermodynamics and the definition of entropy.

The change in entropy, dS, is defined as the reversible heat flow, dQ, divided by the temperature, T: dS = dQ/T, so dQ = T dS.

From the first law, Q = ΔU + W, where W is the work done by the system. I am not sure what dE is supposed to represent. If dE refers to the incremental change in internal energy (i.e., dU), then dE = T dS applies if δW = 0.

AM
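To see these relations hold in one concrete case, here is a quick numerical check (my own illustration, with made-up values): a reversible isothermal expansion of an ideal gas, where ΔU = 0, so the heat Q = T ΔS absorbed by the gas exactly equals the work W it does.

```python
import numpy as np

# Reversible isothermal expansion of an ideal gas: a concrete check
# of dQ_rev = T dS together with the first law Q = ΔU + W.
# All numbers below are made up for the example.
R = 8.314          # J/(mol K), gas constant
n = 1.0            # mol of gas
T = 300.0          # K, held constant by a reservoir
V1, V2 = 1.0, 2.0  # m^3, initial and final volumes

dS = n * R * np.log(V2 / V1)      # entropy change of the gas
Q = T * dS                        # reversible heat absorbed, Q = T ΔS
W = n * R * T * np.log(V2 / V1)   # work done by the expanding gas
dU = 0.0                          # ideal-gas U depends only on T, so ΔU = 0

print(f"ΔS = {dS:.3f} J/K, Q = {Q:.1f} J, W = {W:.1f} J")
print(f"first law check, Q - (ΔU + W) = {Q - (dU + W):.2e} J")
```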
 
OK, try understanding it this way. First, take two objects and let them touch. Randomized energy will be exchanged between them. A small change in the energy of one will effect an opposite change in the energy of the other, but the total energy will not change.

dE1 + dE2 = 0

Now assume there is an entropy for each object which is a function of the amount of internal randomized energy, S = S(E), and that this entropy is a somewhat additive measure of how randomized a system is. With the small changes in energy there will be a small change in each entropy, and we add them:
dS = dS1 + dS2 = B1 dE1 + B2 dE2
where B1 and B2 are the rate coefficients expressing B = dS/dE. These coefficients are again functions of the energy and can change in value as the objects' internal energies change.

Substitute in the zero net total energy change (dE2 = -dE1) and you get:
dS = (B1 - B2) dE1
Assume the second law: the entropy will grow due to randomization until it reaches a maximum value. At that point, assuming entropy is a smooth function of energy, the two coefficients must be equal: B1 = B2.

Note that if B1 > B2, increasing the energy E1 increases the total entropy, and vice versa. So we see that heat flows toward the object with the bigger B value. A bit more analysis shows that the quantity T = 1/B, such that
dE = (1/B) dS = T dS,
is more useful, in that for most objects T will increase with increasing energy.
So we can restate the second law as: heat between two objects flows toward the larger B, or the smaller T. In short, heat flows from hotter to colder objects.

[edit: Cut out some further less relevant ramblings.]
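Here is a toy simulation of that two-object picture (my own sketch, not from the thread). It assumes an ideal-gas-like entropy S(E) = C ln E, so B = dS/dE = C/E and T = 1/B = E/C; the constants and starting energies are invented. Moving energy toward the object with the larger B always raises the total entropy, and the exchange stops when B1 = B2, i.e. when the temperatures are equal.

```python
# Two touching objects with an assumed ideal-gas-like entropy
# S(E) = C * ln(E), so B = dS/dE = C/E and T = 1/B = E/C.
C1, C2 = 3.0, 5.0    # heat-capacity-like constants (made up)
E1, E2 = 10.0, 2.0   # initial internal energies; the total is conserved
dE = 0.001           # size of each small exchanged energy parcel

def B(C, E):
    """dS/dE for S(E) = C ln E -- the inverse temperature 1/T."""
    return C / E

for step in range(1_000_000):
    if abs(B(C1, E1) - B(C2, E2)) < 1e-3:
        break  # B1 = B2: entropy is maximized, temperatures match
    # Move a parcel toward the object with the larger B (smaller T);
    # dS = (B_receiver - B_giver) * dE > 0, so total entropy grows.
    if B(C1, E1) > B(C2, E2):
        E1, E2 = E1 + dE, E2 - dE
    else:
        E1, E2 = E1 - dE, E2 + dE

print(f"equilibrium after {step} steps:")
print(f"T1 = {E1 / C1:.4f}, T2 = {E2 / C2:.4f}, total E = {E1 + E2:.4f}")
```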
 
Thanks very much, jambaugh.
Although the idea of assuming
"there is an entropy for each object which is a function of the amount of internal randomized energy, S = S(E), and that this entropy is a somewhat additive measure of how randomized a system is" seems tough to swallow, your explanation was very helpful and clear and gave me a good notion of what entropy is beyond being a measure of "chaos".
 
I just posted a specific description of the meaning of entropy in another thread:
https://www.physicsforums.com/showthread.php?t=645089

I hate texts and professors who say entropy is a measure of "chaos" or of "disorder". It is not! It is a measure of ignorance.

But that sounds like it is observer dependent (and yet "disorder" IS in the eye of the beholder!). It isn't, because the ignorance is determined by the degree to which the actual system is being physically constrained. The key virtue of science is that "knowledge" has operational meaning in the form of specific measurements made and constraints imposed. We don't "know" because we have a channel to the mystic realm; we know because we physically observe or have physically imposed constraints. Knowledge becomes a physical process. Thus entropy, as an index of relative knowledge (actually, ignorance), has a perfectly good physical meaning.
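A minimal way to see "entropy as ignorance" quantitatively (my own illustration, using Shannon's formula, to which statistical entropy reduces up to Boltzmann's constant): start from complete ignorance about a die, then impose a constraint, i.e. learn something by measurement, and watch the entropy drop.

```python
import numpy as np

def shannon_entropy(p):
    """S = -sum(p ln p), skipping zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Complete ignorance: all six faces of a die are equally likely.
p_ignorant = np.full(6, 1 / 6)

# A physical constraint / measurement ("the face is even") reduces
# our ignorance, and with it the entropy: only faces 2, 4, 6 remain.
p_constrained = np.array([0, 1/3, 0, 1/3, 0, 1/3])

print(f"unconstrained: S = {shannon_entropy(p_ignorant):.4f}  (= ln 6)")
print(f"constrained:   S = {shannon_entropy(p_constrained):.4f}  (= ln 3)")
```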
 
