Entropy as a measure of ignorance

Discussion Overview

The discussion revolves around the concept of entropy, particularly its interpretation as a measure of ignorance about a system. Participants explore various definitions, implications, and the relationship between entropy and the observer's perspective, touching on theoretical and conceptual aspects of thermodynamics and statistical mechanics.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants question the accuracy of describing entropy as a measure of ignorance about a system.
  • Others reference the work of Harvey S. Leff, discussing the language of entropy and the metaphor of "spreading" as a potentially clearer descriptor.
  • One participant mentions that the statement "entropy depends on the observer" is often cited, linking it to the bouncing universe theory.
  • Another participant explains Boltzmann's definition of entropy, emphasizing its subjective nature based on the observer's knowledge of the system's phase space.
  • Concerns are raised about the circularity of definitions involving temperature and entropy, particularly in the context of Clausius's definition.
  • Some participants discuss the relationship between entropy and energy, referencing Clausius's original definition and its implications in statistical mechanics.
  • There is a debate about whether all microstates consistent with a macrostate are equally likely, and how this affects the definition of entropy.
  • One participant agrees that nonmechanical thermodynamic variables are combined with mechanical considerations from statistical mechanics based on thermal equilibrium.

Areas of Agreement / Disagreement

Participants express differing views on whether entropy depends on the observer, with no clear consensus reached. The discussion includes multiple competing interpretations and definitions of entropy.

Contextual Notes

Participants highlight the complexity of defining entropy, noting variations in definitions and the potential circularity in some explanations. The discussion also reflects uncertainty regarding the likelihood of microstates associated with macrostates.

Monsterboy
http://qr.ae/TUpoRU

"Entropy is a measure of our ignorance about a system " Is that accurate ?
 
"Entropy, Its Language, and Interpretation" by Harvey S. Leff (https://www.researchgate.net/publication/227218780_Entropy_Its_Language_and_Interpretation)

Abstract
The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially during processes and temporal spreading over accessible microstates in thermodynamic equilibrium. Various examples illustrate the value of the spreading metaphor. To provide further support for this metaphor’s utility, it is shown how a set of reasonable spreading properties can be used to derive the entropy function. A main conclusion is that it is appropriate to view entropy’s symbol S as shorthand for spreading.

"Entropy Is Simple — If We Avoid The Briar Patches!" by Frank L. Lambert (http://entropysimple.oxy.edu/content.htm#increase)

"The second law of thermodynamics says that energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so. Entropy is the quantitative measure of that kind of spontaneous process: how much energy has flowed from being localized to becoming more widely spread out (at a specific temperature)."
 
I have often heard people say "entropy depends on the observer." It is one of the reasons why the bouncing universe theory cannot be completely ruled out. I remember discussing this with the late marcus, but I am unable to find the thread. Is the statement inside the quote accurate?
 
There are several slightly different definitions of entropy. But one definition is Boltzmann's:

A system of many particles in classical physics is completely described by its location in "phase space". If you give the position and momentum of every single particle, then that gives the phase space location. That's a point in ##6N##-dimensional space if there are N particles, because you have to specify:

  1. ##x_1, y_1, z_1, p_{x1}, p_{y1}, p_{z1}##
  2. ##x_2, y_2, z_2, p_{x2}, p_{y2}, p_{z2}##
  3. etc.
where ##x_j, y_j, z_j, p_{xj}, p_{yj}, p_{zj}## are the components of the position and momentum of particle number j.

Now, if you don't know precisely what all 6N values are giving the system's location in phase space, you can quantify your ignorance by giving a "volume" in phase space, meaning that the system has a location somewhere in that volume. Boltzmann defined the entropy of a system as ##S = k \log W##, where ##k## is Boltzmann's constant, ##W## is the volume of the system in phase space, and ##\log## means the natural log. The bigger that number, the more uncertain you are about the precise location of the system in phase space.

This notion of entropy is subjective, because different people might have different amounts of information about the system, and might use a different volume in phase space.
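To make the phase-space-volume picture concrete, here is a minimal Python sketch. The numbers, and the convention of measuring ##W## in dimensionless units of some reference cell volume, are illustrative assumptions, not something from the thread:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(W):
    """Boltzmann entropy S = k log W, with W the phase-space volume
    occupied by the system (made dimensionless by a reference cell size)."""
    return k_B * math.log(W)

# Coarse knowledge: the system could be anywhere in 10^20 cells.
S_coarse = boltzmann_entropy(1e20)
# Finer measurements narrow it down to 10^10 cells.
S_fine = boltzmann_entropy(1e10)

# More information -> smaller phase-space volume -> lower entropy.
assert S_fine < S_coarse
```

Coarser knowledge means a larger ##W## and hence a larger ##S##; sharpening your information shrinks the volume and the entropy, which is exactly the "measure of ignorance" reading.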
 
Monsterboy said:
I have often heard people say "entropy depends on the observer."

Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫Cp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
 
Lord Jestocost said:
Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫Cp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)

Well, that definition of entropy is a little circular, because ##T## is in turn defined via ##1/T = \frac{\partial S}{\partial U}|_{V}##.
 
Entropy is linked to energy through its original definition by Clausius, ##dS = \delta Q_{rev}/T##, where ##\delta Q_{rev}## is a very small amount of heat exchanged reversibly.
 
Lord Jestocost said:
Entropy is linked to energy through its original definition by Clausius, ##dS = \delta Q_{rev}/T##, where ##\delta Q_{rev}## is a very small amount of heat exchanged reversibly.

The question is: how is ##T## defined?

In statistical mechanics, entropy is the primary quantity, and temperature is defined in terms of how entropy changes when you add a small amount of energy.
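As a concrete illustration of temperature being derived from entropy, here is a hedged Python sketch for a toy model of my own choosing, not one discussed in the thread: ##N## independent two-level particles, where ##\Omega(E) = \binom{N}{n}## counts microstates at energy ##E = n\epsilon##, ##S = k \ln \Omega##, and ##1/T = \partial S/\partial E## is estimated by a finite difference (units with ##k = 1## are assumed for simplicity):

```python
import math

N = 10_000   # number of two-level particles (illustrative)
eps = 1.0    # energy of the excited level, arbitrary units
# Work in units where Boltzmann's constant k = 1.

def entropy(n):
    """Microcanonical entropy S(E) = ln Omega(E) for E = n*eps,
    where Omega = C(N, n) counts microstates with n particles excited.
    lgamma(m + 1) = ln(m!) keeps the numbers manageable."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

def temperature(n):
    """1/T = dS/dE, estimated with a finite difference of step eps."""
    return eps / (entropy(n + 1) - entropy(n))

n = 1000
# Below half filling, adding energy increases entropy, so T > 0 ...
assert temperature(n) > 0
# ... and it matches the analytic 1/T = (1/eps) ln((N - n)/n) closely.
T_exact = eps / math.log((N - n) / n)
assert abs(temperature(n) - T_exact) / T_exact < 0.01
```

Note that entropy is the primary object here: temperature only appears as the rate at which ##S## responds to added energy, with no circularity.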
 
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle think of any reversible process to define the entropy in state 1:

S1 = S0 + ∫δQrev/T (integration from 0 to 1)

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).
 
Lord Jestocost said:
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle think of any reversible process to define the entropy in state 1:

S1 = S0 + ∫δQrev/T (integration from 0 to 1)

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).

Point taken, but there's another issue even after you've chosen the macroscopic variables. Given macroscopic variables ##E, V, N## (total energy, volume and number of particles), there are many (infinitely many in the classical case, and astronomically many in the quantum case) microstates consistent with that macrostate. But are they all equally likely? If not, what's the probability distribution?

You can just define "equilibrium" so that equal-likelihood is part of the definition, I suppose. Then your claims about entropy are objectively true for a system in equilibrium.
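The equal-likelihood point can be illustrated with the Gibbs entropy ##S = -k \sum_i p_i \ln p_i##: among all probability distributions over the ##W## microstates consistent with the macrostate, the uniform one maximizes ##S## and recovers Boltzmann's ##k \ln W##. A small Python sketch (units with ##k = 1##, and the value of ##W##, are illustrative assumptions):

```python
import math
import random

def gibbs_entropy(p):
    """Gibbs entropy S = -sum p_i ln p_i, in units where k = 1."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

W = 64  # number of microstates consistent with the macrostate

# Uniform distribution over the W microstates: S = ln W, Boltzmann's value.
uniform = [1.0 / W] * W
S_uniform = gibbs_entropy(uniform)
assert abs(S_uniform - math.log(W)) < 1e-12

# Any non-uniform distribution over the same microstates has lower entropy.
random.seed(0)
weights = [random.random() for _ in range(W)]
total = sum(weights)
biased = [w / total for w in weights]
assert gibbs_entropy(biased) < S_uniform
```

So defining equilibrium as the equal-likelihood (maximum-entropy) distribution makes the Boltzmann and Gibbs expressions agree for that case.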
 
I agree. Nonmechanical thermodynamic variables such as temperature and entropy are combined with "mechanical" considerations from statistical mechanics on the basis of the concept of thermal equilibrium.
 
