Entropy as a measure of ignorance

SUMMARY

Entropy is defined as a measure of ignorance about a system, particularly in the context of thermodynamics and statistical mechanics. The discussion references Boltzmann's entropy formula, S = k log(W), where W represents the volume in phase space, and emphasizes the subjective nature of entropy based on the observer's knowledge. Key insights include the argument for using "spreading" as a descriptor for entropy, which encompasses spatial, temporal, and energetic dimensions. The conversation also highlights the relationship between entropy and energy dispersion, as articulated by Frank L. Lambert.

PREREQUISITES
  • Understanding of Boltzmann's entropy formula (S = k log(W))
  • Familiarity with phase space concepts in statistical mechanics
  • Knowledge of the second law of thermodynamics
  • Basic grasp of macroscopic observables (temperature, pressure, volume)
NEXT STEPS
  • Research the implications of Boltzmann's constant in thermodynamic systems
  • Explore the concept of phase space in greater depth
  • Study the relationship between entropy and energy dispersion in various states
  • Investigate the role of equilibrium in defining entropy and its implications
USEFUL FOR

Physicists, chemists, and students of thermodynamics and statistical mechanics seeking to deepen their understanding of entropy and its implications in energy systems.

Monsterboy
http://qr.ae/TUpoRU

"Entropy is a measure of our ignorance about a system " Is that accurate ?
 
"Entropy, Its Language, and Interpretation" by Harvey S. Leff (https://www.researchgate.net/publication/227218780_Entropy_Its_Language_and_Interpretation)

Abstract
The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially during processes and temporal spreading over accessible microstates in thermodynamic equilibrium. Various examples illustrate the value of the spreading metaphor. To provide further support for this metaphor’s utility, it is shown how a set of reasonable spreading properties can be used to derive the entropy function. A main conclusion is that it is appropriate to view entropy’s symbol S as shorthand for spreading.

"Entropy Is Simple — If We Avoid The Briar Patches!" by Frank L. Lambert (http://entropysimple.oxy.edu/content.htm#increase)

"The second law of thermodynamics says that energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so. Entropy is the quantitative measure of that kind of spontaneous process: how much energy has flowed from being localized to becoming more widely spread out (at a specific temperature)."
 
I have often heard people say "entropy depends on the observer." It is one of the reasons why the bouncing-universe theory cannot be completely ruled out. I remember discussing this with the late marcus, but I am unable to find the thread. Is the statement inside the quote accurate?
 
There are several slightly different definitions of entropy. But one definition is Boltzmann's:

A system of many particles in classical physics is completely described by giving its location in "phase space". If you give the position and momentum of every single particle, then that gives the phase-space location. That's a point in 6N-dimensional space if there are N particles, because you have to specify:

  1. ##x_1, y_1, z_1, p_{x1}, p_{y1}, p_{z1}##
  2. ##x_2, y_2, z_2, p_{x2}, p_{y2}, p_{z2}##
  3. etc.
where ##x_j, y_j, z_j, p_{xj}, p_{yj}, p_{zj}## are the components of the position and momentum of particle number j.

Now, if you don't know precisely all 6N values that give the system's location in phase space, you can quantify your ignorance by giving a "volume" in phase space, meaning that the system is located somewhere in that volume. Boltzmann defined the entropy of the system as ##S = k \log(W)##, where ##k## is Boltzmann's constant, ##W## is the volume of the phase-space region the system is known to occupy, and ##\log## is the natural logarithm. The bigger that number, the more uncertain you are about the precise location of the system in phase space.

This notion of entropy is subjective, because different people might have different amounts of information about the system, and might use a different volume in phase space.
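
As a rough numerical illustration of this observer dependence (my own sketch, not from the thread, with made-up phase-space volumes): the observer who knows less about the system assigns a larger compatible volume ##W## and therefore a larger Boltzmann entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k * ln(W), where W is the phase-space volume consistent
    with what the observer knows about the system (arbitrary units)."""
    return k_B * math.log(W)

# Hypothetical numbers: observer B has cruder measurements than observer A,
# so the phase-space region compatible with B's data is much larger.
W_A = 1e50
W_B = 1e60

print(f"S_A = {boltzmann_entropy(W_A):.3e} J/K")
print(f"S_B = {boltzmann_entropy(W_B):.3e} J/K")  # larger: B is more ignorant
```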
 
Monsterboy said:
I have often heard people say "entropy depends on the observer."

Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫Cp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
 
Lord Jestocost said:
Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫Cp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)

Well, that definition of entropy is a little circular, because ##T## is in turn defined via ##1/T = \frac{\partial S}{\partial U}|_{V}##.
 
Entropy is linked to energy through its original definition by Clausius, ##dS = \delta Q_{\mathrm{rev}}/T##, where ##\delta Q_{\mathrm{rev}}## denotes a very small amount of heat transferred reversibly.
 
Lord Jestocost said:
Entropy is linked to energy through its original definition by Clausius, ##dS = \delta Q_{\mathrm{rev}}/T##, where ##\delta Q_{\mathrm{rev}}## denotes a very small amount of heat transferred reversibly.

The question is: how is ##T## defined?

In statistical mechanics, entropy is the primary quantity, and temperature is defined in terms of how entropy changes when you add a small amount of energy.
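
As a concrete illustration of that ordering (entropy first, temperature derived), one can take the standard textbook Sackur-Tetrode entropy of a monatomic ideal gas (not something stated in the thread) and differentiate with respect to the energy:

$$S(E,V,N) = Nk\left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right) + \frac{5}{2}\right],
\qquad
\frac{1}{T} \equiv \left.\frac{\partial S}{\partial E}\right|_{V,N} = \frac{3Nk}{2E}
\;\Rightarrow\; E = \tfrac{3}{2}NkT.$$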
 
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process connecting them to define the entropy of state 1:

##S_1 = S_0 + \int_0^1 \delta Q_{\mathrm{rev}}/T##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).
 
Lord Jestocost said:
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process connecting them to define the entropy of state 1:

##S_1 = S_0 + \int_0^1 \delta Q_{\mathrm{rev}}/T##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).

Point taken, but there's another issue even after you've chosen the macroscopic variables. Given macroscopic variables ##E, V, N## (total energy, volume and number of particles), there are many (infinitely many in the classical case, and astronomically many in the quantum case) microstates consistent with that macrostate. But are they all equally likely? If not, what's the probability distribution?

You can just define "equilibrium" so that equal-likelihood is part of the definition, I suppose. Then your claims about entropy are objectively true for a system in equilibrium.
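
A toy illustration of that point (my own sketch, not from the thread): for N two-level spins, count the microstates ##\Omega## compatible with a given macrostate and form ##S = k \ln \Omega##, taking all compatible microstates as equally likely. An "observer" who only knows the number of up-spins to within a window has more compatible microstates and so assigns a larger entropy.

```python
from math import comb, log

k_B = 1.380649e-23  # J/K

def entropy_from_count(omega):
    """Boltzmann entropy S = k * ln(Omega), assuming all Omega
    compatible microstates are taken to be equally likely."""
    return k_B * log(omega)

N = 100  # number of two-level spins (toy system)

# Observer A knows that exactly 40 spins are up.
omega_A = comb(N, 40)

# Observer B only knows that the number of up-spins lies between 35 and 45.
omega_B = sum(comb(N, n_up) for n_up in range(35, 46))

print(f"Omega_A = {omega_A:.3e},  S_A = {entropy_from_count(omega_A):.3e} J/K")
print(f"Omega_B = {omega_B:.3e},  S_B = {entropy_from_count(omega_B):.3e} J/K")
```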
 
I agree. Non-mechanical thermodynamic variables such as temperature and entropy are combined with "mechanical" considerations from statistical mechanics on the basis of the concept of thermal equilibrium.
 
