Entropy as a measure of ignorance

  • Thread starter Monsterboy
  • #1
http://qr.ae/TUpoRU

"Entropy is a measure of our ignorance about a system " Is that accurate ?
 

Answers and Replies

  • #2
Lord Jestocost
Gold Member
"Entropy, Its Language, and Interpretation" by Harvey S. Leff (https://www.researchgate.net/publication/227218780_Entropy_Its_Language_and_Interpretation)

Abstract
The language of entropy is examined for consistency with its mathematics and physics, and for its efficacy as a guide to what entropy means. Do common descriptors such as disorder, missing information, and multiplicity help or hinder understanding? Can the language of entropy be helpful in cases where entropy is not well defined? We argue in favor of the descriptor spreading, which entails space, time, and energy in a fundamental way. This includes spreading of energy spatially during processes and temporal spreading over accessible microstates in thermodynamic equilibrium. Various examples illustrate the value of the spreading metaphor. To provide further support for this metaphor’s utility, it is shown how a set of reasonable spreading properties can be used to derive the entropy function. A main conclusion is that it is appropriate to view entropy’s symbol S as shorthand for spreading.

"Entropy Is Simple — If We Avoid The Briar Patches!" by Frank L. Lambert (http://entropysimple.oxy.edu/content.htm#increase)

"The second law of thermodynamics says that energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so. Entropy is the quantitative measure of that kind of spontaneous process: how much energy has flowed from being localized to becoming more widely spread out (at a specific temperature)."
 
  • #3
Monsterboy
I have often heard people say "entropy depends on the observer." It is one of the reasons why the bouncing-universe theory cannot be completely ruled out. I remember discussing this with the late marcus, but I am unable to find the thread. Is the statement inside the quote accurate?
 
  • #4
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
There are several slightly different definitions of entropy. But one definition is Boltzmann's:

A system of many particles in classical physics is completely described by its location in "phase space". If you give the position and momentum of every single particle, then that specifies the phase-space location. That's a point in 6N-dimensional space if there are N particles, because you have to specify:

  1. ##x_1, y_1, z_1, p_{x1}, p_{y1}, p_{z1}##
  2. ##x_2, y_2, z_2, p_{x2}, p_{y2}, p_{z2}##
  3. etc.
where ##x_j, y_j, z_j, p_{xj}, p_{yj}, p_{zj}## are the components of the position and momentum of particle number j.

Now, if you don't know precisely all 6N values giving the system's location in phase space, you can quantify your ignorance by giving a "volume" in phase space, meaning that the system is located somewhere in that volume. Boltzmann defined the entropy of a system as ##S = k \log W##, where ##k## is Boltzmann's constant, ##W## is the volume of the system in phase space, and ##\log## means the natural log. The bigger that number, the more uncertain you are about the precise location of the system in phase space.

This notion of entropy is subjective, because different people might have different amounts of information about the system, and might use a different volume in phase space.
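A minimal sketch of that point, assuming nothing beyond ##S = k \log W## itself (the volumes and the reference cell below are made up for illustration):

```python
# Boltzmann's S = k ln W for two hypothetical phase-space volumes, measured
# in units of some fixed reference cell.  A larger volume, i.e. more
# ignorance about where the system sits in phase space, means a larger entropy.
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k ln W for a phase-space volume W (in reference-cell units)."""
    return k_B * math.log(W)

W_precise = 1e10   # one observer has pinned the state down to a small region
W_vague   = 1e30   # a different observer knows much less

print(boltzmann_entropy(W_precise))   # ~3.2e-22 J/K
print(boltzmann_entropy(W_vague))     # ~9.5e-22 J/K, larger
```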
 
  • #5
Lord Jestocost
Gold Member
I have often heard people say "entropy depends on the observer."
Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T ofCp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
 
  • #6
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
Why should entropy depend on the observer?

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T ofCp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
Well, that definition of entropy is a little circular, because ##T## is in turn defined via ##1/T = \frac{\partial S}{\partial U}|_{V}##.
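As a concrete illustration of that ordering (not part of the original reply), assuming the standard Sackur–Tetrode result for a monatomic ideal gas: with ##S(U,V,N) = Nk\left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m U}{3Nh^2}\right)^{3/2}\right) + \frac{5}{2}\right]##, the definition gives ##\frac{1}{T} = \left.\frac{\partial S}{\partial U}\right|_{V,N} = \frac{3Nk}{2U}##, i.e. ##U = \frac{3}{2}NkT##; temperature is the derived quantity once ##S(U,V,N)## is known.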
 
  • #7
Lord Jestocost
Gold Member
Entropy is linked to energy through its original definition by Clausius, ##dS = dQ/T##, where "d" connotes a very small change.
 
  • #8
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
Entropy is linked to energy through its original definition by Clausius, ##dS = dQ/T##, where "d" connotes a very small change.
The question is: how is ##T## defined?

In statistical mechanics, entropy is the primary quantity, and temperature is defined in terms of how entropy changes when you add a small amount of energy.
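A small numerical sketch of that statement, assuming a toy model of ##N## two-level spins (all numbers below are made up):

```python
# Toy model: N two-level spins with level spacing eps; a macrostate with m
# excited spins has energy U = m*eps and Omega(m) = C(N, m) microstates.
# Temperature then follows from 1/T = dS/dU, estimated by a finite difference.
import math

k_B = 1.380649e-23   # J/K
eps = 1.0e-21        # hypothetical level spacing, J
N = 10_000           # number of spins

def S(m):
    """Boltzmann entropy k ln C(N, m) for m excited spins."""
    return k_B * (math.lgamma(N + 1) - math.lgamma(m + 1) - math.lgamma(N - m + 1))

m = 2_000                               # a macrostate well below half filling
one_over_T = (S(m + 1) - S(m)) / eps    # dS/dU by finite difference
print(f"T ~ {1.0 / one_over_T:.0f} K")  # roughly 52 K for these made-up numbers
```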
 
  • #9
Lord Jestocost
Gold Member
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process connecting them to define the entropy of state 1:

##S_1 = S_0 + \int_0^1 \frac{\delta Q_{rev}}{T}##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).
 
  • #10
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
The question is: Does entropy depend on the observer?

When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process connecting them to define the entropy of state 1:

##S_1 = S_0 + \int_0^1 \frac{\delta Q_{rev}}{T}##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).
Point taken, but there's another issue even after you've chosen the macroscopic variables. Given macroscopic variables ##E, V, N## (total energy, volume and number of particles), there are many (infinitely many in the classical case, and astronomically many in the quantum case) microstates consistent with that macrostate. But are they all equally likely? If not, what's the probability distribution?

You can just define "equilibrium" so that equal-likelihood is part of the definition, I suppose. Then your claims about entropy are objectively true for a system in equilibrium.
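A small sketch of how that convention links the two definitions, assuming the Gibbs/Shannon form ##S = -k\sum_i p_i \ln p_i## over the ##W## microstates (the probabilities below are made up):

```python
# Gibbs/Shannon entropy over W microstates.  With the equal-likelihood
# ("equilibrium") assignment it reduces to Boltzmann's k ln W; any other
# distribution over the same microstates gives a smaller value.
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum p ln p, in units of k."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 4
uniform = [1 / W] * W            # all microstates equally likely
skewed  = [0.7, 0.1, 0.1, 0.1]   # an observer with more information

print(gibbs_entropy(uniform), math.log(W))   # both ~ 1.386 (= ln 4)
print(gibbs_entropy(skewed))                 # ~ 0.940, smaller
```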
 
  • #11
Lord Jestocost
Gold Member
684
483
I agree. Nonmechanical thermodynamic variables such as temperature and entropy are combined with "mechanical" considerations from statistical mechanics on the basis of the concept of thermal equilibrium.
 
