Entropy is defined as a measure of ignorance about a system, particularly in the context of thermodynamics and statistical mechanics. The discussion references Boltzmann's entropy formula, S = k log(W), where W represents the phase-space volume (the number of microstates) compatible with the observed macrostate, and emphasizes the subjective nature of entropy insofar as it depends on the observer's knowledge. Key insights include the argument for using "spreading" as a descriptor for entropy, encompassing spatial, temporal, and energetic dimensions. The conversation also highlights the relationship between entropy and energy dispersion, as articulated by Frank L. Lambert.
Prerequisites: Physicists, chemists, and students of thermodynamics and statistical mechanics seeking to deepen their understanding of entropy and its implications in energy systems.
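To make the observer-dependence concrete, here is a minimal Python sketch of S = k log(W) for a toy system. The 4-cell setup, the particle count, and the two "observers" are illustrative assumptions, not from the thread; the point is only that W, and hence S, depends on how finely the observer's macrostate resolves the system.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W: int) -> float:
    """S = k_B * ln(W), where W counts the microstates
    compatible with the macrostate the observer records."""
    return k_B * math.log(W)

# Toy system: 20 labelled particles spread over 4 cells,
# with 5 particles sitting in each cell.
N = 20

# Observer A resolves all 4 cells, so the macrostate is (5, 5, 5, 5).
# Compatible microstates: multinomial coefficient 20! / (5!)^4.
W_A = math.factorial(N) // math.factorial(5) ** 4

# Observer B only resolves "left half" vs "right half" and records
# (10, 10). Compatible microstates: choose the 10 left particles,
# then place each particle in either of the two cells on its side.
W_B = math.comb(N, 10) * 2 ** N

print(f"S_A = {boltzmann_entropy(W_A):.3e} J/K  (finer coarse-graining)")
print(f"S_B = {boltzmann_entropy(W_B):.3e} J/K  (coarser coarse-graining)")
# S_B > S_A: the same microstate carries more entropy for the
# observer who keeps track of less detail.
```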
Monsterboy said: I have often heard people say "entropy depends on the observer."
Lord Jestocost said: Why should entropy depend on the observer?
"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫Cp/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
Lord Jestocost said: Entropy is linked to energy through its original definition by Clausius, dS = δQrev/T, where δQrev denotes an infinitesimally small amount of heat exchanged reversibly.
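For a concrete number (an illustrative example, not from the thread): reversibly melting 1 g of ice at T = 273.15 K absorbs roughly q ≈ 334 J, and since T stays constant the integral of δQrev/T collapses to a quotient, ΔS = q/T ≈ 334 J / 273.15 K ≈ 1.22 J/K.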
Lord Jestocost said: The question is: Does entropy depend on the observer?
When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process to define the entropy in state 1:
S1 = S0 + ∫ δQrev/T (integrated along the reversible path from state 0 to state 1)
The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.).