Is Entropy Truly Undefined in Physical Systems?

  • Thread starter: lukephysics
  • Tags: Entropy
AI Thread Summary
The discussion centers on the concept of entropy in relation to information theory and its implications for understanding the universe. It highlights the confusion surrounding the definition of entropy, particularly in the context of the early universe and chemical reactions, questioning the role of the observer in these definitions. The conversation references Planck's work on entropy as a Lorentz invariant scalar and discusses the treatment of thermodynamic quantities in the rest frame of a medium. Additionally, it emphasizes the statistical physics perspective, where entropy measures missing information relative to complete knowledge of a system. Overall, the dialogue seeks clarity on the various interpretations and applications of entropy across different scientific frameworks.
lukephysics
TL;DR Summary
Why do people say things have entropy, such as 'the early universe has low entropy', when they don't specify who the observer is and what they are predicting?
I always got a bit confused when listening to podcasts about the arrow of time and entropy in the universe, so I was reading more about information theory. I learned today that entropy is not defined for a physical system by itself. All it means is how much uncertainty an observer has when making a prediction about something of particular interest to them.

So why do they say things have entropy, such as 'the early universe has low entropy', when they don't say who the observer is and what they are predicting?

Another example is entropy in chemical reactions. Is that a different definition of entropy? Or is it fundamentally the same?
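
To make the information-theoretic reading concrete, here is a minimal Python sketch (function name is my own) of Shannon entropy as "observer uncertainty": the same random experiment has different entropy depending on what the observer already knows about the outcome.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.

    probs is the probability the observer assigns to each outcome;
    terms with p = 0 contribute nothing (0 * log 0 -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Observer with no idea which of 4 equally likely outcomes occurs:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (maximal uncertainty)

# Observer who already knows the outcome with certainty:
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits (no uncertainty)
```

The point of the example is that entropy is attached to the probability assignment, not to the physical system alone, which is exactly the tension the question raises.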
 
Can you please quote exactly the specific statements that you're examining?
 
Interesting question - maybe @vanhees71 has some insights here, being the resident expert on relativistic hydrodynamics. Planck argued in this paper that entropy is a Lorentz-invariant scalar, but also regarded the measured temperature as transforming as ##T \rightarrow T(1-v^2)^{1/2}## between observers (a moving body appears cooler). On the other hand, it seems more natural to say that temperature is only defined in the rest frame of the body?
 
The relativistic treatment of thermodynamics before van Kampen was a mess, though I'm not sure whether van Kampen was really the first to introduce our modern view. A kind of review is

N. G. van Kampen, Relativistic thermodynamics of moving systems, Phys. Rev. 173, 295 (1968), https://doi.org/10.1103/PhysRev.173.295.

Today we use the definition, as given in Sect. 9 of this paper, that the thermodynamic quantities are defined in the (local) rest frame of the medium. Entropy is then a scalar quantity. The paper also discusses the two historical treatments by Ott and Planck.

Another approach is, of course, statistical physics. There the key point is that the phase-space distribution function is a scalar quantity. For a manifestly covariant treatment of elementary relativistic transport theory, see

https://itp.uni-frankfurt.de/~hees/publ/kolkata.pdf

Concerning the more general questions of the OP: entropy in the information-theoretic sense (and that seems to be the best and most comprehensive approach we have) is always a measure of the missing information, given some information about the system, relative to the case of complete information. More intuitively, it measures the "surprise" you have at a specific outcome of a random experiment. E.g., in quantum statistical physics the entropy is always relative to the preparation of a pure state, i.e., any pure state has entropy 0. This leads to the von Neumann-Shannon-Jaynes definition of entropy,
$$S=-k_{\text{B}} \mathrm{Tr} (\hat{\rho} \ln \hat{\rho} ),$$
where ##\hat{\rho}## is the statistical operator, describing the state of the system.
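
As a minimal numerical sketch of this definition (Python with NumPy, setting ##k_{\text{B}}=1##; the function name is my own), one can evaluate ##S=-\mathrm{Tr}(\hat{\rho}\ln\hat{\rho})## via the eigenvalues of the statistical operator and check that a pure state indeed has entropy 0 while the maximally mixed qubit state has entropy ##\ln 2##:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), with k_B = 1.

    Diagonalize the (Hermitian) density matrix and sum -p ln p over
    its eigenvalues; zero eigenvalues contribute nothing (0 ln 0 -> 0).
    """
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log(evals))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])   # pure state |0><0|
mixed = np.eye(2) / 2           # maximally mixed qubit state

print(von_neumann_entropy(pure))   # ~0 for any pure state
print(von_neumann_entropy(mixed))  # ln 2, about 0.693
```

Working with eigenvalues avoids having to compute a matrix logarithm directly and makes the "relative to complete information" reading explicit: a pure state (complete information) gives ##S=0##, maximal ignorance gives the maximal value.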
 