Is Entropy Truly Undefined in Physical Systems?

AI Thread Summary
The discussion centers on the concept of entropy in relation to information theory and its implications for understanding the universe. It highlights the confusion surrounding the definition of entropy, particularly in the context of the early universe and chemical reactions, questioning the role of the observer in these definitions. The conversation references Planck's work on entropy as a Lorentz invariant scalar and discusses the treatment of thermodynamic quantities in the rest frame of a medium. Additionally, it emphasizes the statistical physics perspective, where entropy measures missing information relative to complete knowledge of a system. Overall, the dialogue seeks clarity on the various interpretations and applications of entropy across different scientific frameworks.
lukephysics
TL;DR Summary
Why do people say things have entropy, such as 'the early universe has low entropy', without specifying who the observer is and what they are predicting?
I always got a bit confused when listening to podcasts about the arrow of time and entropy in the universe, so I was reading more about information theory. I learned today that entropy is not defined for a physical system by itself. All it means is how much uncertainty an observer has when making a prediction about something of particular interest to them.

So why do they say things have entropy, such as 'the early universe has low entropy', when they don't specify who the observer is and what they are predicting?

Another example is entropy in chemical reactions. Is that a different definition of entropy? Or is it fundamentally the same?
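To make the observer-dependence in the question concrete, here is a minimal Shannon-entropy sketch (the probabilities and observers are hypothetical, purely for illustration): the same coin carries different entropy for an observer who knows its bias than for one who assumes it is fair, because entropy quantifies the observer's missing information, not a property of the coin alone.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping p = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The same physical coin, described by two observers:
# observer A knows it is heavily biased, observer B assumes it is fair.
h_informed = shannon_entropy([0.99, 0.01])  # low uncertainty, ~0.08 bits
h_ignorant = shannon_entropy([0.5, 0.5])    # maximal uncertainty, 1 bit

print(h_informed, h_ignorant)
```

The physical system is identical in both lines; only the observer's state of knowledge (the assigned probability distribution) differs.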
 
Can you please quote exactly the specific statements that you're examining?
 
Interesting question - maybe @vanhees71 has some insights here, being the resident expert on relativistic hydrodynamics. Planck argued in this paper that entropy is a Lorentz-invariant scalar, but also regarded the measured temperature as transforming as ##T \rightarrow T(1-v^2)^{1/2}## between observers, so that a moving body appears cooler. On the other hand, it seems more natural to say that temperature is only defined in the rest frame of the body?
 
The relativistic treatment of thermodynamics before van Kampen is a mess, though I'm not sure whether van Kampen was really the first to introduce our modern view. A kind of review is

N. G. van Kampen, Relativistic thermodynamics of moving systems, Phys. Rev. 173, 295 (1968), https://doi.org/10.1103/PhysRev.173.295.

Today we use the definition, as given in Sect. 9 of this paper, that the thermodynamic quantities are defined in the (local) rest frame of the medium. Entropy is a scalar quantity. The paper also discusses the two historical treatments by Ott and Planck.

Another approach is of course statistical physics. There the key point is that the phase-space distribution function is a scalar quantity. For a manifestly covariant treatment of elementary relativistic transport theory, see

https://itp.uni-frankfurt.de/~hees/publ/kolkata.pdf

Concerning the more general questions of the OP: entropy in the information-theoretic sense (and that seems to be the best and most comprehensive approach we have) is always a measure of the missing information, given some information about the system, relative to the case of complete information. More intuitively, it is a measure of the "surprise" you have at a specific outcome of a random experiment. E.g., in quantum statistical physics the entropy is always relative to the preparation of a pure state, i.e., any pure state has entropy 0, which leads to the von Neumann-Shannon-Jaynes definition of entropy,
$$S=-k_{\text{B}} \mathrm{Tr} (\hat{\rho} \ln \hat{\rho} ),$$
where ##\hat{\rho}## is the statistical operator, describing the state of the system.
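The definition above is easy to check numerically. A minimal sketch (in units where ##k_{\text{B}} = 1##): evaluating ##S = -\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})## via the eigenvalues of the density matrix gives ##S = 0## for a pure state and ##S = \ln 2## for a maximally mixed qubit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho) in units where k_B = 1,
    computed from the eigenvalues of the density matrix rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # 0 * ln(0) -> 0 by convention
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])   # pure state |0><0|: complete information
mixed = np.eye(2) / 2           # maximally mixed qubit: no information

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # ln 2, about 0.693
```

Diagonalizing first is the standard trick, since ##\hat{\rho} \ln \hat{\rho}## is diagonal in the eigenbasis of ##\hat{\rho}## and the trace reduces to a sum over eigenvalues ##-\sum_i p_i \ln p_i##.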
 