What is the real second law of thermodynamics?

AI Thread Summary
The second law of thermodynamics states that the entropy of a closed system tends to increase over time, reflecting a tendency toward disorder. While classical interpretations suggest that entropy cannot decrease, discussions highlight the Poincaré Recurrence theorem, which implies that systems may return to a previous state given sufficient time, albeit over impractically long durations. The conversation also emphasizes the statistical nature of entropy, where fluctuations can lead to temporary decreases in entropy, though these events are exceedingly rare. Additionally, the role of quantum mechanics introduces complexities in understanding entropy, as it involves inherent uncertainties that contribute to information loss. Overall, the second law serves as a fundamental guideline for physical processes, with entropy being a key measure of system behavior.
  • #101


lalbatros said:
When we observe entropy-decreasing fluctuations, we are supposed to "say" that the second law is irrelevant on the microscopic level.
Entropy requires a temperature. Temperature requires a sufficiently large number of molecules to have a Maxwell-Boltzmann distribution of energies. So it is not that entropy is irrelevant at the small scale. It is really that it is not defined at the small scale.
Of course, if we wait longer -say 1 billion years- we increase the chance to observe entropy-decreasing fluctuations on scales that we -today- would call "macroscopic".
No. If you made an experiment for which one only had to wait a billion years, there is a good chance we could observe it happening somewhere in the universe all the time. The second law is not being violated anywhere in the universe.
Looks like, then, that the second law is about "small-scale" and "short-times" phenomena.
Is then "small" and "short" to be understood on a human-scale?
The second law will not be violated on any time scale with respect to any collection of molecules for which a temperature can be defined.

Should I conclude, based on your definition of a physical law, that any fluctuation disproves the second principle? It would then be a physical law, but a wrong one!
Do you mean that if you witnessed an event that has a chance of one in 10^billion universe lifetimes of occurring, you would have disproved the second law?

Suppose you thought you witnessed such an event, that heat flowed spontaneously from cold to hot all by itself for a brief instant. Since you could not repeat the result and no one else can repeat it because it never occurs again, what have you proven? You could never really determine whether your result was a mistake or a real observation. The chances are much better that it was a mistake than a real event.

Further, can't we also take the point of view that the Clausius statement simply defines what "temperature" means? Even empirical temperatures would fit the Clausius definition of temperature according to the Clausius statement of the second principle.
Defining that heat (energy) goes from hot to cold, is that a law of physics?
Can a definition become a law?
No. The temperature of matter - a collection of molecules - is defined by the Maxwell-Boltzmann distribution. One cannot just make up a definition of temperature. It is a real law based on science, not an arbitrary definition.

AM
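
As a rough numerical illustration of the point above that temperature only becomes well defined once many molecules share a Maxwell-Boltzmann distribution, here is a minimal Python sketch (assuming numpy; the helium-like mass and the 300 K reference value are illustrative choices, not from the thread). It draws velocities from a Maxwellian and shows how the spread of the estimated temperature shrinks as the number of molecules grows.

import numpy as np

k_B = 1.380649e-23   # J/K
m   = 6.6e-27        # kg, roughly one helium atom (illustrative value)
T   = 300.0          # K, the "true" temperature used to generate samples

rng = np.random.default_rng(0)

def temperature_estimate(n):
    # Each velocity component is Gaussian with variance k_B*T/m
    # (the Maxwell-Boltzmann distribution); estimate T from the mean
    # kinetic energy via <(1/2) m v^2> = (3/2) k_B T.
    v = rng.normal(0.0, np.sqrt(k_B * T / m), size=(n, 3))
    mean_ke = 0.5 * m * (v ** 2).sum(axis=1).mean()
    return 2.0 * mean_ke / (3.0 * k_B)

for n in (10, 1000, 100000):
    estimates = [temperature_estimate(n) for _ in range(100)]
    print(n, np.mean(estimates), np.std(estimates))  # spread shrinks ~ 1/sqrt(n)

For 10 particles the "temperature" estimate wanders by tens of kelvins from run to run; for 100,000 particles it is stable to better than a kelvin, which is the sense in which temperature is a large-N concept.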
 
  • #102


Andrew Mason said:
Entropy requires a temperature. Temperature requires a sufficiently large number of molecules
The Boltzmann equation shows that entropy does not require a temperature.

The deterministic version of thermodynamics is valid only for systems in local equilibrium, which requires a macroscopic system with enough molecules to reliably average over them.

Fluctuating thermodynamics (the version defined by statistical mechanics, which is valid for systems of any size) has little predictive power for small systems, since for these it predicts fluctuations large enough to erase any informative content in the mean values.
 
  • #103


Dear All,

I feel much more comfortable considering the second law to be an engineering heuristic.
And I even don't feel totally confident with this view!
Sorry for that!

To go back again to the Clausius formulation, I believe the full construct of thermodynamics can be derived from this statement. It leads to the existence of an entropy state function and a thermodynamic temperature scale. But this is all a direct consequence of defining "hot" and "cold" as the direction in which heat flows. Heat was already known before the second principle.
Where is there real physics in the Clausius statement?
Where is there real information on our physical world?
This Clausius statement is more like instructions for engineers to construct thermodynamic tables from experiment in an organized way, and first of all it instructed engineers to build a consistent thermometer!
What could you do with the Clausius principle if you had no thermodynamic tables?
You reap the result of the Clausius statement only when a huge number of experiments has been tabulated (recorded) in a rational way.
That's a huge achievement, but I don't really see any physical law in the Clausius statement.

For me the real physics comes with statistical thermodynamics, with Boltzmann and Gibbs.
Thermodynamic tables can be calculated ab initio thanks to their work.
And their work acknowledges clearly the existence of fluctuations.

Michel
 
  • #104


A. Neumaier said:
Of course, as the models get more complex and the questions asked more detailed, calculations from first principles become too unwieldy to be actually performed. But they suffice to give a proof of concept.

Spoken like a true mathematician! :)
 
  • #105


A. Neumaier said:
The Boltzmann equation shows that entropy does not require a temperature.
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules/kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM
 
  • #106


Andrew Mason said:
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules/kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM

I think he's referring to the case of ground state degeneracy.
 
  • #107


Andrew Mason said:
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules/kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM

The Boltzmann constant doesn't appear in the H-function or in the H-theorem, and it doesn't even appear in the Boltzmann equation.

The Boltzmann equation can, within its range of validity, predict the evolution of any distribution. The distribution doesn't need to be a Maxwellian, and therefore the temperature simply doesn't play any role in the Boltzmann equation, the H-function or the H-theorem. The same applies to other master equations in statistical mechanics, except when they are specialized to near-equilibrium solutions. The Maxwellian distribution comes into play as a special stationary solution, and for this special solution the temperature can be taken into consideration. The Boltzmann constant is introduced in the Boltzmann S-function for mere convenience. By doing so, the results for (near-)equilibrium distributions can be compared with thermodynamics.

This illustrates my point that the second law is more like an "engineering heuristic".
In addition, it also shows that the classical statements of the second law (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Description) are only a very special case of a much broader "second law". The H-theorem applies to the thermalisation of particle distributions, which is not in the scope of the "second law" of thermodynamics as formulated by Clausius or Kelvin or Carathéodory.

Concerning the second law (of thermodynamics), it is the basis to construct the thermodynamic scale of temperature.
Having defined this scale, the recipe to build entropy tables is the famous law dS = dQ/T, where the temperature factor reflects an assumption of equilibrium. Note that two of the classical statements of the second law (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Description) make no explicit reference to temperature.

In summary, there are many overlapping definitions of entropy!
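
As a minimal sketch of the point above about where the Boltzmann constant enters (assuming numpy; the discrete distributions are made-up examples): the H-functional can be evaluated for any normalised distribution, Maxwellian or not, with no temperature anywhere, and k_B only appears at the very end, when one converts to thermodynamic units via S = -k_B H.

import numpy as np

k_B = 1.380649e-23  # J/K; enters only in the final unit conversion

def H(p):
    # Discrete analogue of Boltzmann's H-functional, H = sum_i p_i ln p_i.
    # No temperature and no Boltzmann constant appear in this definition.
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return np.sum(p * np.log(p))

x = np.arange(100)
bimodal = np.exp(-0.5 * ((x - 25) / 5.0) ** 2) + np.exp(-0.5 * ((x - 75) / 5.0) ** 2)
uniform = np.ones(100)

for name, p in (("bimodal", bimodal), ("uniform", uniform)):
    h = H(p)
    print(name, "H =", h, " S = -k_B*H =", -k_B * h, "J/K")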
 
  • #108


lalbatros said:
The Boltzmann constant doesn't appear in the H-function or in the H-theorem, and it doesn't even appear in the Boltzmann equation.

I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either, S = kH.
 
  • #109


Andy Resnick said:
I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either, S = kH.

H is the entropy in units where the Boltzmann constant is set to 1. That it has a different value than 1 is due to historical accidents in the definition of the temperature scale, similar to accidents that make the speed of light or the Planck constant not exactly 1.
 
  • #110


lalbatros said:
It just tells us how we play dice.

Life is a game, what else do you ask for besides a rational strategy? ;)

You can see a THEORY as a descriptive model, or, as an INTERACTION TOOL in an inference perspective.

A descriptive model is falsified or corroborated. Corroborated theories live on. Falsified theories drop dead, leaving no clue as to how to improve.

An interaction tool for inference is different: it either adapts and learns, or it doesn't. Here falsification corresponds to "failure to learn". The hosting inference system will be outcompeted by a more clever competitor. This may be one reason to suspect that the laws of physics we actually find in nature do have inferential status.

It's much more than playing dice.

/Fredrik
 
  • #111


To think that we can DESCRIBE the future, is IMHO a very irrational illusion.

All we have, are expectations of the future, based on the present (including present records of the past), and based upon this we have to throw our dice. There is no other way.

In this respect, the second law is one of the few "laws" that are cast in a proper inferential form as is.

Has anyone seriously suggested that you can understand Newton's law of gravity but not understand the second law? If one of them is mysterious, I can't see how it's the second law.

/Fredrik
 
  • #112


Fra said:
To think that we can DESCRIBE the future, is IMHO a very irrational illusion.
Both future and past are illusions - real is only the present (if anything at all).

Nevertheless, it is often easier to predict the future than the past. The past of a stirred fluid coming slowly to rest is far more unpredictable than the final rest state.
 
  • #113


Andy Resnick said:
I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either, S = kH.

I simply don't see what could be obtained from a dimensional argument.
The thermodynamic dimension of entropy is purely conventional.

The factor is there as a connection between a measure of disorder and a measure of energy.
Nevertheless, disorder can be defined without any relation to energy.
The historical path to entropy doesn't imply that entropy requires any concept of thermodynamics.

The widespread use of entropy today has clearly shown that it is not exclusively a thermodynamic concept. We also know that entropy finds a wide range of application in thermodynamics.
It should be no surprise that the use of entropy in thermodynamics requires a conversion factor. This factor converts a measure of disorder to the width of a Maxwellian distribution.
 
  • #115


A. Neumaier said:
H is the entropy in units where the Boltzmann constant is set to 1. That it has a different value than 1 is due to historical accidents in the definition of the temperature scale, similar to accidents that make the speed of light or the Planck constant not exactly 1.

Not exactly- you may set the numerical value of k to 1, but there are still units. Temperature can be made equivalent to energy. One is not primary over the other, just as length and time are equivalent.

Even so, that only holds for equilibrium: thermostatics or thermokinetics. It does not hold in the fully nonequilibrium case. Jou, Casas-Vazquez, and Lebon's "Extended Irreversible Thermodynamics" and Truesdell's "Rational Thermodynamics" both have good discussions about this.
 
  • #117


Andy Resnick said:
That is true- but then what is the temperature of a sandpile?

http://rmp.aps.org/abstract/RMP/v64/i1/p321_1
http://pra.aps.org/abstract/PRA/v42/i4/p2467_1
http://pra.aps.org/abstract/PRA/v38/i1/p364_1

it's not a simple question to answer.

I have at least two reasons to enjoy this type of sandpile physics:

  1. I work for the cement industry, and there are many granular materials there:
    limestone, chalk (made of Coccoliths!), sand, clay, slag, fly ashes, ground coal, rubber chips, plastic pellets, ...
  2. I enjoyed reading the book by Pierre-Gilles de Gennes on https://www.amazon.com/dp/0387986561/?tag=pfamazon01-20
It is true that the (excited) avalanche phenomenon near the critical angle of repose has an analogy with a barrier-crossing phenomenon that can be associated with an equivalent temperature (fig. 85.c in https://www.amazon.com/dp/0387986561/?tag=pfamazon01-20). I guess this temperature might indeed represent the probability distribution of the grains' energy acquired from the external excitation.

Obviously, this is again an example taken from (statistical) mechanics.
Therefore, the entropy that one might consider here is again related to the distribution of energy.
And therefore this is one more energy-related entropy.

If we consider that any information, in the end, needs a physical substrate to be stored, then effectively the whole world is mechanical and, in the end, any entropy could be related to an energy distribution.
As long as there are no degenerate states, of course ...
So the question about entropy and energy could be translated into:

How much information is stored in degenerate states compared to how much is stored on energy levels? (in our universe)

My guess goes for no degeneracy.
Meaning that the history of physics has been right on this point since Boltzmann: it would make sense to give energy dimensions to entropy!
 
  • #118


lalbatros said:
I have at least two reasons to enjoy this type of sandpile physics:

Glad to hear it- I find it interesting as well. I was involved with a few soft-matter groups when I worked at NASA.


lalbatros said:
Obviously, this is again an example taken from (statistical) mechanics.
Therefore, the entropy that one might consider here is again related to the distribution of energy.
And therefore this one more energy-related entropy.

I think you still misunderstand me- what you say above is of course correct, but I think you missed my main point, which is that temperature and entropy should not be simply equated with energy. Entropy is energy/degree, and so there is an essential role for temperature in the entropy.

How about this example- laser light. Even though laser light has an exceedingly well-defined energy, it has *no* temperature:

http://arxiv.org/abs/cond-mat/0209043

They specifically address the difficulty in assigning a temperature and an entropy to an out-of-equilibrium system:

"Out of equilibrium, the entropy S lacks a clear functional dependence on the total energy
E, and the definition of T becomes ambiguous."

Again, laser light is a highly coherent state, is resistant to thermalization in spite of interacting with the environment, has a well defined energy and momentum, and yet has no clear entropy or temperature.
 
  • #119


Andy Resnick said:
They specifically address the difficulty in assigning a temperature and an entropy to an out-of-equilibrium system:

"Out of equilibrium, the entropy S lacks a clear functional dependence on the total energy
E, and the definition of T becomes ambiguous."

I think there should be no problem to define the entropy, even though the temperature might be totally undefined.

It is clear that entropy is not a function of energy in general.
Just consider the superposition of two bell-shaped distributions.
What is the "temperature" of this distribution?
Even when the two distributions are Maxwellians, you would still be forced to describe the global distribution by three numbers: two temperatures and the % of each distribution in the total.
This is a very common situation.
Very often there are several populations that do not thermalize even when reaching a steady state (open system).
For example the electron and ion temperatures are generally very different in a tokamak.
Even different ion species might have different distributions in a tokamak, especially heavy ions compared with light ions.
There might even be two populations of electrons, not to mention even runaway electrons in extreme situations.
It is quite clear that in all these non-equilibrium situations the entropy is perfectly well defined, as is the energy, but the entropy is no longer a function of the energy. Therefore, temperature cannot be defined.
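
As a small numerical illustration of the two-population situation just described (a sketch assuming numpy; the 1-D Gaussian "Maxwellians", their widths and the 70/30 mixing fraction are made-up values): the entropy functional is perfectly well defined for the mixture, even though no single temperature parameter describes it.

import numpy as np

v = np.linspace(-10.0, 10.0, 2001)   # 1-D velocity grid, arbitrary units
dv = v[1] - v[0]

def maxwellian(v, sigma):
    return np.exp(-0.5 * (v / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

f_cold = maxwellian(v, 1.0)                 # "cold" population
f_hot = maxwellian(v, 3.0)                  # "hot" population
f_mix = 0.7 * f_cold + 0.3 * f_hot          # steady-state mixture, not a Maxwellian

def entropy(f):
    # -integral of f ln f dv: well defined for any normalised distribution,
    # whether or not a temperature can be attached to it.
    f = f[f > 0]
    return -np.sum(f * np.log(f)) * dv

for name, f in (("cold", f_cold), ("hot", f_hot), ("mixture", f_mix)):
    print(name, entropy(f))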

I will read the paper later.
However, the introduction suggests that temperature could sometimes be defined in non-equilibrium situations.
I agree with that, with the tentative naive idea that this will be the case when, at least approximately, S = S(E).
One can easily build artificial examples.
For example, one could constrain a distribution to be Lorentzian instead of Maxwellian, or any suitable one-parameter distribution. Within this constraint, S would be a function of E via the one parameter defining the distribution, and temperature would be defined in this situation.
I am curious to see a more physical example in the paper.
I am also curious to think about which "thermodynamic relations" would still hold and which should be removed, if any.

Thanks for the reference,

Michel
 
  • #120


Andy Resnick said:
Not exactly- you may set the numerical value of k to 1, but there are still units. Temperature can be made equivalent to energy. One is not primary over the other, just as length and time are equivalent.
The units are arbitrary, since the kelvin is an independent unit defined only by an experimental procedure. If you set k=1, temperature will have the units of energy, and entropy is unitless.
Andy Resnick said:
Even so, that only holds for equilibrium: thermostatics or thermokinetics. It does not hold in the fully nonequilibrium case. Jou, Casas-Vazquez, and Lebon's "Extended Irreversible Thermodynamics" and Truesdell's "Rational Thermodynamics" both have good discussions about this.
The level of nonequilibrium thermodynamics is characterized by local equilibrium (in position space). On this level, dynamics is governed by hydrodynamics, variants of the Navier-Stokes equations. Here temperature exists, being defined as the equilibrium temperature of an (in the limit infinitesimal) cell. Or, formally, as the inverse of the quantity conjugate to energy.

A more detailed level is the kinetic level, characterized by microlocal equilibrium (in phase space). On this level, dynamics is governed by kinetic equations, variants of the Boltzmann equation. Entropy still exists, being defined even on the more detailed quantum level. Temperature does not exist on this level, but appears as an implicit parameter field in the hydrodynamic limit: The kinetic dynamics is approximated in a local equilibrium setting, by assuming that the local momentum distribution is Maxwellian. The temperature is a parameter in the Gaussian local momentum distribution, and makes no sense outside the Gaussian approximation.
 
  • #121


lalbatros said:
It is clear that entropy is not a function of energy in general.

<snip>

It is quite clear that in all these non-equilibrium situations the entropy is perfectly well defined, as is the energy, but the entropy is no longer a function of the energy. Therefore, temperature cannot be defined.

I will read the paper later.
However, the introduction suggests that temperature could sometimes be defined in non-equilibrium situations.

<snip>

Thanks for the reference,

Michel

My pleasure.

It's possible to recover cleanly defined thermodynamic properties in a nonequilibrium system in certain restricted cases: when the density matrix is block diagonal (if that's the correct term), for example. Conceptually, this is similar to coarse-graining or embedding a dissipative system in a higher-dimensional conservative system.

This only works for linear approximations- the memory of a system is very short (the relaxation time is short), or the Onsager reciprocal relations can be used.

As a more abstract example; we (our bodies) exist in a highly nonequilibrium state: the intracellular concentration of ATP is 10^10 times higher than equilibrium (Nicholls and Ferguson, "Bioenergetics"), which means the system can't be linearized and the above approximation scheme fails. How to assign a temperature? Clearly, there does not have to be a relationship between the "temperature" defined in terms of the distribution function of ATP and 98.6 F.
 
  • #122


A. Neumaier said:
The units are arbitrary, since the kelvin is an independent unit defined only by an experimental procedure. If you set k=1, temperature will have the units of energy, and entropy is unitless.

That's one of the differences between Mathematics and Science. Lots of equations can be nondimensionalized- for example the Navier-Stokes equation- but the scale factors must be retained in order to reproduce experimental results.
 
  • #123


Andy Resnick said:
That's one of the differences between Mathematics and Science. Lots of equations can be nondimensionalized- for example the Navier-Stokes equation- but the scale factors must be retained in order to reproduce experimental results.
Sure, but this doesn't change anything of interest.

By the way, it is not mathematicians but scientists called physicists who take c=1 and hbar=1 when they discuss quantum field theory. And they actually express temperature in units of energy, and distance in terms of inverse energy, not in meters or kelvins.

Translating to more traditional units is a triviality that can be done (and is done where needed) at the very end.
 
  • #124


A. Neumaier said:
Sure, but this doesn't change anything of interest.

By the way, it is not mathematicians but scientists called physicists who take c=1 and hbar=1 when they discuss quantum field theory. And they actually express temperature in units of energy, and distance in terms of inverse energy, not in meters or kelvins.

Translating to more traditional units is a triviality that can be done (and is done where needed) at the very end.

This scientist (who is occasionally called a Physicist) is familiar with the system of 'natural units'. No scientist I work with (Physicist or otherwise) would ever confuse a mathematical model with the actual physical system. My students often do, as evidenced by comments like "this data doesn't fit the model, so the data is bad".

Models can be simplified to better suit our limited understanding, at the cost of decreased fidelity to the phenomenon under investigation.
 
  • #125


My advisor (a physicist) always suggests that I use natural units, since it's just the general behavior we're studying (it's a more theoretical treatment).

I'm not emotionally comfortable with that, but it makes sense for a journal like Chaos. It's an exploratory science, not experimental, more theoretical = more mathematical.

I see a spectrum, not a yes-no situation, but then I'm a dynamicist. My work often involves turning the step function into a hyperbolic tangent.
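
For concreteness, here is the sort of smoothing being described, as a minimal Python sketch (assuming numpy; the width values are arbitrary): the Heaviside step is replaced by a tanh of adjustable steepness, which keeps the model differentiable for ODE solvers.

import numpy as np

def smooth_step(x, width):
    # Smooth stand-in for the Heaviside step function:
    # 0.5*(1 + tanh(x/width)) approaches the sharp step as width -> 0,
    # but stays differentiable everywhere, which dynamical models usually want.
    return 0.5 * (1.0 + np.tanh(x / width))

x = np.linspace(-2.0, 2.0, 9)
print(smooth_step(x, 0.1))   # nearly 0/1 away from x = 0
print(smooth_step(x, 1.0))   # a much gentler transition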
 
  • #126


Pythagorean said:
My advisor (a physicist) always suggests that I use natural units [...] My work often involves turning the step function into a hyperbolic tangent.
and I guess if you used an arc tangent instead, you'd use natural units for the resulting angles, too, and not degrees.

Angles in degrees and temperature in degrees are both historical accidents.
An extraterrestrial civilization will not have the same units - but their natural units will be the same.
 
  • #127


Yeah, I don't use natural units. I like to talk to my other advisor (experimental biophysics) about specific values when I'm looking for biological motivation.
 
  • #128


lalbatros said:
The factor is there as a connection between a measure of disorder and a measure of energy.

I admit that I don't know what the focus is in the discussion, but to understand a measure of disorder or information without the classical thermodynamical notions, one still needs a way to quantify data or evidence.

I'd say that what replaces the "energy" in the more general abstract discussion is the amount of data, or sample size. Without a notion of complexity in the microstructure, or a means to COUNT microstates, any measure of disorder is ambiguous.

One can relate Shannon entropy to purely probabilistic settings where one explicitly calculates how the conditional probability (based on a given prior macrostate) of a certain distribution/macrostate depends on its Shannon entropy. Here a scale factor naturally appears in front of the Shannon entropy, in an e^(M S_shannon) term, where M is the complexity or amount of data (the length of the event sequence).
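
A quick numerical check of that scale factor in the simplest binary case (a sketch using only the Python standard library; M = 200 and the counts are arbitrary choices): the number of length-M sequences compatible with a given macrostate grows like e^(M S_shannon), up to subleading corrections.

from math import comb, log

def shannon_entropy(p):
    # S_shannon = -sum_i p_i ln p_i (natural log)
    return -sum(q * log(q) for q in p if q > 0)

M = 200                      # amount of data / length of the event sequence
for k in (20, 60, 100):      # number of "heads" defining the macrostate
    q = (k / M, 1 - k / M)
    exact = log(comb(M, k))              # ln of the exact number of sequences
    leading = M * shannon_entropy(q)     # e^(M*S) estimate, to leading order
    print(k, round(exact, 2), round(leading, 2))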

So any redefinition of entropy by convention of scaling to units translates to a different relation between entropy and probability. But I think the interesting and central part is the probability anyway. The entropy is just a convenient way to transform one measure into another measure where the combination of independent systems gets additive instead of multiplicative.

So the relativity of entropy is then just a transformed view of the subjective view on probability. The only objection I have to the use of entropy in physics is that the conditional nature of the measure is often forgotten, and instead one resorts to somewhat divergent abstractions such as fictive ensembles, instead of just sticking to the plain counting of evidence that would be the way to go in a proper inferential view.

/Fredrik
 
  • #129


Andy Resnick said:
According to the fluctuation-dissipation theorem, it regularly does:

http://prl.aps.org/abstract/PRL/v89/i5/e050601
Actually that's the "fluctuation theorem." The fluctuation-dissipation theorem deals with system response to an external force or perturbation away from equilibrium.
 
  • #130


I'm not sure that's a meaningful distinction.
 
  • #131
Just reading through the thread, I wanted to make the point that knowledge does affect entropy. It's a measure of what you don't know. If I have a container of gas, it has a certain entropy. If somehow (staying classical here) I measure the position and momentum of a single particle as a function of time, then the rest of the gas is characterized by an entropy less than the entropy without the measurement, because of the reduced number of particles. Yet the observed particle has no entropy. What happened to the difference in entropies? It's gone, because your knowledge of the system has increased beyond simply knowing the macro thermodynamic parameters.

Also, if you had a gas of 10-pound cannonballs, colliding elastically, without friction (or practically so), and the average distance between them was about a mile, and you had a volume the size of the solar system, I doubt if quantum mechanics would be needed to describe the thermodynamic behavior of this gas.
 
  • #132
Rap said:
Just reading through the thread, I wanted to make the point that knowledge does affect entropy. It's a measure of what you don't know.

The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.
 
  • #133
A. Neumaier said:
The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.

I don't agree - a more instructive case is the entropy of mixing. If you have a container of gases with a partition, both sides at the same temperature and pressure and you remove the partition, if the gases are different, the entropy increases as they mix. If the gases are the same, the entropy stays the same. The thing is, if they are different, but you don't know it, you will calculate their entropy to be the same and never run into problems. From an information-theoretic viewpoint, the entropy IS the same. You will never encounter an inconsistency or problem in any thermodynamic description of a process involving these gases as long as you cannot detect a difference between them. If you do run into an inconsistency, you will have found a way to distinguish them.

This was explained in a paper by Jaynes - Google "Jaynes" "The Gibbs Paradox".
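
For reference, the textbook ideal-gas mixing entropy that this paradox turns on, as a small Python sketch (assuming numpy; the one-mole, 50/50 example is illustrative): the formula gives a finite entropy change only when the species are treated as distinguishable, and exactly zero when they are treated as identical, which is the knowledge-dependent jump discussed above.

import numpy as np

R = 8.314462618  # J/(mol K), gas constant

def mixing_entropy(n_moles, fractions, distinguishable=True):
    # Ideal-gas entropy of mixing at fixed T and p:
    #   dS = -n R sum_i x_i ln x_i   if the species are treated as different,
    #   dS = 0                        if they are treated as identical.
    if not distinguishable:
        return 0.0
    x = np.asarray(fractions, dtype=float)
    x = x[x > 0]
    return -n_moles * R * np.sum(x * np.log(x))

print(mixing_entropy(1.0, [0.5, 0.5], distinguishable=True))   # about 5.76 J/K
print(mixing_entropy(1.0, [0.5, 0.5], distinguishable=False))  # 0.0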
 
  • #134
Rap said:
I don't agree - a more instructive case is the entropy of mixing. If you have a container of gases with a partition, both sides at the same temperature and pressure and you remove the partition, if the gases are different, the entropy increases as they mix. If the gases are the same, the entropy stays the same. The thing is, if they are different, but you don't know it, you will calculate their entropy to be the same and never run into problems. From an information-theoretic viewpoint, the entropy IS the same.

No. There are two different entropies in the two cases.

More importantly, if the gases are different, they will _not_ behave the same way in any experiment that can distinguish the gases - independent of whether or not anyone knows it.

Indeed, precisely because of this fact one can actually learn from such an experiment that the gases are different, and thus correct one's ignorance. If there were no observable difference, we would never know - and there would be nothing to know since the alleged difference is in such a case only a figment of the imagination, not a scientific property.
 
  • #135
A. Neumaier said:
The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.

Well, one can certainly compute an entropy associated with the knowledge about a system:

S = k log(W)

where W is the number of microstates consistent with that knowledge (or classically, the volume in phase space of the set of all states consistent with the knowledge). This is numerically identical to thermodynamic entropy in the case where the system is in thermal equilibrium and the "knowledge" is simply the extensive properties of total energy, volume, number of particles.
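
A toy sketch of that counting, using only the Python standard library (the 100-spin example and the "learn 10 spins" step are made up for illustration): W is the number of microstates consistent with what is known, and learning more shrinks W and hence S = k ln W.

from math import comb, log

k_B = 1.380649e-23  # J/K

def entropy_from_count(W):
    # S = k ln W, with W the number of microstates consistent with our knowledge.
    return k_B * log(W)

# Knowledge: 100 two-state spins, exactly 50 of them "up".
W_all = comb(100, 50)
print(entropy_from_count(W_all))

# Extra knowledge: the exact state of 10 particular spins (5 of them up).
# The remaining 90 spins must then contain 45 "up" spins.
W_less = comb(90, 45)
print(entropy_from_count(W_less))   # smaller: more knowledge, less entropy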
 
  • #136
Andy Resnick said:
I'm not sure that's a meaningful distinction.

Are you talking about the distinction between the fluctuation theorem and the fluctuation-dissipation theorem? Those are two different theorems.

The fluctuation theorem is about statistical fluctuations of entropy.

The fluctuation-dissipation theorem is about the relationship between fluctuations in some state variable and a dissipative force acting on that variable. The paradigm example is Nyquist noise in an electric circuit. A resistor is a dissipative force. The corresponding fluctuation is in voltage: the voltage across a resistor will fluctuate in a way related to the resistance. Another example of the fluctuation-dissipation theorem is Brownian motion: the dissipation here is viscous drag on particles moving through a fluid. The corresponding fluctuation is Brownian motion.
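
For the Nyquist example, the standard Johnson-Nyquist formula makes the fluctuation-dissipation link quantitative; a minimal Python sketch (the 1 MOhm, 300 K and 10 kHz values are arbitrary illustrative inputs):

from math import sqrt

k_B = 1.380649e-23  # J/K

def johnson_noise_vrms(resistance_ohm, temperature_k, bandwidth_hz):
    # Johnson-Nyquist noise: the dissipative element (the resistance R)
    # sets the size of the voltage fluctuations, V_rms = sqrt(4 k_B T R df).
    return sqrt(4.0 * k_B * temperature_k * resistance_ohm * bandwidth_hz)

# A 1 MOhm resistor at room temperature, measured over a 10 kHz bandwidth:
print(johnson_noise_vrms(1e6, 300.0, 1e4))   # about 13 microvolts rms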
 
  • #137
stevendaryl said:
Well, one can certainly compute an entropy associated with the knowledge about a system:

S = k log(W)

where W is the number of microstates consistent with that knowledge (or classically, the volume in phase space of the set of all states consistent with the knowledge). This is numerically identical to thermodynamic entropy in the case where the system is in thermal equilibrium and the "knowledge" is simply the extensive properties of total energy, volume, number of particles.

Sure, but this simply means that you get the correct entropy precisely when your knowledge is the physically correct one that describes everything there is to know about the system.

If your knowledge is something different, you get a different entropy, but the system doesn't behave according to your knowledge but still according to what it really is.

So, no matter what someone's knowledge is, it has no effect on the physical entropy of the system, but only on the subjective entropy the knower thinks the system has.
 
  • #138
A. Neumaier said:
So, no matter what someone's knowledge is, it has no effect on the physical entropy of the system, but only on the subjective entropy the knower thinks the system has.

Hmm. I guess I'm not completely convinced that there is such a thing as physical entropy. Suppose we're doing classical physics, and our system is just 5 particles bouncing around a box. Then there is no reason to bring up thermodynamic concepts of temperature and entropy. But if we expand that to 5000 or 50,000, or 50,000,000,000 particles, then the description of the system in terms of particles with definite positions and velocities just becomes completely unwieldy. So if we give up precise predictions, we can make approximate predictions by using thermodynamical quantities: total energy, pressure, entropy, etc. But to me, the entropy is an artifact of how we're modeling that collection of particles; it's not objective. If there is no objective notion of entropy for 5 particles in a box, I can't see how there can be an objective notion of entropy for 50 trillion particles.
 
  • #139
A. Neumaier said:
If your knowledge is something different, you get a different entropy, but the system doesn't behave according to your knowledge but still according to what it really is.

Another point about what you said. You talk in terms of the system behaving according to my knowledge. I'm not sure what that means. Is the behavior that you're talking about the second law of thermodynamics? The way I see it (in classical physics, anyway) is that if we knew exactly the positions and velocities of all the particles, and we had unlimited computational resources, we could predict the future states of the system without ever invoking thermodynamics. To me, thermodynamics comes into play when we only have gross measures of the state, and we want to make gross predictions about future behavior. So we're trying to do physics at a gross level, with macroscopic inputs and macroscopic outputs. Entropy is in some sense a measure of the neglected level of detail that we left out of the gross macroscopic description. It's hard for me to see how that is a physically objective quantity.
 
  • #140
It's the long argument between thermodynamic entropy and information entropy.

Think of the 5-particle case for example. There is the empirical definition of entropy, carried out by measurements, and there is the statistical mechanical (or information theoretic) explanation of that definition.

In the 5-particle case, how do you measure entropy (or more precisely an entropy change)? You make the container have one wall with some mass that moves and apply a constant external force to it. That gives it a "constant" pressure. You can now measure the volume, and it will be constantly changing, every time a particle hits the wall it gets kicked upward, otherwise, it accelerates downward. The volume is fluctuating around some average value. You know P, V, N, and k, and assuming the gas is ideal, you get the temperature T=PV/Nk. Because volume is fluctuating, the temperature is fluctuating too, about some average value.

Now you add an increment of energy to the system, without changing the pressure (isochoric work). Like with a paddle wheel that kicks one of the particles just a little bit. That's your measured dU. Since dV=0 on average, dU=T dS gives you dS, the change in entropy. Since T is fluctuating, dS will fluctuate too about an average. Not sure if that is exactly correct, but you get the idea.

Going to stat mech - the entropy is k ln W, where W is the number of microstates that could give rise to the specified P,V,T of the initial system and dS=k dW/W. This is information theory. For example, you could say S=k ln(2) Y where Y=log2(W) (log2 is log to base 2). Y is then the number of yes/no questions you have to ask to determine the microstate. (Actually the number of yes/no questions according to the "best" algorithm, in which each question splits the number of ways in half). This reminds me that the stat mech explanation is information theoretic.
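
The conversion between the two bookkeepings is just the factor k ln 2; a tiny sketch (Python standard library only): one joule per kelvin of thermodynamic entropy corresponds to roughly 10^23 yes/no questions, and one bit of missing information to about 10^-23 J/K.

from math import log

k_B = 1.380649e-23  # J/K

def entropy_to_questions(S_joule_per_kelvin):
    # Y such that S = k_B * ln(2) * Y, i.e. the entropy expressed as a
    # number of ideal yes/no questions (bits of missing information).
    return S_joule_per_kelvin / (k_B * log(2.0))

print(entropy_to_questions(1.0))   # ~1.0e23 questions per J/K of entropy
print(k_B * log(2.0))              # ~9.6e-24 J/K per yes/no question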

Anyway, the two expressions will match, at least on average. If you increase your knowledge by following just one of the particles, you will have increased your knowledge of the system beyond that of just P,V,T. The statmech guy will say the number of questions has decreased, therefore the entropy has decreased. The fluctuations in T and V and dS will be able to be correlated somewhat with the fluctuations in the position and velocity of the observed particle. (assume classical for simplicity). The thermo guy will say no, this extra knowledge is "out of bounds" with respect to thermodynamics. So the statmech guy's definition and the thermo guy's definitions do not match, except when the statmech guy's definition stays "in bounds". If we stay "in bounds", then entropy is objective. But I have no problem wandering out of bounds, just to see what happens.

The whole problem of the extensivity of entropy and Boltzmann counting is solved by this. The thermodynamicist simply declares that drawing a distinction between like particles is out of bounds. The fact that quantum mechanics says this is true in principle in the quantum regime is really irrelevant. You can have the thermodynamics of a gas of elastically colliding cannonballs and declare distinguishing them out of bounds, and you're good to go.

Regarding the entropy of mixing, if you have two different particles, and the thermo guy declares that distinguishing their difference is out of bounds, and the statmech guy says that the knowledge of their difference is unavailable, then their definitions match, entropy is objective, and the theory works. If the thermo guy doesn't yet have the ability to distinguish them, and the statmech guy says that any knowledge of their difference is unavailable, then their definitions match, entropy is objective, and the theory works. If the thermo guy can distinguish the difference without going out of his pre-established bounds (i.e. by examining particles one at a time), and the knowledge guy says this knowledge is available, then entropy is objective and the theory works. If you have a gas of red and blue elastic cannonballs, and their color does not affect how they behave in a collision, and you accept that their color can be determined without appreciably affecting their velocity and momentum, then you can have a disagreement. The thermo guy will declare such measurements out of bounds, while the statmech guy will say that knowledge is available. The thermo guy's theory will work, the statmech guy's theory will work, but they will make different calculations and different predictions.

The gas of cannonballs might be a gas of stars in a stellar cluster, and then it's best to wear two hats.
 
  • #141
stevendaryl said:
But to me, the entropy is an artifact of how we're modeling that collection of particles; it's not objective. If there is no objective notion of entropy for 5 particles in a box, I can't see how there can be an objective notion of entropy for 50 trillion particles.

The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

The entropy of a tiny system of 45 molecules is not so well defined, as the tiny system cannot be kept in equilibrium, and entropy is (in the textbook framework) an equilibrium property.
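
As a sanity check of the S = -dA/dT route (a sketch assuming numpy and a monatomic ideal gas rather than raw oil; the argon-like mass, one-mole amount, 22.4-litre volume and 300 K are illustrative inputs): differentiating the Helmholtz free energy numerically reproduces the Sackur-Tetrode entropy.

import numpy as np

k_B = 1.380649e-23    # J/K
h = 6.62607015e-34    # J s
m = 6.6335e-26        # kg, roughly one argon atom
N = 6.022e23          # one mole of atoms
V = 0.0224            # m^3, an illustrative container volume

def thermal_wavelength(T):
    return h / np.sqrt(2.0 * np.pi * m * k_B * T)

def helmholtz(T):
    # Helmholtz free energy of a classical monatomic ideal gas.
    lam = thermal_wavelength(T)
    return -N * k_B * T * (np.log(V / (N * lam ** 3)) + 1.0)

T, dT = 300.0, 1e-3
S_numeric = -(helmholtz(T + dT) - helmholtz(T - dT)) / (2.0 * dT)   # S = -dA/dT

# Sackur-Tetrode expression for comparison:
S_exact = N * k_B * (np.log(V / (N * thermal_wavelength(T) ** 3)) + 2.5)

print(S_numeric, S_exact)   # the two agree (about 1.5e2 J/K here)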
 
  • #142
A. Neumaier said:
The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.
 
  • #143
A. Neumaier said:
The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

The entropy of a tiny system of 45 molecules is not so well defined, as the tiny system cannot be kept in equilibrium, and entropy is (in the textbook framework) an equilibrium property.

Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.

stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

In large systems (on the order of Avogadro's number of particles or larger), it's OK to assume you have a thermometer which is small with respect to the system (so it does not appreciably affect it) yet large enough to have negligible fluctuations. For a system of 45 particles, using a mercury thermometer, you don't measure the temperature, you set it to the temperature of the thermometer. I think it is better to use the system itself as a thermometer by allowing it to vary its volume under fixed pressure, both of which are measurable, in principle. If you know the equation of state (e.g. ideal gas PV=NkT) then you can calculate the temperature. That's what I did in the perhaps too-long post above.
 
  • #144
stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

Circular or not, it is objective, and thus entropy is also objective.

Moreover, the concept of temperature and entropy were known long before the advent of statistical mechanics. Don't mistake modern presentations for the only way to set up things.
 
  • #145
Rap said:
Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.



In large systems (on the order of Avogadro's number of particles or larger), it's OK to assume you have a thermometer which is small with respect to the system (so it does not appreciably affect it) yet large enough to have negligible fluctuations. For a system of 45 particles, using a mercury thermometer, you don't measure the temperature, you set it to the temperature of the thermometer. I think it is better to use the system itself as a thermometer by allowing it to vary its volume under fixed pressure, both of which are measurable, in principle. If you know the equation of state (e.g. ideal gas PV=NkT) then you can calculate the temperature. That's what I did in the perhaps too-long post above.

We were originally talking about a gallon of raw oil, for which I made my assertion. Certainly temperature is well-defined there for scientific use.

With 45 molecules, you only have a nanosystem, which behaves quite differently from a system in equilibrium. The concept of temperature is hardly applicable there - at least not in the conventional sense.

The equation of state is a concept valid only in the thermodynamic limit, i.e., when the number of molecules is huge.
 
  • #146
stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

Temperature is not defined in terms of entropy; it is (usually) defined by the Carnot cycle, which makes no explicit reference to entropy.

A. Neumaier said:
Circular or not, it is objective, and thus entropy is also objective.

Circular definitions are not definitions at all. Entropy, as you define it, may be objective, but its definition cannot be circular.

A. Neumaier said:
Moreover, the concept of temperature and entropy were known long before the advent of statistical mechanics. Don't mistake modern presentations for the only way to set up things.

Yes. Statistical mechanics explains thermodynamic temperature and entropy, but does not define them.
 
  • #147
Rap said:
Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.

If you think it can be kept in equilibrium, describe a process that does it!

In the thermodynamics of irreversible processes, one assumes local equilibrium, i.e., equilibrium in mesoscopic cells (with many more than 45 particles).

Re a comment in another post: Basic definitions are always circular, such as the definition of a group. You assume certain concepts and you describe their relations, but not what they are. It is the same in thermodynamics. It would be impossible to define anything if it were otherwise.
 
  • #148
A. Neumaier said:
If you think it can be kept in equilibrium, describe a process that does it!

Wouldn't a cavity in a block of steel containing 45 atoms of argon qualify? Wait a long time till equilibrium occurs, thermally insulate the block of steel, etc.


A. Neumaier said:
In the thermodynamics of irreversible processes, one assumes local equilibrium, i.e., equilibrium in mesoscopic cells (with many more than 45 particles).

Yes, point taken.

A. Neumaier said:
Re a comment in another post: Basic definitions are always circular, such as the definition of a group. You assume certain concepts and you describe their relations, but not what they are. It is the same in thermodynamics. It would be impossible to define anything if it were otherwise.

Maybe we have different definitions of circular? Can you show how the definition of a group is circular?
 
  • #149
Regarding the definition of entropy, I think that thermodynamic temperature ##T## and thermodynamic entropy ##S## are introduced at the same time - neither is derived from the other.

Consider a simple uniform one-component system. If ##U## is a function of two variables ##V, p##, there always exists an integrating factor ##T(p, V)## which makes the expression

$$
dQ = dU + pdV
$$

into the total differential of a certain function ##S(p,V)##:


$$
\frac{dU}{T} + \frac{p}{T} dV = dS.
$$

The function ##T(p,V)## can be chosen in such a way that it has the value 273.16 at the triple point of water and 0 at the lowest temperature. Once that is done, the changes in temperature and entropy are definite and depend only on the changes of the macroscopic variables ##p, V##.

The entropy has additional freedom in that any constant can be added to it, for example we can choose the entropy function so that it has value zero at pressure 1 atm and volume 1 liter. Once the value of entropy for one state is fixed, both temperature and entropy are definite functions of state. They do not depend on knowledge of a human.

There is another concept of entropy - information entropy, or Gibbs-Jaynes entropy; let's denote it by ##I##. This is a different thing; it is not a function of macroscopic quantities, but a function of a set of probabilities

$$
I(p_k) = - \sum_k p_k \ln p_k.
$$

It so happens that in statistical physics, the thermodynamic entropy is often calculated as the maximum value of I given the macroscopic quantities as constraints on the probabilities ##p_k##. But this does not mean that thermodynamic entropy is the same thing as information entropy.

One could use the information entropy in many diverse situations, even for 45-particle systems and also outside the realm of thermodynamics. Since the probabilities ##p_k## often depend on what one knows about the system, the value of I is not a measurable physical quantity like volume, but rather an auxiliary concept. I think this was the entropy stevendaryl was talking about, so there is no disagreement with what Arnold has said.
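
A small numerical check of the statement above that maximising I subject to macroscopic constraints reproduces thermodynamic entropy (a sketch assuming numpy, with k_B = 1 and made-up energy levels): evaluated at the canonical distribution, the information entropy equals the thermodynamic combination (U - A)/T.

import numpy as np

# Work in units with k_B = 1; the energy levels and beta are arbitrary choices.
E = np.array([0.0, 1.0, 1.0, 2.0, 3.0])   # energy levels (with one degeneracy)
beta = 0.7                                 # inverse temperature

Z = np.sum(np.exp(-beta * E))              # partition function
p = np.exp(-beta * E) / Z                  # canonical (maximum-I) distribution

I = -np.sum(p * np.log(p))                 # Gibbs-Jaynes information entropy
U = np.sum(p * E)                          # mean energy
A = -np.log(Z) / beta                      # Helmholtz free energy

print(I, beta * (U - A))                   # identical up to rounding: I = (U - A)/T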
 
  • #150
Rap said:
Wouldn't a cavity in a block of steel containing 45 atoms of argon qualify? Wait a long time till equilibrium occurs, thermally insulate the block of steel, etc.

What is the meaning of equilibrium here? Why can you assume that after a long time equilibrium will occur?
This is all ill-defined.


Rap said:
Maybe we have different definitions of circular? Can you show how the definition of a group is circular?

The definition of a group is noncircular in terms of ZFC, say. But the definition of ZFC is circular (as it needs a meta-ZFC or so to formulate its definition).

However, the operations product and inverse are circularly defined within a group, and this is what I had meant. Indeed, what are conventionally regarded as circular definitions are in fact just consistency requirements for an object that "has" the defined concepts, in the same way as a group "has" a product and an inverse. Thus there is no logical problem in circular definitions. Circularity is a problem only in logical arguments, as without an independent argument establishing one of the claims, the reasoning is invalid.
 