What is the real second law of thermodynamics?

Discussion Overview

The discussion revolves around the interpretation and implications of the second law of thermodynamics, exploring various statements, theories, and thought experiments related to entropy in closed systems. Participants engage with both classical and statistical thermodynamics, considering the nature of entropy, its increase, and potential decreases under specific conditions.

Discussion Character

  • Exploratory
  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • Some participants assert that the second law states that the entropy of a closed system will increase over time.
  • Others propose that while entropy generally increases, there are scenarios, such as the Poincaré recurrence theorem, where entropy could decrease, given sufficient time.
  • A participant suggests that the second law should be viewed as an expectation rather than a strict rule, emphasizing the probabilistic nature of entropy changes.
  • Concerns are raised about the relationship between classical thermodynamics and statistical thermodynamics, with some noting that the second law is a statistical statement rather than an absolute one.
  • One participant discusses the implications of boundary conditions on differential equations and their relation to the Boltzmann distribution and entropy, expressing confusion over the existence of asymptotic solutions.
  • Another participant mentions the fluctuation-dissipation theorem, suggesting that decreases in entropy can occur, albeit with low probability.
  • There is a discussion about the role of different entropy measures in quantum mechanics, highlighting the complexity of entropy definitions across various frameworks.

Areas of Agreement / Disagreement

Participants express a range of views on the nature of the second law of thermodynamics, with no clear consensus reached. While some agree on the general principle of increasing entropy, others challenge this notion by introducing scenarios where entropy might decrease or discussing the probabilistic interpretation of the law.

Contextual Notes

The discussion includes references to specific theorems and principles, such as the Poincaré recurrence theorem and the fluctuation-dissipation theorem, which introduce additional complexity and assumptions that are not universally accepted or understood among participants.

  • #121


lalbatros said:
It is clear that entropy is not a function of energy in general.

<snip>

It is quite clear that in all these non-equilibrium situations the entropy is perfectly well defined, as is the energy, but the entropy is not a function of the energy anymore. Therefore, temperature cannot be defined.

I will read the paper later.
However, the introduction suggests that temperature could sometimes be defined in non-equilibrium situations.

<snip>

Thanks for the reference,

Michel

My pleasure.

It's possible to recover cleanly defined thermodynamic properties in a nonequilibrium system in certain restricted cases: when the density matrix is block diagonal (if that's the correct term), for example. Conceptually, this is similar to coarse-graining or embedding a dissipative system in a higher-dimensional conservative system.

This only works for linear approximations: the memory of the system is very short (the relaxation time is short), or the Onsager reciprocal relations can be used.

As a more abstract example, we (our bodies) exist in a highly nonequilibrium state: the intracellular concentration of ATP is 10^10 times higher than equilibrium (Nicholls and Ferguson, "Bioenergetics"), which means the system can't be linearized and the above approximation scheme fails. How to assign a temperature? Clearly, there does not have to be a relationship between the "temperature" defined in terms of the distribution function of ATP and 98.6 °F.
 
  • #122


A. Neumaier said:
The units are arbitrary, since the Kelvin is an independent unit defined only by an experimental procedure. If you set k=1, temperature will have the units of inverse energy, and entropy is unitless.

That's one of the differences between Mathematics and Science. Lots of equations can be nondimensionalized (for example, the Navier-Stokes equation), but the scale factors must be retained in order to reproduce experimental results.
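
As a standard illustration (textbook material, not something from the thread): nondimensionalizing the incompressible Navier-Stokes equation with a characteristic length L and velocity U collects the dimensional constants into the Reynolds number,

$$
\partial_{t^*}\mathbf{u}^* + (\mathbf{u}^*\cdot\nabla^*)\mathbf{u}^* = -\nabla^* p^* + \frac{1}{\mathrm{Re}}\nabla^{*2}\mathbf{u}^*,
\qquad \mathrm{Re} = \frac{\rho U L}{\mu},
$$

and that single scale factor Re must still be supplied to reproduce any particular experiment.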
 
  • #123


Andy Resnick said:
That's one of the differences between Mathematics and Science. Lots of equations can be nondimensionalized (for example, the Navier-Stokes equation), but the scale factors must be retained in order to reproduce experimental results.
Sure, but this doesn't change anything of interest.

By the way, not mathematicians but scientists called physicists take c=1 and hbar=1 when they discuss quantum field theory. And they actually express temperature (and distance) in terms of inverse energy, not in meters or kelvins.

Translating to more traditional units is a triviality that can be done (and is done where needed) at the very end.
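
A standard example of such a translation at the end (the numbers are textbook values, not from the thread): a temperature quoted as 1 eV in natural units corresponds, after restoring Boltzmann's constant, to

$$
T = \frac{1\ \mathrm{eV}}{k_B} = \frac{1\ \mathrm{eV}}{8.617\times 10^{-5}\ \mathrm{eV/K}} \approx 1.16\times 10^{4}\ \mathrm{K}.
$$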
 
  • #124


A. Neumaier said:
Sure, but this doesn't change anything of interest.

By the way, not mathematicians but scientists called physicists take c=1 and hbar=1 when they discuss quantum field theory. And they actually express temperature (and distance) in terms of inverse energy, not in meters or kelvins.

Translating to more traditional units is a triviality that can be done (and is done where needed) at the very end.

This scientist (who is occasionally called a Physicist) is familiar with the system of 'natural units'. No scientist I work with (Physicist or otherwise) would ever confuse a mathematical model with the actual physical system. My students often do, as evidenced by comments like "this data doesn't fit the model, so the data is bad".

Models can be simplified to better suit our limited understanding, at the cost of decreased fidelity to the phenomenon under investigation.
 
  • #125


My advisor (a physicist) always suggests that I use natural units, since it's just the general behavior we're studying (it's a more theoretical treatment).

I'm not emotionally comfortable with that, but it makes sense for a journal like Chaos. It's an exploratory science, not an experimental one; more theoretical = more mathematical.

I see a spectrum, not a yes-no situation, but then I'm a dynamicist. My work often involves turning the step function into a hyperbolic tangent.
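
A minimal sketch of that smoothing (the function name and width parameter are invented for illustration, not taken from any particular code):

```python
import numpy as np

def smooth_step(x, width=0.1):
    """Smooth tanh approximation to the Heaviside step function.

    As width -> 0 this approaches the sharp step; a finite width keeps
    the right-hand side of an ODE differentiable for the solver.
    """
    return 0.5 * (1.0 + np.tanh(x / width))

# Compare against the sharp step on a coarse grid.
x = np.linspace(-1.0, 1.0, 11)
sharp = (x > 0).astype(float)
smooth = smooth_step(x, width=0.1)
```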
 
  • #126


Pythagorean said:
My advisor (a physicist) always suggests that I use natural units [...] My work often involves turning the step function into a hyperbolic tangent.
And I guess if you used an arctangent instead, you'd measure the resulting angles in natural units, too, and not in degrees.

Angles in degrees and temperature in degrees are both historical accidents.
An extraterrestrial civilization will not have the same units, but their natural units will be the same.
 
  • #127


Yeah, I don't use natural units. I like to talk to my other advisor (experimental biophysics) about specific values when I'm looking for biological motivation.
 
  • #128


lalbatros said:
The factor is there as a connection between a measure of disorder and a measure of energy.

I admit that I don't know what the focus is in the discussion, but to understand a measure of disorder or information without the classical thermodynamical notions, one still needs a way to quantify data or evidence.

I'd say that what replaces the "energy" in the more general abstract discussion is the amount of data, or sample size. Without a notion of complexity in the microstructure, or a means to COUNT microstates, any measure of disorder is ambiguous.

One can relate Shannon entropy to purely probabilistic settings, where one explicitly calculates how the conditional probability (based on a given prior macrostate) of a certain distribution/macrostate depends on its Shannon entropy. Here a scale factor naturally appears in front of the Shannon entropy, in a term of the form ##e^{M S_\text{Shannon}}##, where M is the complexity or amount of data (the length of the event sequence).
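
One way to make that scale factor explicit (a standard counting identity, not a claim from the post): for ##M## independent samples with empirical frequencies ##p_k = n_k/M##, Stirling's approximation gives

$$
W = \binom{M}{n_1,\dots,n_k} \approx e^{M S_\text{Shannon}},
\qquad S_\text{Shannon} = -\sum_k p_k \ln p_k,
$$

so the number of data sequences compatible with a given macrostate grows exponentially in ##M## times the Shannon entropy.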

So any redefinition of entropy by convention of scaling to units translates to a different relation between entropy and probability. But I think the interesting and central part is the probability anyway. The entropy is just a convenient way to transform one measure into another measure where the combination of independent systems gets additive instead of multiplicative.

So the relativity of entropy is then just a transformed view of the subjective view of probability. The only objection I have to the use of entropy in physics is that the conditional nature of the measure is often forgotten; instead one resorts to somewhat detached abstractions such as fictive ensembles, rather than just sticking to plain counting of evidence, which would be the way to go in a proper inferential view.

/Fredrik
 
  • #129


Andy Resnick said:
According to the fluctuation-dissipation theorem, it regularly does:

http://prl.aps.org/abstract/PRL/v89/i5/e050601
Actually that's the "fluctuation theorem." The fluctuation-dissipation theorem deals with system response to an external force or perturbation away from equilibrium.
 
  • #130


I'm not sure that's a meaningful distinction.
 
  • #131
Just reading through the thread, I wanted to make the point that knowledge does affect entropy. It's a measure of what you don't know. If I have a container of gas, it has a certain entropy. If somehow (staying classical here) I measure the position and momentum of a single particle as a function of time, then the rest of the gas is characterized by an entropy less than the entropy without the measurement, because of the reduced number of particles. Yet the observed particle has no entropy. What happened to the difference in entropies? It's gone, because your knowledge of the system has increased beyond simply knowing the macro thermodynamic parameters.

Also, if you had a gas of 10-pound cannonballs, colliding elastically, without friction (or practically so), and the average distance between them was about a mile, and you had a volume the size of the solar system, I doubt if quantum mechanics would be needed to describe the thermodynamic behavior of this gas.
 
  • #132
Rap said:
Just reading through the thread, I wanted to make the point that knowledge does affect entropy. It's a measure of what you don't know.

The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.
 
  • #133
A. Neumaier said:
The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.

I don't agree - a more instructive case is the entropy of mixing. If you have a container of gases with a partition, both sides at the same temperature and pressure and you remove the partition, if the gases are different, the entropy increases as they mix. If the gases are the same, the entropy stays the same. The thing is, if they are different, but you don't know it, you will calculate their entropy to be the same and never run into problems. From an information-theoretic viewpoint, the entropy IS the same. You will never encounter an inconsistency or problem in any thermodynamic description of a process involving these gases as long as you cannot detect a difference between them. If you do run into an inconsistency, you will have found a way to distinguish them.
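
For the symmetric textbook case (equal amounts N of gas on each side, same temperature and pressure), the standard result for this setup is

$$
\Delta S_\text{mix} = 2Nk\ln 2 \quad\text{(distinguishable gases)},
\qquad
\Delta S_\text{mix} = 0 \quad\text{(identical gases)},
$$

since each distinguishable gas doubles its accessible volume when the partition is removed.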

This was explained in a paper by Jaynes; Google "Jaynes" "The Gibbs Paradox".
 
  • #134
Rap said:
I don't agree - a more instructive case is the entropy of mixing. If you have a container of gases with a partition, both sides at the same temperature and pressure and you remove the partition, if the gases are different, the entropy increases as they mix. If the gases are the same, the entropy stays the same. The thing is, if they are different, but you don't know it, you will calculate their entropy to be the same and never run into problems. From an information-theoretic viewpoint, the entropy IS the same.

No, there are two different entropies in the two cases.

More importantly, if the gases are different, they will _not_ behave the same way in any experiment that can distinguish the gases, independent of whether or not anyone knows it.

Indeed, precisely because of this fact one can actually learn from such an experiment that the gases are different, and thus correct one's ignorance. If there were no observable difference, we would never know - and there would be nothing to know since the alleged difference is in such a case only a figment of the imagination, not a scientific property.
 
  • #135
A. Neumaier said:
The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.

Well, one can certainly compute an entropy associated with the knowledge about a system:

S = k log(W)

where W is the number of microstates consistent with that knowledge (or classically, the volume in phase space of the set of all states consistent with the knowledge). This is numerically identical to thermodynamic entropy in the case where the system is in thermal equilibrium and the "knowledge" is simply the extensive properties of total energy, volume, number of particles.
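
A toy illustration of that dependence on the stated knowledge (an invented example, not from the thread): N classical particles, each of which may sit in the left or right half of a box.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_from_microstates(W, k=k_B):
    """S = k ln W for W microstates consistent with the stated knowledge."""
    return k * math.log(W)

N = 100

# Knowledge 1: only the particle number is known; each particle may be
# in either half, so W = 2**N and S = N k ln 2.
S_coarse = entropy_from_microstates(2**N)

# Knowledge 2: the half occupied by every particle is known, so W = 1 and S = 0.
S_fine = entropy_from_microstates(1)

print(S_coarse, S_fine)
```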
 
  • #136
Andy Resnick said:
I'm not sure that's a meaningful distinction.

Are you talking about the distinction between the fluctuation theorem and the fluctuation-dissipation theorem? Those are two different theorems.

The fluctuation theorem is about statistical fluctuations of entropy.

The fluctuation-dissipation theorem is about the relationship between fluctuations in some state variable and a dissipative force acting on that variable. The paradigm example is Nyquist noise in an electric circuit. A resistor is a dissipative element. The corresponding fluctuation is in voltage: the voltage across a resistor will fluctuate in a way related to the resistance. Another example of the fluctuation-dissipation theorem is Brownian motion: the dissipation here is viscous drag on particles moving through a fluid, and the corresponding fluctuation is the Brownian motion itself.
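
For reference, the two results can be written schematically (standard forms, not quoted from the thread) as

$$
\frac{P(\Delta S = +A)}{P(\Delta S = -A)} = e^{A/k}
\qquad\text{(fluctuation theorem)},
$$

$$
\langle V^2\rangle = 4 k_B T R\,\Delta f
\qquad\text{(fluctuation-dissipation theorem, Nyquist form)},
$$

the first comparing the probabilities of entropy-producing and entropy-consuming trajectories, the second tying the voltage fluctuations across a resistor in a bandwidth ##\Delta f## to its dissipation ##R##.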
 
  • #137
stevendaryl said:
Well, one can certainly compute an entropy associated with the knowledge about a system:

S = k log(W)

where W is the number of microstates consistent with that knowledge (or classically, the volume in phase space of the set of all states consistent with the knowledge). This is numerically identical to thermodynamic entropy in the case where the system is in thermal equilibrium and the "knowledge" is simply the extensive properties of total energy, volume, number of particles.

Sure, but this simply means that you get the correct entropy precisely when your knowledge is the physically correct one that describes everything there is to know about the system.

If your knowledge is something different, you get a different entropy, but the system doesn't behave according to your knowledge but still according to what it really is.

So, no matter what someone's knowledge is, it has no effect on the physical entropy of the system, but only on the subjective entropy the knower thinks the system has.
 
  • #138
A. Neumaier said:
So, no matter what someone's knowledge is, it has no effect on the physical entropy of the system, but only on the subjective entropy the knower thinks the system has.

Hmm. I guess I'm not completely convinced that there is such a thing as physical entropy. Suppose we're doing classical physics, and our system is just 5 particles bouncing around a box. Then there is no reason to bring up the thermodynamic concepts of temperature and entropy. But if we expand that to 5000 or 50,000, or 50,000,000,000 particles, then the description of the system in terms of particles with definite positions and velocities just becomes completely unwieldy. So if we give up precise predictions, we can make approximate predictions by using thermodynamical quantities: total energy, pressure, entropy, etc. But to me, the entropy is an artifact of how we're modeling that collection of particles; it's not objective. If there is no objective notion of entropy for 5 particles in a box, I can't see how there can be an objective notion of entropy for 50 trillion particles.
 
  • #139
A. Neumaier said:
If your knowledge is something different, you get a different entropy, but the system doesn't behave according to your knowledge but still according to what it really is.

Another point about what you said. You talk in terms of the system behaving according to my knowledge. I'm not sure what that means. Is the behavior that you're talking about the second law of thermodynamics? The way I see it (in classical physics, anyway) is that if we knew exactly the positions and velocities of all the particles, and we had unlimited computational resources, we could predict the future states of the system without ever invoking thermodynamics. To me, thermodynamics comes into play when we only have gross measures of the state, and we want to make gross predictions about future behavior. So we're trying to do physics at a gross level, with macroscopic inputs and macroscopic outputs. Entropy is in some sense a measure of the neglected level of detail that we left out of the gross macroscopic description. It's hard for me to see how that is a physically objective quantity.
 
  • #140
It's the long argument between thermodynamic entropy and information entropy.

Think of the 5-particle case for example. There is the empirical definition of entropy, carried out by measurements, and there is the statistical mechanical (or information theoretic) explanation of that definition.

In the 5-particle case, how do you measure entropy (or, more precisely, an entropy change)? You give the container one movable wall with some mass and apply a constant external force to it. That gives it a "constant" pressure. You can now measure the volume, and it will be constantly changing: every time a particle hits the wall, the wall gets kicked upward; otherwise it accelerates downward. The volume fluctuates around some average value. You know P, V, N, and k, and assuming the gas is ideal, you get the temperature T=PV/Nk. Because the volume is fluctuating, the temperature fluctuates too, about some average value.

Now you add an increment of energy to the system, without changing the pressure (isochoric work), say with a paddle wheel that kicks one of the particles just a little bit. That's your measured dU. Since dV=0 on average, dU=T dS gives you dS, the change in entropy. Since T is fluctuating, dS will fluctuate too, about an average. Not sure if that is exactly correct, but you get the idea.
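
In numbers, the recipe in the previous two paragraphs is just the ideal-gas arithmetic below; all the values are invented for illustration.

```python
k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Invented example values for the small gas behind the weighted wall.
P = 1.0e5      # average pressure set by the external force, Pa
V = 1.0e-25    # average volume of the container, m^3
N = 5          # number of particles

T = P * V / (N * k_B)   # temperature from the ideal-gas equation of state
dU = 1.0e-22            # small energy increment added by the paddle wheel, J
dS = dU / T             # entropy change, using dU = T dS with dV = 0 on average

print(T, dS)
```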

Going to stat mech: the entropy is k ln W, where W is the number of microstates that could give rise to the specified P, V, T of the initial system, and dS = k dW/W. This is information theory. For example, you could say S = k ln(2) Y, where Y = log2(W) (log2 is the logarithm to base 2). Y is then the number of yes/no questions you have to ask to determine the microstate (actually the number of yes/no questions according to the "best" algorithm, in which each question splits the number of ways in half). This reminds me that the stat mech explanation is information theoretic.

Anyway, the two expressions will match, at least on average. If you follow just one of the particles, you will have increased your knowledge of the system beyond that of just P, V, T. The statmech guy will say the number of questions has decreased, therefore the entropy has decreased. The fluctuations in T, V, and dS can then be correlated somewhat with the fluctuations in the position and velocity of the observed particle (assume classical for simplicity). The thermo guy will say no, this extra knowledge is "out of bounds" with respect to thermodynamics. So the statmech guy's and the thermo guy's definitions do not match, except when the statmech guy's definition stays "in bounds". If we stay "in bounds", then entropy is objective. But I have no problem wandering out of bounds, just to see what happens.

The whole problem of the extensivity of entropy and Boltzmann counting is solved by this. The thermodynamicist simply declares that drawing a distinction between like particles is out of bounds. The fact that quantum mechanics says this is true in principle in the quantum regime is really irrelevant. You can have the thermodynamics of a gas of elastically colliding cannonballs and declare distinguishing them out of bounds, and you're good to go.

Regarding the entropy of mixing, if you have two different particles, and the thermo guy declares that distinguishing their difference is out of bounds, and the statmech guy says that the knowledge of their difference is unavailable, then their definitions match, entropy is objective, and the theory works. If the thermo guy doesn't yet have the ability to distinguish them, and the statmech guy says that any knowledge of their difference is unavailable, then their definitions match, entropy is objective, and the theory works. If the thermo guy can distinguish the difference without going out of his pre-established bounds (i.e. by examining particles one at a time), then the statmech guy says this knowledge is available, entropy is objective, and the theory works. If you have a gas of red and blue elastic cannonballs, and their color does not affect how they behave in a collision, and you accept that their color can be determined without appreciably affecting their velocity and momentum, then you can have a disagreement. The thermo guy will declare such measurements out of bounds, while the statmech guy will say that knowledge is available. The thermo guy's theory will work, the statmech guy's theory will work, but they will make different calculations and different predictions.

The gas of cannonballs might be a gas of stars in a stellar cluster, and then it's best to wear two hats.
 
  • #141
stevendaryl said:
But to me, the entropy is an artifact of how we're modeling that collection of particles; it's not objective. If there is no objective notion of entropy for 5 particles in a box, I can't see how there can be an objective notion of entropy for 50 trillion particles.

The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

The entropy of a tiny system of 45 molecules is not so well-defined, as the tiny system cannot be kept in equilibrium, and entropy is (in the textbook framework) an equilibrium property.
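
For completeness, the relation quoted above follows from the standard definition of the Helmholtz free energy:

$$
A = U - TS, \qquad dA = -S\,dT - p\,dV
\quad\Rightarrow\quad
S = -\left(\frac{\partial A}{\partial T}\right)_V .
$$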
 
  • #142
A. Neumaier said:
The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.
 
  • #143
A. Neumaier said:
The entropy of a gallon of fluid can be calculated from the Helmholtz free energy A as S=-dA/dT, and A can be determined objectively by measurements. This is routinely and reliably done in engineering applications.

So what should not be objective about it?

The entropy of a tiny system of 45 molecules is not so well-defined, as the tiny system cannot be kept in equilibrium, and entropy is (in the textbook framework) an equilibrium property.

Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.

stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

In large systems (on the order of Avogadro's number of particles or larger), it's fine to assume you have a thermometer which is small with respect to the system (so it does not appreciably affect it) yet large enough to have negligible fluctuations. For a system of 45 particles, using a mercury thermometer, you don't measure the temperature, you set it to the temperature of the thermometer. I think it is better to use the system itself as a thermometer by allowing it to vary its volume under fixed pressure, both of which are measurable, in principle. If you know the equation of state (e.g. ideal gas PV=NkT), then you can calculate the temperature. That's what I did in the perhaps too-long post above.
 
  • #144
stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

Circular or not, it is objective, and thus entropy is also objective.

Moreover, the concepts of temperature and entropy were known long before the advent of statistical mechanics. Don't mistake modern presentations for the only way to set up things.
 
  • #145
Rap said:
Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.



In large systems (on the order of Avogadro's number of particles or larger), it's fine to assume you have a thermometer which is small with respect to the system (so it does not appreciably affect it) yet large enough to have negligible fluctuations. For a system of 45 particles, using a mercury thermometer, you don't measure the temperature, you set it to the temperature of the thermometer. I think it is better to use the system itself as a thermometer by allowing it to vary its volume under fixed pressure, both of which are measurable, in principle. If you know the equation of state (e.g. ideal gas PV=NkT), then you can calculate the temperature. That's what I did in the perhaps too-long post above.

We were originally talking about a gallon of raw oil, for which I made my assertion. Certainly temperature is well-defined there for scientific use.

With 45 molecules, you only have a nanosystem, which behaves quite differently from a system in equilibrium. The concept of temperature is hardly applicable there, at least not in the conventional sense.

The equation of state is a concept valid only in the thermodynamic limit, i.e., when the number of molecules is huge.
 
  • #146
stevendaryl said:
Isn't that a little circular? Isn't temperature defined in terms of entropy? At least, it is in terms of statistical mechanics. I suppose you could just give an operational definition of temperature, in terms of whatever a thermometer reads.

Temperature is not defined in terms of entropy, it is (usually) defined by the Carnot cycle, which makes no explicit reference to entropy.
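
Concretely, the Carnot-cycle construction fixes only the ratio of two temperatures through the heats exchanged reversibly with the two reservoirs,

$$
\frac{Q_h}{Q_c} = \frac{T_h}{T_c},
$$

and a single reference value (such as 273.16 K for the triple point of water) then fixes the scale.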

A. Neumaier said:
Circular or not, it is objective, and thus entropy is also objective.

Circular definitions are not definitions at all. Entropy, as you define it, may be objective, but its definition cannot be circular.

A. Neumaier said:
Moreover, the concept of temperature and entropy were known long before the advent of statistical mechanics. Don't mistake modern presentations for the only way to set up things.

Yes. Statistical mechanics explains thermodynamic temperature and entropy, but does not define them.
 
  • #147
Rap said:
Why, in principle, can't a system of 45 particles be kept in equilibrium? Also, entropy cannot be defined only for equilibrium, or else there would be no use of the concept of entropy in the study of irreversible processes, in which entropy creation rates are calculated.

If you think it can be kept in equilibrium, describe a process that does it!

In the thermodynamics of irreversible processes, one assumes local equilibrium, i.e., equilibrium in mesoscopic cells (with many more than 45 particles).

Re a comment in another post: Basic definitions are always circular, such as the definition of a group. You assume certain concepts and you describe their relations, but not what they are. It is the same in thermodynamics. It would be impossible to define anything if it were otherwise.
 
  • #148
A. Neumaier said:
If you think it can be kept in equilibrium, describe a process that does it!

Wouldn't a cavity in a block of steel containing 45 atoms of argon qualify? Wait a long time till equilibrium occurs, thermally insulate the block of steel, etc.


A. Neumaier said:
In the thermodynamics of irreversible processes, one assumes local equilibrium, i.e., equilibrium in mesoscopic cells (with many more than 45 particles).

Yes, point taken.

A. Neumaier said:
Re a comment in another post: Basic definitions are always circular, such as the definition of a group. You assume certain concepts and you describe their relations, but not what they are. It is the same in thermodynamics. It would be impossible to define anything if it were otherwise.

Maybe we have different definitions of circular? Can you show how the definition of a group is circular?
 
  • #149
Regarding the definition of entropy: I think that thermodynamic temperature ##T## and thermodynamic entropy ##S## are introduced at the same time; neither is derived from the other.

Consider a simple uniform one-component system. If ##U## is a function of the two variables ##p, V##, there always exists an integrating factor ##T(p,V)## which turns the expression

$$
dQ = dU + pdV
$$

into the total differential of a certain function ##S(p,V)##:


$$
\frac{dU}{T} + \frac{p}{T} dV = dS.
$$

The function ##T(p,V)## can be chosen in such a way that it has the value 273.16 at the triple point of water and 0 at the lowest temperature. Once that is done, the changes in temperature and entropy are definite and depend only on the changes of the macroscopic variables ##p, V##.

The entropy has an additional freedom in that any constant can be added to it; for example, we can choose the entropy function so that it has the value zero at a pressure of 1 atm and a volume of 1 liter. Once the value of the entropy for one state is fixed, both temperature and entropy are definite functions of state. They do not depend on the knowledge of a human.
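
As a worked example of this construction (the monatomic ideal gas, a standard textbook case): with ##U = \tfrac{3}{2}NkT## and ##pV = NkT##,

$$
dS = \frac{dU}{T} + \frac{p}{T}\,dV = \frac{3}{2}Nk\,\frac{dT}{T} + Nk\,\frac{dV}{V}
\quad\Rightarrow\quad
S = Nk\,\ln\!\left(V\,T^{3/2}\right) + \text{const},
$$

so ##1/T## indeed acts as an integrating factor and ##S## is fixed once one reference value is chosen.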

There is another concept of entropy: information entropy, or Gibbs-Jaynes entropy; let's denote it by ##I##. This is a different thing; it is not a function of macroscopic quantities, but a function of a set of probabilities

$$
I(p_k) = -\sum_k p_k \ln p_k,
$$

It so happens that in statistical physics, the thermodynamic entropy is often calculated as the maximum value of I given the macroscopic quantities as constraints on the probabilities ##p_k##. But this does not mean that thermodynamic entropy is the same thing as information entropy.

One could use the information entropy in many diverse situations, even for 45-particle systems and also outside the realm of thermodynamics. Since the probabilities ##p_k## often depend on what one knows about the system, the value of ##I## is not a measurable physical quantity like volume, but rather an auxiliary concept. I think this was the entropy stevendaryl was talking about, so there is no disagreement with what Arnold has said.
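
The connection mentioned above can be made explicit (a standard maximum-entropy calculation): maximizing ##I## subject to normalization and a fixed mean energy ##\sum_k p_k E_k = U## gives the canonical distribution,

$$
p_k = \frac{e^{-\beta E_k}}{\sum_j e^{-\beta E_j}},
$$

and the maximal value of ##k\,I## then coincides with the thermodynamic entropy, with the Lagrange multiplier ##\beta = 1/kT##.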
 
  • #150
Rap said:
Wouldn't a cavity in a block of steel containing 45 atoms of argon qualify? Wait a long time till equilibrium occurs, thermally insulate the block of steel, etc.

What is the meaning of equilibrium here? Why can you assume that after a long time equilibrium will occur?
This is all ill-defined.


Rap said:
Maybe we have different definitions of circular? Can you show how the definition of a group is circular?

The definition of a group is noncircular in terms of ZFC, say. But the definition of ZFC is circular (as it needs a meta-ZFC or so to
formulate its definition.)

However, the operations product and inverse are circularly defined within a group, and this is what I had meant. Indeed, what are conventionally regarded as circular definitions are in fact just consistency requirements for an object that ''has'' the defined concepts, in the same way as a group ''has'' a product and an inverse. Thus there is no logical problem in circular definitions. Circularity is a problem only in logical arguments, where the reasoning is invalid without an independent argument establishing one of the claims.
 
