What is the real second law of thermodynamics?

In summary: assuming you're asking about the entropy of an isolated system that is not yet in equilibrium, then it is absolutely true that the entropy will increase over time.
  • #106


Andrew Mason said:
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of Joules/Kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM

I think he's referring to the case of ground state degeneracy.
 
  • #107


Andrew Mason said:
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of Joules/Kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM

The Boltzmann constant doesn't appear in the H-function or in the H-theorem, and it doesn't even appear in the Boltzmann equation.

The Boltzmann equation can, within its range of validity, predict the evolution of any distribution. The distribution doesn't need to be a Maxwellian, and therefore temperature simply doesn't play any role in the Boltzmann equation, the H-function, or the H-theorem. The same applies to the other master equations of statistical mechanics, except when they are specialized to near-equilibrium solutions. The Maxwellian distribution comes into play as a special stationary solution, and for this special solution the temperature can be taken into consideration. The Boltzmann constant is introduced in the Boltzmann S-function for mere convenience. By doing so, the results for (near-)equilibrium distributions can be compared with thermodynamics.
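To make this concrete (using the standard textbook conventions, which I assume here), Boltzmann's H-function for a one-particle distribution f(x,v,t) is

[itex]H(t) = \int f(\mathbf{x},\mathbf{v},t)\,\ln f(\mathbf{x},\mathbf{v},t)\,d^3x\,d^3v[/itex]

and the H-theorem states [itex]dH/dt \le 0[/itex] for solutions of the Boltzmann equation. The Boltzmann constant appears nowhere in this; it only enters when, for the Maxwellian equilibrium solution, one identifies [itex]S = -k_B H[/itex] (up to an additive constant) in order to match the thermodynamic entropy tables.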

This illustrates my point that the second law is more like an "engineering heuristic".
In addition, it also shows that the classical statements of the second law (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Description) are only a very special case of a much broader "second law". The H-theorem applies to the thermalisation of particle distributions, which is not within the scope of the "second law" of thermodynamics as formulated by Clausius, Kelvin, or Carathéodory.

Concerning the second law (of thermodynamics), it is the basis for constructing the thermodynamic scale of temperature.
Having defined this scale, the recipe for building entropy tables is the famous law dS=dQ/T, where the temperature factor testifies to an assumption of equilibrium. Note that two of the classical statements (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Description) make no explicit reference to temperature.

In summary, there are many overlapping definitions of entropy!
 
  • #108


lalbatros said:
The Boltzmann constant doesn't appear in the H-function or in the H-theorem, and it doesn't even appear in the Boltzmann equation.

I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either; S = -kH.
 
  • #109


Andy Resnick said:
I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either; S = -kH.

Up to the sign and an additive constant, H is the entropy in units where the Boltzmann constant is set to 1. That it has a value different from 1 is due to historical accidents in the definition of the temperature scale, similar to the accidents that make the speed of light or the Planck constant not exactly 1.
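As a concrete illustration of the conversion (just the standard SI value of the constant): [itex]k_B \approx 1.38 \times 10^{-23}\ \mathrm{J/K}[/itex], so room temperature [itex]T = 300\ \mathrm{K}[/itex] corresponds to [itex]k_B T \approx 4.1 \times 10^{-21}\ \mathrm{J} \approx 0.026\ \mathrm{eV}[/itex]; in units with [itex]k_B = 1[/itex] one would simply quote that temperature as about 0.026 eV.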
 
  • #110


lalbatros said:
It just tells us how we play dice.

Life is a game, what else do you ask for besides a rational strategy? ;)

You can see a THEORY as a descriptive model, or, as an INTERACTION TOOL in an inference perspective.

A descriptive model is falsified or corroborated. Corroborated theories live on. Falsified theories drop dead, leaving no clue as to how to improve.

An interaction tool for inference is different: it either adapts and learns, or it doesn't. Here, falsification corresponds to a "failure to learn". The hosting inference system will be outcompeted by a more clever competitor. This may be one reason to suspect that the laws of physics we actually find in nature do have inferential status.

It's much more than playing dice.

/Fredrik
 
  • #111


To think that we can DESCRIBE the future is, IMHO, a very irrational illusion.

All we have, are expectations of the future, based on the present (including present records of the past), and based upon this we have to throw our dice. There is no other way.

In this respect, the second law is one of the few "laws" that is already cast in a proper inferential form as is.

Can anyone seriously suggest that you, say, understand Newton's law of gravity, but do not understand the second law? If one of them is mysterious, I can't see how it's the second law.

/Fredrik
 
  • #112


Fra said:
To think that we can DESCRIBE the future is, IMHO, a very irrational illusion.
Both future and past are illusions - real is only the present (if anything at all).

Nevertheless, it is often easier to predict the future than the past. The past of a stirred fluid coming slowly to rest is far more unpredictable than the final rest state.
 
  • #113


Andy Resnick said:
I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either; S = -kH.

I simply don't see what could be obtained from a dimensional argument.
The thermodynamic dimension of entropy is purely conventional.

The factor is there as a connection between a measure of disorder and a measure of energy.
Nevertheless, disorder can be defined without any relation to energy.
The historical path to entropy doesn't imply that entropy requires any concept of thermodynamics.

The widespread use of entropy today has clearly shown that it is not an exclusively thermodynamic concept. We also know that entropy finds a wide range of application in thermodynamics.
It should be no surprise that the use of entropy in thermodynamics requires a conversion factor. This factor converts a measure of disorder to the width of a Maxwellian distribution.
 
  • #115


A. Neumaier said:
H is the entropy in units where the Boltzmann constant is set to 1. That it has a different value than 1 is due to historical accidents in the definition of the temperature scale, similar to accidents that make the speed of light or the Planck constant not exactly 1.

Not exactly- you may set the numerical value of k to 1, but there are still units. Temperature can be made equivalent to energy. One is not primary over the other, just as length and time are equivalent.

Even so, that only holds for equilibrium: thermostatics or thermokinetics. It does not hold in the fully nonequilibrium case. Jou, Casas-Vazquez, and Lebon's "Extended Irreversible Thermodynamics" and Truesdell's "Rational Thermodynamics" both have good discussions of this.
 
  • #117


Andy Resnick said:
That is true- but then what is the temperature of a sandpile?

http://rmp.aps.org/abstract/RMP/v64/i1/p321_1
http://pra.aps.org/abstract/PRA/v42/i4/p2467_1
http://pra.aps.org/abstract/PRA/v38/i1/p364_1

it's not a simple question to answer.

I have at least two reasons to enjoy this type of sandpile physics:

  1. I work for the cement industry, and there are many granular materials there:
    limestone, chalk (made of coccoliths!), sand, clay, slag, fly ashes, ground coal, rubber chips, plastic pellets, ...
  2. I enjoyed reading the book by Pierre-Gilles de Gennes on granular matter (https://www.amazon.com/dp/0387986561/?tag=pfamazon01-20).
It is true that the (excited) avalanche phenomenon near the critical repose angle has an analogy with a barrier-crossing phenomenon that can be associated with an equivalent temperature (fig. 85.c in https://www.amazon.com/dp/0387986561/?tag=pfamazon01-20). I guess this temperature might indeed represent the probability distribution of the grain energies acquired from the external excitation.
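If I had to put the analogy into a formula (my own sketch of the barrier-crossing picture, not something taken from the book), it would be the usual activated form

[itex]\Gamma \propto \exp(-\Delta E / k T_{\mathrm{eff}})[/itex]

where [itex]\Gamma[/itex] is the rate at which grains are kicked over the barrier [itex]\Delta E[/itex] near the repose angle, and [itex]T_{\mathrm{eff}}[/itex] parametrizes the strength of the external excitation rather than the ordinary temperature of the material.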

Obviously, this is again an example taken from (statistical) mechanics.
Therefore, the entropy that one might consider here is again related to the distribution of energy.
And therefore this is one more energy-related entropy.

If we consider that any information, in the end, needs a physical substrate to be stored, then effectively the whole world is mechanical and, in the end, any entropy could be related to an energy distribution.
As long as there are no degenerate states, of course ...
So the question about entropy and energy could be translated into:

How much information is stored in degenerate states compared to how much is stored on energy levels? (in our universe)

My guess goes for no degeneracy.
Meaning that the history of physics has been right on this point since Boltzmann: it would make sense to give energy dimensions to entropy!
 
  • #118


lalbatros said:
I have at least two reasons to enjoy this type of sandpile physics:

Glad to hear it- I find it interesting as well. I was involved with a few soft-matter groups when I worked at NASA.


lalbatros said:
Obviously, this is again an example taken from (statistical) mechanics.
Therefore, the entropy that one might consider here is again related to the distribution of energy.
And therefore this is one more energy-related entropy.

I think you still misunderstand me- what you say above is of course correct, but I think you missed my main point, which is that temperature and entropy should not be simply equated with energy. Entropy is energy/degree, and so there is an essential role for temperature in the entropy.

How about this example- laser light. Even though laser light has an exceedingly well-defined energy, it has *no* temperature:

http://arxiv.org/abs/cond-mat/0209043

They specifically address the difficulty in assigning a temperature and an entropy to an out-of-equilibrium system:

"Out of equilibrium, the entropy S lacks a clear functional dependence on the total energy
E, and the definition of T becomes ambiguous."

Again, laser light is a highly coherent state, is resistant to thermalization in spite of interacting with the environment, has a well defined energy and momentum, and yet has no clear entropy or temperature.
 
  • #119


Andy Resnick said:
They specifically address the difficulty in assigning a temperature and an entropy to an out-of-equilibrium system:

"Out of equilibrium, the entropy S lacks a clear functional dependence on the total energy
E, and the definition of T becomes ambiguous."

I think there should be no problem defining the entropy, even though the temperature might be totally undefined.

It is clear that entropy is not a function of energy in general.
Just consider the superposition of two bell-shaped distributions.
What is the "temperature" of this distribution?
Even when the two distributions are Maxwellians, you would still be forced to describe the global distribution by three numbers: two temperatures and the % of each distribution in the total.
This is a very common situation.
Very often there are several populations that do not thermalize even when reaching a steady state (open system).
For example the electron and ion temperatures are generally very different in a tokamak.
Even different ion species might have different distributions in a tokamak, especially heavy ions with respect to light ions.
There might even be two populations of electrons, not to mention runaway electrons in extreme situations.
It is quite clear that in all these non-equilibrium situations the entropy is perfectly well defined, as is the energy, but the entropy is no longer a function of the energy. Therefore, temperature cannot be defined.
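To illustrate this numerically, here is a toy 1D sketch in Python (with m = k = 1; the grid and the temperatures are just illustrative choices, not tokamak parameters):

[code]
import numpy as np

def maxwellian(v, T):
    """1D Maxwellian velocity distribution with m = 1 and k = 1."""
    return np.exp(-v**2 / (2.0 * T)) / np.sqrt(2.0 * np.pi * T)

def entropy_and_energy(f, v):
    """Differential entropy -sum(f ln f) dv and mean kinetic energy sum((v^2/2) f) dv."""
    dv = v[1] - v[0]
    S = -np.sum(f * np.log(f + 1e-300)) * dv   # tiny offset avoids log(0) in the tails
    E = np.sum(0.5 * v**2 * f) * dv
    return S, E

v = np.linspace(-30.0, 30.0, 60001)

# A single Maxwellian at T = 5.5 ...
f_single = maxwellian(v, 5.5)
# ... and a 50/50 cold + hot mixture (T = 1 and T = 10) with the same mean energy.
f_mix = 0.5 * maxwellian(v, 1.0) + 0.5 * maxwellian(v, 10.0)

for name, f in [("single Maxwellian", f_single), ("two-temperature mixture", f_mix)]:
    S, E = entropy_and_energy(f, v)
    print(f"{name:25s}  mean energy = {E:.3f}   entropy = {S:.3f}")
[/code]

Both distributions have the same mean energy but different entropies (the Maxwellian, being the maximum-entropy distribution at fixed energy, has the larger one), so no single temperature describes the mixture.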

I will read the paper later.
However, the introduction suggests that temperature could be sometimes defined in non-equilibrium situations.
I agree with that, with the temporary naive idea that this will be the case when, at least approximately, S=S(E).
One can easily build artificial examples.
For example, one could constrain a distribution to be Lorentzian instead of Maxwellian, or any suitable one-parameter distribution. Within this constraint, S would be a function of E via the one parameter defining the distribution. Temperature would then be defined in this situation.
I am curious to see a more physical example in the paper.
I am also curious to think about which "thermodynamic relations" would still hold and which should be removed, if any.

Thanks for the reference,

Michel
 
  • #120


Andy Resnick said:
Not exactly- you may set the numerical value of k to 1, but there are still units. Temperature can be made equivalent to energy. One is not primary over the other, just as length and time are equivalent.
The units are arbitrary, since the kelvin is an independent unit defined only by an experimental procedure. If you set k=1, temperature will have the units of energy, and entropy is unitless.
Andy Resnick said:
Even so, that only holds for equilibrium: thermostatics or thermokinetics. It does not hold in the fully nonequilibrium case. Jou, Casas-Vazquez, and Lebon's "Externded Irreversible Thermodynamics" and Truesdell's "Rational Thermodynamics" both have good discussions about this.
The hydrodynamic level of nonequilibrium thermodynamics is characterized by local equilibrium (in position space). On this level, dynamics is governed by hydrodynamics, variants of the Navier-Stokes equations. Here temperature exists, being defined as the equilibrium temperature of an (in the limit, infinitesimal) cell. Or, formally, as the inverse of the quantity conjugate to energy.

A more detailed level is the kinetic level, characterized by microlocal equilibrium (in phase space). On this level, dynamics is governed by kinetic equations, variants of the Boltzmann equation. Entropy still exists, being defined even on the more detailed quantum level. Temperature does not exist on this level, but appears as an implicit parameter field in the hydrodynamic limit: The kinetic dynamics is approximated in a local equilibrium setting, by assuming that the local momentum distribution is Maxwellian. The temperature is a parameter in the Gaussian local momentum distribution, and makes no sense outside the Gaussian approximation.
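For concreteness, the local-equilibrium ansatz meant here is the standard one (written in the usual textbook notation):

[itex]f(\mathbf{x},\mathbf{v},t) \approx n(\mathbf{x},t)\left(\frac{m}{2\pi k T(\mathbf{x},t)}\right)^{3/2} \exp\!\left(-\frac{m\,|\mathbf{v}-\mathbf{u}(\mathbf{x},t)|^2}{2 k T(\mathbf{x},t)}\right)[/itex]

where the local density n, flow velocity u, and temperature T are the hydrodynamic fields. Outside this Gaussian form there is simply no T to read off.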
 
  • #121


lalbatros said:
It is clear that entropy is not a function of energy in general.

<snip>

It is quite clear that in all these non-equilibrium situations the entropy is perfectly well defined, as is the energy, but the entropy is no longer a function of the energy. Therefore, temperature cannot be defined.

I will read the paper later.
However, the introduction suggests that temperature could be sometimes defined in non-equilibrium situations.

<snip>

Thanks for the reference,

Michel

My pleasure.

It's possible to recover cleanly defined thermodynamic properties in a nonequilibrium system in certain restricted cases: when the density matrix is block diagonal (if that's the correct term), for example. Conceptually, this is similar to coarse-graining or embedding a dissipative system in a higher-dimensional conservative system.

This only works for linear approximations- when the memory of the system is very short (the relaxation time is short), or when the Onsager reciprocal relations can be used.

As a more abstract example; we (our bodies) exist in a highly nonequilibrium state: the intracellular concentration of ATP is 10^10 times higher than equilibrium (Nicholls and Ferguson, "Bioenergetics"), which means the system can't be linearized and the above approximation scheme fails. How to assign a temperature? Clearly, there does not have to be a relationship between the "temperature" defined in terms of the distribution function of ATP and 98.6 F.
 
  • #122


A. Neumaier said:
The units are arbitrary, since the kelvin is an independent unit defined only by an experimental procedure. If you set k=1, temperature will have the units of energy, and entropy is unitless.

That's one of the differences between Mathematics and Science. Lots of equations can be nondimensionalized- for example the Navier-Stokes equation- but the scale factors must be retained in order to reproduce experimental results.
 
  • #123


Andy Resnick said:
That's one of the differences between Mathematics and Science. Lots of equations can be nondimensionalized- for example the Navier-Stokes equation- but the scale factors must be retained in order to reproduce experimental results.
Sure, but this doesn't change anything of interest.

By the way, it is not mathematicians but scientists called physicists who take c=1 and hbar=1 when they discuss quantum field theory. And they actually express temperature in terms of energy and distance in terms of inverse energy, not in kelvin or meters.

Translating to more traditional units is a triviality that can be done (and is done where needed) at the very end.
 
  • #124


A. Neumaier said:
Sure, but this doesn't change anything of interest.

By the way, it is not mathematicians but scientists called physicists who take c=1 and hbar=1 when they discuss quantum field theory. And they actually express temperature in terms of energy and distance in terms of inverse energy, not in kelvin or meters.

Translating to more traditional units is a triviality that can be done (and is done where needed) at the very end.

This scientist (who is occasionally called a Physicist) is familiar with the system of 'natural units'. No scientist I work with (Physicist or otherwise) would ever confuse a mathematical model with the actual physical system. My students often do, as evidenced by comments like "this data doesn't fit the model, so the data is bad".

Models can be simplified to better suit our limited understanding, at the cost of decreased fidelity to the phenomenon under investigation.
 
  • #125


My advisor (a physicist) always suggests that I use natural units, since it's just the general behavior we're studying (it's a more theoretical treatment).

I'm not emotionally comfortable with that, but it makes sense for a journal like Chaos. It's an exploratory science, not an experimental one; more theoretical = more mathematical.

I see a spectrum, not a yes-no situation, but then I'm a dynamicist. My work often involves turning the step function into a hyperbolic tangent.
 
  • #126


Pythagorean said:
My advisor (a physicist) always suggests that I use natural units [...] My work often involves turning the step function into a hyperbolic tangent.
and I guess that if you used an arc tangent instead, you would express the resulting angles in natural units, too, and not in degrees.

Angles in degrees and temperature in degrees are both historical accidents.
An extraterrestrial civilization will not have the same units - but their natural units will be the same.
 
  • #127


Yeah, I don't use natural units. I like to talk to my other advisor (experimental biophysics) about specific values when I'm looking for biological motivation.
 
  • #128


lalbatros said:
The factor is there as a connection between a measure of disorder and a measure of energy.

I admit that I don't know what the focus of the discussion is, but to understand a measure of disorder or information without the classical thermodynamic notions, one still needs a way to quantify data or evidence.

I'd say that what replaces the "energy" in the more general abstract discussion is the amount of data, or sample size. Without a notion of complexity in the microstructure, or a means to COUNT microstates, any measure of disorder is ambiguous.

One can relate Shannon entropy to purely probabilistic settings where one explicitly calculates how the conditional probability (based on a given prior macrostate) of a certain distribution/macrostate depends on its Shannon entropy. Here a scale factor appears naturally in front of the Shannon entropy, in an e^(M S_Shannon) term, where M is the complexity or amount of data (the length of the event sequence).
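A small numerical sketch of this counting (my own toy example, in Python): the number W of length-M sequences with a fixed empirical distribution p grows like e^(M S_Shannon), so log(W)/M approaches the Shannon entropy of p as M grows.

[code]
import numpy as np
from math import lgamma

def log_multinomial(counts):
    """log of the number of length-M sequences having exactly these symbol counts."""
    M = sum(counts)
    return lgamma(M + 1) - sum(lgamma(c + 1) for c in counts)

def shannon_entropy(p):
    """Shannon entropy in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

p = [0.5, 0.3, 0.2]                  # the "macrostate": a fixed empirical distribution
print("Shannon entropy of p:", shannon_entropy(p))
for M in (10, 100, 1000, 100000):
    counts = [round(pi * M) for pi in p]
    counts[0] += M - sum(counts)     # force the counts to add up to M exactly
    print(f"M = {M:6d}   log(W)/M = {log_multinomial(counts) / M:.4f}")
[/code]

The ratio converges to the Shannon entropy, which is exactly the e^(M S_Shannon) scale factor mentioned above: the weight of a macrostate is governed by the count of microstates behind it, and the entropy is just the per-sample logarithm of that count.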

So any redefinition of entropy by convention of scaling to units translates to a different relation between entropy and probability. But I think the interesting and central part is the probability anyway. The entropy is just a convenient way to transform one measure into another measure where the combination of independent systems gets additive instead of multiplicative.

So the relativity of entropy is then just a transformed view of the subjective view of probability. The only objection I have to the use of entropy in physics is that the conditional nature of the measure is often forgotten, and one instead resorts to somewhat detached abstractions such as fictive ensembles, instead of just sticking to plain counting of evidence, which would be the way to go in a proper inferential view.

/Fredrik
 
  • #129


Andy Resnick said:
According to the fluctuation-dissipation theorem, it regularly does:

http://prl.aps.org/abstract/PRL/v89/i5/e050601
Actually that's the "fluctuation theorem." The fluctuation-dissipation theorem deals with system response to an external force or perturbation away from equilibrium.
 
  • #130


I'm not sure that's a meaningful distinction.
 
  • #131
Just reading through the thread, I wanted to make the point that knowledge does affect entropy. It's a measure of what you don't know. If I have a container of gas, it has a certain entropy. If somehow (staying classical here) I measure the position and momentum of a single particle as a function of time, then the rest of the gas is characterized by an entropy less than the entropy without the measurement, because of the reduced number of particles. Yet the observed particle has no entropy. What happened to the difference in entropies? It's gone, because your knowledge of the system has increased beyond simply knowing the macro thermodynamic parameters.

Also, if you had a gas of 10-pound cannonballs, colliding elastically, without friction (or practically so), and the average distance between them was about a mile, and you had a volume the size of the solar system, I doubt if quantum mechanics would be needed to describe the thermodynamic behavior of this gas.
 
  • #132
Rap said:
Just reading through the thread, I wanted to make the point that knowledge does affect entropy. It's a measure of what you don't know.

The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.
 
  • #133
A. Neumaier said:
The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.

I don't agree - a more instructive case is the entropy of mixing. If you have a container of gases with a partition, both sides at the same temperature and pressure and you remove the partition, if the gases are different, the entropy increases as they mix. If the gases are the same, the entropy stays the same. The thing is, if they are different, but you don't know it, you will calculate their entropy to be the same and never run into problems. From an information-theoretic viewpoint, the entropy IS the same. You will never encounter an inconsistency or problem in any thermodynamic description of a process involving these gases as long as you cannot detect a difference between them. If you do run into an inconsistency, you will have found a way to distinguish them.
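For reference, the textbook ideal-gas result under discussion is

[itex]\Delta S_{\mathrm{mix}} = -n_{\mathrm{tot}} R\,(x_1 \ln x_1 + x_2 \ln x_2)[/itex]

which gives [itex]n_{\mathrm{tot}} R \ln 2[/itex] for equal amounts of two distinguishable gases and is taken to be zero when the gases are identical - the discontinuous jump that constitutes the Gibbs paradox.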

This was explained in a paper by Jaynes - Google "Jaynes" "The Gibbs Paradox".
 
  • #134
Rap said:
I don't agree - a more instructive case is the entropy of mixing. If you have a container of gases with a partition, both sides at the same temperature and pressure and you remove the partition, if the gases are different, the entropy increases as they mix. If the gases are the same, the entropy stays the same. The thing is, if they are different, but you don't know it, you will calculate their entropy to be the same and never run into problems. From an information-theoretic viewpoint, the entropy IS the same.

No, there are two different entropies in the two cases.

More importantly, if the gases are different, they will _not_ behave the same way in any experiment that can distinguish the gases - independent of whether or not anyone knows it.

Indeed, precisely because of this fact one can actually learn from such an experiment that the gases are different, and thus correct one's ignorance. If there were no observable difference, we would never know - and there would be nothing to know since the alleged difference is in such a case only a figment of the imagination, not a scientific property.
 
  • #135
A. Neumaier said:
The entropy of a gallon of raw oil can be computed from the composition of the oil and the pressure and temperature of the containing vessel. It is clearly independent of whether or not you know this composition, temperature and pressure. So, where is the claimed dependence on knowledge?

There is only a dependence on facts, and the computed entropy is correct if the correct facts are used in the computation. There is a dependence on knowledge only in this sense. But in this sense,
everything computed in physics would depend on knowledge, not only entropy.

Well, one can certainly compute an entropy associated with the knowledge about a system:

[itex]S = k log(W)[/itex]

where [itex]W[/itex] is the number of microstates consistent with that knowledge (or classically, the volume in phase space of the set of all states consistent with the knowledge). This is numerically identical to thermodynamic entropy in the case where the system is in thermal equilibrium and the "knowledge" is simply the extensive properties of total energy, volume, number of particles.
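As a toy illustration (my own example, with two-state spins instead of gas particles, in Python):

[code]
from math import comb, log

k = 1.380649e-23  # Boltzmann constant in J/K

def entropy(W):
    """S = k ln W for a count W of microstates consistent with what we know."""
    return k * log(W)

N, n_up = 100, 40                 # 100 two-state spins; we know that 40 of them are "up"

W_macro = comb(N, n_up)           # knowledge = the macrostate only
# Extra knowledge: we have measured 10 specific spins and found 4 of them up,
# so only the remaining 90 spins (with 36 ups among them) stay uncertain.
W_measured = comb(90, 36)

print("S with macroscopic knowledge only :", entropy(W_macro), "J/K")
print("S after measuring 10 of the spins :", entropy(W_measured), "J/K")
[/code]

The count W, and with it S = k ln W, shrinks as soon as the knowledge becomes more detailed than the macroscopic variables, which is exactly the dependence being debated here.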
 
  • #136
Andy Resnick said:
I'm not sure that's a meaningful distinction.

Are you talking about the distinction between the fluctuation theorem and the fluctuation-dissipation theorem? Those are two different theorems.

The fluctuation theorem is about statistical fluctuations of entropy.

The fluctuation-dissipation theorem is about the relationship between fluctuations in some state variable and a dissipative force acting on that variable. The paradigm example is Nyquist noise in an electric circuit. A resistor is a dissipative element. The corresponding fluctuation is in voltage: the voltage across a resistor will fluctuate in a way related to the resistance. Another example of the fluctuation-dissipation theorem is Brownian motion: the dissipation here is viscous drag on particles moving through a fluid. The corresponding fluctuation is Brownian motion.
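For the Nyquist example, the quantitative statement is the familiar Johnson-Nyquist result

[itex]\langle V^2 \rangle = 4 k_B T R\,\Delta f[/itex]

the mean-square voltage fluctuation across a resistance R in a bandwidth Δf at temperature T: the size of the fluctuation is set by the strength of the dissipation, which is exactly what the fluctuation-dissipation theorem asserts.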
 
  • #137
stevendaryl said:
Well, one can certainly compute an entropy associated with the knowledge about a system:

[itex]S = k log(W)[/itex]

where [itex]W[/itex] is the number of microstates consistent with that knowledge (or classically, the volume in phase space of the set of all states consistent with the knowledge). This is numerically identical to thermodynamic entropy in the case where the system is in thermal equilibrium and the "knowledge" is simply the extensive properties of total energy, volume, number of particles.

Sure, but this simply means that you get the correct entropy precisely when your knowledge is the physically correct one that describes everything there is to know about the system.

If your knowledge is something different, you get a different entropy, but the system doesn't behave according to your knowledge but still according to what it really is.

So, no matter what someone's knowledge is, it has no effect on the physical entropy of the system, but only on the subjective entropy the knower thinks the system has.
 
  • #138
A. Neumaier said:
So, no matter what someone's knowledge is, it has no effect on the physical entropy of the system, but only on the subjective entropy the knower thinks the system has.

Hmm. I guess I'm not completely convinced that there is such a thing as physical entropy. Suppose we're doing classical physics, and our system is just 5 particles bouncing around a box. Then there is no reason to bring up the thermodynamic concepts of temperature and entropy. But if we expand that to 5,000, or 50,000, or 50,000,000,000 particles, then the description of the system in terms of particles with definite positions and velocities just becomes completely unwieldy. So if we give up precise predictions, we can make approximate predictions by using the thermodynamic quantities: total energy, pressure, entropy, etc. But to me, the entropy is an artifact of how we're modeling that collection of particles; it's not objective. If there is no objective notion of entropy for 5 particles in a box, I can't see how there can be an objective notion of entropy for 50 trillion particles.
 
  • #139
A. Neumaier said:
If your knowledge is something different, you get a different entropy, but the system doesn't behave according to your knowledge but still according to what it really is.

Another point about what you said. You talk in terms of the system behaving according to my knowledge. I'm not sure what that means. Is the behavior that you're talking about the second law of thermodynamics? The way I see it (in classical physics, anyway) is that if we knew exactly the positions and velocities of all the particles, and we had unlimited computational resources, we could predict the future states of the system without ever invoking thermodynamics. To me, thermodynamics comes into play when we only have gross measures of the state, and we want to make gross predictions about future behavior. So we're trying to do physics at a gross level, with macroscopic inputs and macroscopic outputs. Entropy is in some sense a measure of the neglected level of detail that we left out of the gross macroscopic description. It's hard for me to see how that is a physically objective quantity.
 
  • #140
It's the long argument between thermodynamic entropy and information entropy.

Think of the 5-particle case for example. There is the empirical definition of entropy, carried out by measurements, and there is the statistical mechanical (or information theoretic) explanation of that definition.

In the 5-particle case, how do you measure entropy (or more precisely, an entropy change)? You make the container have one wall with some mass that moves and apply a constant external force to it. That gives it a "constant" pressure. You can now measure the volume, and it will be constantly changing: every time a particle hits the wall, the wall gets kicked upward; otherwise, it accelerates downward. The volume is fluctuating around some average value. You know P, V, N, and k, and assuming the gas is ideal, you get the temperature T=PV/Nk. Because the volume is fluctuating, the temperature is fluctuating too, about some average value.

Now you add an increment of energy to the system, without changing the pressure (isochoric work), like with a paddle wheel that kicks one of the particles just a little bit. That's your measured dU. Since dV=0 on average, dU=T dS gives you dS, the change in entropy. Since T is fluctuating, dS will fluctuate too, about an average. Not sure if that is exactly correct, but you get the idea.

Going to stat mech - the entropy is k ln W, where W is the number of microstates that could give rise to the specified P,V,T of the initial system and dS=k dW/W. This is information theory. For example, you could say S=k ln(2) Y where Y=log2(W) (log2 is log to base 2). Y is then the number of yes/no questions you have to ask to determine the microstate. (Actually the number of yes/no questions according to the "best" algorithm, in which each question splits the number of ways in half). This reminds me that the stat mech explanation is information theoretic.
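Putting numbers to this (my own example): [itex]S = k \ln W = (k \ln 2)\,Y[/itex], so a system with [itex]W = 2^{80}[/itex] accessible microstates corresponds to [itex]Y = 80[/itex] yes/no questions and [itex]S = 80\,k \ln 2 \approx 7.7 \times 10^{-22}\ \mathrm{J/K}[/itex].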

Anyway, the two expressions will match, at least on average. If you increase your knowledge by following just one of the particles, you will have increased your knowledge of the system beyond that of just P,V,T. The statmech guy will say the number of questions has decreased, therefore the entropy has decreased. The fluctuations in T and V and dS will be able to be correlated somewhat with the fluctuations in the position and velocity of the observed particle. (assume classical for simplicity). The thermo guy will say no, this extra knowledge is "out of bounds" with respect to thermodynamics. So the statmech guy's definition and the thermo guy's definitions do not match, except when the statmech guy's definition stays "in bounds". If we stay "in bounds", then entropy is objective. But I have no problem wandering out of bounds, just to see what happens.

The whole problem of the extensivity of entropy and Boltzmann counting is solved by this. The thermodynamicist simply declares that drawing a distinction between like particles is out of bounds. The fact that quantum mechanics says this is true in principle in the quantum regime is really irrelevant. You can have the thermodynamics of a gas of elastically colliding cannonballs and declare distinguishing them out of bounds, and you're good to go.

Regarding the entropy of mixing, if you have two different particles, and the thermo guy declares that distinguishing their difference is out of bounds, and the statmech guy says that the knowledge of their difference is unavailable, then their definitions match, entropy is objective, and the theory works. If the thermo guy doesn't yet have the ability to distinguish, then the statmech guy says that any knowledge of their difference is unavailable, their definitions match, entropy is objective, and the theory works. If the thermo guy can distinguish the difference without going out of his pre-established bounds (i.e. by examining particles one at a time), then the statmech guy says this knowledge is available, entropy is objective, and the theory works. If you have a gas of red and blue elastic cannonballs, and their color does not affect how they behave in a collision, and you accept that their color can be determined without appreciably affecting their velocity and momentum, then you can have a disagreement. The thermo guy will declare such measurements out of bounds, while the statmech guy will say that the knowledge is available. The thermo guy's theory will work, the statmech guy's theory will work, but they will make different calculations and different predictions.

The gas of cannonballs might be a gas of stars in a stellar cluster, and then it's best to wear two hats.
 

1. What is the real second law of thermodynamics?

The real second law of thermodynamics states that in any natural process, the total entropy of a closed system will always increase over time. This means that energy will always flow from a state of higher concentration to a state of lower concentration, and that the overall disorder or randomness in a system will tend to increase.

2. How does the second law of thermodynamics relate to energy?

The second law of thermodynamics is closely related to energy because it describes the direction in which energy will flow in a system. Energy will always flow from a state of higher concentration to a state of lower concentration, and this flow of energy is what drives natural processes and reactions.

3. Is the second law of thermodynamics always true?

Yes, the second law of thermodynamics is a fundamental law of nature and has been tested and proven to be true in countless experiments. It is a fundamental principle that governs the behavior of energy and matter in our universe.

4. Can the second law of thermodynamics be violated?

No, the second law of thermodynamics cannot be violated. It is a fundamental law of nature that applies to all natural processes and systems. While it may seem like some systems or processes go against this law, they are actually following the law in a more complex way.

5. How does the second law of thermodynamics affect the universe?

The second law of thermodynamics has a profound impact on the universe. It is the reason why the universe is constantly changing and evolving, as energy flows from one state to another. It also plays a crucial role in the formation of stars, planets, and other celestial bodies, and is essential for life to exist on Earth.
