What is the real second law of thermodynamics?

In summary: assuming you're asking about the entropy of a system in equilibrium, it is absolutely true that the entropy will increase over time.
  • #71


Andy Resnick said:
SM is currently only able to generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems.
So does statistical mechanics (SM in your post); it is not restricted to computing partition functions and what can be deduced from them.

See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and from that, the Navier-Stokes equations) are derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.
 
  • #72


Andrew Mason said:
It is a law because it successfully and consistently predicts how things will actually behave. It is never violated. It is not that it is physically impossible for the second law to be violated. It is just that it is statistically impossible for it to be violated. ...

This doesn't prove it is a physical law.
It just tells us how we play dice.

This is my post nr 1111, for no physical reason.
 
  • #73


A. Neumaier said:
Yes, but this doesn't involve a violation of the second law.

Right- because living systems are *open*.

Clearly, the entropy associated with me is much lower than that of a puddle containing all of the atoms in me. Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements. From amino acids, we produce functional proteins. From glucose, we produce ATP well in excess of equilibrium concentrations.
 
  • #74


atyy said:
The last point is in the context of a Hamiltonian system, where Liouville's theorem guarantees that the volume of phase space remains the same. So for a Hamiltonian system, the entropy increase is subjective.

If the system is not Hamiltonian, then there can be objective increases in entropy.

Fair enough.
 
  • #75


Andy Resnick said:
Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements. From amino acids, we produce functional proteins. From glucose, we produce ATP well in excess of equilibrium concentrations.
And where is the evidence that this decrease of entropy is not due to entropy currents associated with the processes in the cell? I don't think there is any.

Just saying that the entropy is lower doesn't say anything about whether the second law is violated. The second law also governs the freezing of ice - where lots of entropy is transported to the surroundings but the local entropy production is still positive.
 
  • #76


A. Neumaier said:
See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and from that, the Navier-Stokes equations) are derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.

Thanks, I'll check it out.

I have Feynman's "Statistical Mechanics" text and a few others, and they all start off on page 1 with something like: "The fundamental principle of statistical mechanics is that if a system in equilibrium can be in one of N states |n>, each with energy E_n, the expected value of an observable A is
[tex]\langle A \rangle = \frac{1}{Z}\sum_n \langle n|A|n\rangle \, e^{-E_n/kT}[/tex]"

So right off the bat, the impression is given that only equilibrium systems can be treated, and that time cannot be considered.
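
For concreteness, the quoted prescription is easy to evaluate numerically; here is a minimal sketch for a hypothetical two-level system (the energies, matrix elements, and temperature below are made up purely for illustration):

[code]
import numpy as np

# Toy illustration of the canonical-ensemble prescription quoted above:
#   <A> = (1/Z) * sum_n <n|A|n> * exp(-E_n / (k T))
# Units are chosen so that Boltzmann's constant k = 1.

E = np.array([0.0, 1.0])    # energies E_n of a hypothetical two-level system
A = np.array([0.5, -0.5])   # made-up diagonal matrix elements <n|A|n>
T = 0.5                     # temperature (in units where k = 1)

weights = np.exp(-E / T)    # Boltzmann factors exp(-E_n / kT)
Z = weights.sum()           # partition function Z
expectation = (A * weights).sum() / Z

print(f"Z = {Z:.4f}, <A> = {expectation:.4f}")
[/code]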
 
  • #77


A. Neumaier said:
And where is the evidence that this decrease of entropy is not due to entropy currents associated with the processes in the cell? I don't think there is any.

Just saying that the entropy is lower doesn't say anything about whether the second law is violated. The second law also governs the freezing of ice - where lots of entropy is transported to the surroundings but the local entropy production is still positive.

I'm a little confused- are you saying I claim living systems violate the second law of thermodynamics?
 
  • #78


Andy Resnick said:
That's not the point: we can't be so cavalier when discussing nonequilibrium systems. Entropy can change if the temperature is constant: chemical reactions, mixing/diffusion, etc.

I'm not sure what you mean by 'free entropy'- the opposite of free energy? Negentropy?
Sorry, that was a typo - I meant free energy.

Yes, entropy can change when the temperature is constant - it can both increase and decrease without violating the second law. Whereas the free energy is forced to decrease (or stay constant) by the second law.
 
  • #79


Andy Resnick said:
I'm a little confused- are you saying I claim living systems violate the second law of thermodynamics?

You seemed to claim it; but perhaps there is a misunderstanding about the meaning of the terms.

The second law (in its most general, nonequilibrium form) says that the local mean entropy production (a well-defined term in the entropy balance equation) is everywhere nonnegative.

Your remark
Andy Resnick said:
Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements.
claims the opposite for biological systems. The same seems to be the case in
Andy Resnick said:
A. Neumaier said:
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.
Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
On the other hand, your comment
Andy Resnick said:
A. Neumaier said:
Yes, but this doesn't involve a violation of the second law.
Right- because living systems are *open*.
pays lip service to the validity of the second law. So the overall picture is quite confusing.

Indeed, your last-quoted statement is correct. Although the local entropy production is positive (and the second law holds), entropy can decrease in an open system precisely because the system is open - so that entropy can leave through the boundary of the system. But the latter is due to an entropy current, which is not restricted by the second law.
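
For concreteness, with s the entropy density, j_s the entropy current, and σ the local entropy production (standard notation, not specific to any one text), the balance equation reads

[tex]\frac{\partial s}{\partial t} + \nabla \cdot \vec{j}_s = \sigma \geq 0,[/tex]

and only the production term σ, not the flow term, is constrained by the second law.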
 
  • #80


Andy Resnick said:
I have Feynman's "Statistical Mechanics" text and a few others, and they all start off on page 1 with something like: "The fundamental principle of statistical mechanics is that if a system in equilibrium can be in one of N states |n>, each with energy E_n, the expected value of an observable A is
[tex]\langle A \rangle = \frac{1}{Z}\sum_n \langle n|A|n\rangle \, e^{-E_n/kT}[/tex]"

So right off the bat, the impression is given that only equilibrium systems can be treated, and that time cannot be considered.

Well, that's the traditional introduction à la Gibbs 1902, who shaped several generations of physicists. Modern books on statistical mechanics often have a nonequilibrium part (especially when this word is in the title). In my 1980 edition of Reichl, the last third (starting in Chapter 13) is about nonequilibrium.
 
  • #81


A. Neumaier said:
Your remark

claims the opposite for biological systems. The same seems to be the case in

On the other hand, your comment

pays lip service to the validity of the second law. So the overall picture is quite confusing.

Then let me clarify. Biological systems (or any system) do not violate the second law of thermodynamics. The second law of thermodynamics is perhaps the most fundamental law of nature we have.

Biological systems are *open*. They exchange both matter and energy with their surroundings. Thus, they can *locally* decrease their entropy (for example, by constructing proteins, by synthesizing ATP, by the chemiosmotic mechanism) at the expense of increasing the entropy of their surroundings- waste products.
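
In bookkeeping form, the constraint is only

[tex]\Delta S_{\rm organism} + \Delta S_{\rm surroundings} \geq 0,[/tex]

so the first term may be negative as long as the second is at least as large and positive.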
 
  • #82


Andy Resnick said:
The original physical motivation for 'entropy' was simply developing more efficient engines, and the realization that the only determining factor is the temperature difference between the inlet and outlet.

In fact, temperature has been conspicuously missing from this thread.

I'm not sure if that counts as a physical motivation. It sounds like engineering motivation to me.

It's quite possible that entropy, like conservation of energy, has no physical motivation (i.e., we can't explain mechanistically why it is so; we can only state that it's what we observe).

Take conservation of energy itself, for instance: many other phenomena can be explained by it, and many things must be so because of conservation of energy, but we (I, more accurately) don't know why conservation of energy itself must be so.

It appears to me that entropy has the same sort of "causal dead end" to it, so far.

addendum:

I see you stated in your last post that the second law is perhaps the most fundamental law of nature we have. I think there's a correlation between feelings like this and "dead end causality" (i.e. most physicists feel the same about conservation laws).
 
  • #83


Pythagorean said:
I'm not sure if that counts as a physical motivation. It sounds like engineering motivation to me.

<snip>

I see you stated in your last post that the second law is perhaps the most fundamental law of nature we have. I think there's a correlation between feelings like this and "dead end causality" (i.e. most physicists feel the same about conservation laws).

To the first sentence, I have no idea what you mean- what exactly do you mean by 'physical' motivation, if not physical objects?

To the last, I think that's overstating the case- I mean that the second law has been verified more times, under more varied circumstances, than any other theory and has so far not once ever been observed to fail.
 
  • #84


Andy Resnick said:
To the first sentence, I have no idea what you mean- what exactly do you mean by 'physical' motivation, if not physical objects?

The concept has only been introduced to me explicitly by my German physics teachers. (Any good lecturer motivates their lectures, but it's generally an implicit process.) Graduates from Germany tend to motivate explicitly, using the word "motivated" and sometimes specifying the kind: educational, biological, physical, applications (engineering/medical). Then they like to follow with the "ansatz" (which the US doesn't explicitly point out either), etc., through to the conclusion of their argument.

It's basically a "why do we care?" And your answer was, in short, "for engineering reasons" (speaking of "making" things "more efficient"). It's not bad or wrong to not have a physical motivation.

To the last, I think that's overstating the case- I mean that the second law has been verified more times, under more varied circumstances, than any other theory and has so far not once ever been observed to fail.

I don't see the contradiction with what I said. The same can be said for conservation of energy, yet we have no physical mechanism that explains conservation of energy. I merely pointed out that there's a correlation between these two things (no mechanistic explanation and a tendency for them to be pervasive, and for scientists to call them "fundamental").
 
  • #85


A. Neumaier said:
See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and from that, the Navier-Stokes equations) are derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.

I looked through the table of contents on Amazon; the hydrodynamic coverage appears very similar to Chaikin and Lubensky's "Principles of Condensed Matter Physics".

Out of curiosity, what does Reichl write down as the statistical mechanical result for viscosity?
 
  • #86


Pythagorean said:
The concept has only been introduced to me explicitly by my German physics teachers. (Any good lecturer motivates their lectures, but it's generally an implicit process.) Graduates from Germany tend to motivate explicitly, using the word "motivated" and sometimes specifying the kind: educational, biological, physical, applications (engineering/medical). Then they like to follow with the "ansatz" (which the US doesn't explicitly point out either), etc., through to the conclusion of their argument.

It's basically a "why do we care?" And your answer was, in short, "for engineering reasons" (speaking of "making" things "more efficient"). It's not bad or wrong to not have a physical motivation.

I guess I still don't understand what you mean by 'physical motivation'. If it's not a mathematical/conceptual model or a physical object, what is left?

Pythagorean said:
I don't see the contradiction with what I said. The same can be said for conservation of energy, yet we have no physical mechanism that explains conservation of energy. I merely pointed out that there's a correlation between these two things (no mechanistic explanation and a tendency for them to be pervasive, and for scientists to call them "fundamental").

No, there's a difference: conservation of energy (or any conservation law)- recall that 'conservation' is a *mathematical* statement- follows from elementary balance laws: the amount of something in a control volume 'now' is equal to the amount previously there, plus however much was produced within the volume, plus the net flux entering through its boundary.
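
Schematically, for a quantity with density ρ, local production rate π, and flux j through the boundary of a control volume V (standard continuum notation):

[tex]\frac{d}{dt}\int_V \rho \, dV = \int_V \pi \, dV - \oint_{\partial V} \vec{j} \cdot d\vec{A},[/tex]

and 'conservation' is the special case π = 0.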
 
  • #87


Pythagorean said:
Whereas many other phenomena can be explained by conservation of energy, itself, for instance; many things must be so because of conservation of energy, but we (I, more accurately) don't know why conservation of energy must be so.
We know exactly WHY energy is conserved. Conservation of energy follows from the definition of force and the equations for electric and gravitational force.

[tex]\vec F = \frac{d\vec{p}}{dt}[/tex]

[tex]\vec F = -\frac{GmM}{r^2}\hat r[/tex]

[tex]\vec F = \frac{qQ}{4\pi\epsilon_0 r^2}\hat r[/tex]

If the gravitational force and the electric force are the only forces that apply, the quantity KE + PE will be conserved. We know why that is, given those forces. (I am not saying that we know why these forces exist.)

The problem only arises when we introduce mechanical forces. But that is not because mechanical forces are different; they are entirely the result of gravity and/or the electric force. The problem is the complexity of the interaction. It is not a matter of the interaction of two masses or two charges; rather, it is the interaction of enormous numbers of little masses and charges, and it becomes impossible to calculate the effects of all those little interactions. But we know that, since all the interactions involve either gravity or the electric force, the total KE+PE of all the matter involved will be the same before and after the interaction(s). That is the first law of thermodynamics.
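
To spell out the one step being used here: for any force derivable from a potential energy U (so that F = -∇U), Newton's second law gives

[tex]\frac{d}{dt}\left(\tfrac{1}{2}mv^2 + U\right) = \vec{v}\cdot(m\dot{\vec{v}} + \nabla U) = \vec{v}\cdot(\vec{F} + \nabla U) = 0,[/tex]

and both force laws above are of this form.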

AM
 
  • #88


Andy Resnick said:
I guess I still don't understand what you mean by 'physical motivation'. If it's not a mathematical/conceptual model or a physical object, what is left?

1. observations don't match theory (this is how Planck motivated discretizing the energy in the blackbody integral)
2. finding the interface between two physical theories (i.e. quantum chaos)
3. finding a mechanism for an observation ("why is there night?" and no, not the proverbial "why"; in this case, the answer would be the motion of the planetary bodies)

I was asking if there is a 3 for entropy: why is there entropy?

No, there's a difference: conservation of energy (or any conservation law)- recall that 'conservation' is a *mathematical* statement- follows from elementary balance laws: the amount of something in a control volume 'now' is equal to the amount previously there, plus however much was produced within the volume, plus the net flux entering through its boundary.

This isn't where my confusion is; my confusion must be here: I'm still under the impression that the second law is also a *mathematical* statement (derived from statistics). Is that wrong?

Andrew Mason said:
But we know that since all the interactions involve either gravity or electric force that the total KE+PE of all the matter involved will be the same before and after the interaction(s). That is the first law of thermodynamics.

AM

Oh wow, I can't believe I never realized the 1st law of thermodynamics was equivalent to energy conservation. That's a little embarrassing...
 
  • #89


Pythagorean said:
This isn't where my confusion is; my confusion must be here: I'm still under the impression that the second law is also a *mathematical* statement (derived from statistics). Is that wrong?

The second law is not derived from classical statistical mechanics. Classical statistical mechanics is an unprincipled recipe for predicting equilibrium quantities; but the fact that it reproduces equilibrium thermodynamics from a microscopic viewpoint shows that there is something to it.

The definition of a state variable called entropy, and its non-decrease in an isolated system, is derived from the Kelvin statement of the second law of thermodynamics, or equivalently the Clausius statement, which is almost a "plain English" sentence. In contrast to the starting point of classical statistical mechanics, the Kelvin and Clausius statements are supported by lots of data. It is one of the most amazing deductions in physics, certainly on par with Einstein's deduction of special relativity from two postulates.
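
For reference, the central step of that deduction is the Clausius inequality,

[tex]\oint \frac{\delta Q}{T} \leq 0[/tex]

(with equality for reversible cycles), which follows from either statement; the equality case lets one define the state function S via dS = δQ_rev/T, and the inequality then gives ΔS ≥ 0 for an isolated system.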

So basically entropy exists because, as Andrew Mason said early in this thread:

Andrew Mason said:
The easiest form of the second law to understand is the Clausius statement of the second law:

"No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature."​

The Kelvin statement can be shown to be equivalent:

"No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work."​
 
  • #90


Ok, this is becoming a lot more physically tangible to me now.

My next step is really to find and look over the Clausius derivation; thank you everyone for your input.
 
  • #91


An exposition of the route from the Kelvin and Clausius statements to the existence of a state variable called entropy and its non-decrease in an isolated system can be found at http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec2.pdf

Wikipedia has a fascinating history of entropy: http://en.wikipedia.org/wiki/History_of_entropy . I would be interested to know what people think of Jaynes's view.
 
  • #92


Andy Resnick said:
Then let me clarify. Biological systems (or any system) do not violate the second law of thermodynamics. The second law of thermodynamics is perhaps the most fundamental law of nature we have.

Biological systems are *open*. They exchange both matter and energy with their surroundings. Thus, they can *locally* decrease their entropy (for example, by constructing proteins, by synthesizing ATP, by the chemiosmotic mechanism) at the expense of increasing the entropy of their surroundings- waste products.

OK. I agree. The point is that open systems not only exchange matter and energy but also entropy, and this explains why they can locally decrease their entropy (or energy or matter density).

Our misunderstanding was about the meaning of the term ''local entropy production''. This is a technical term and describes only that part of the local entropy change that cannot be explained by entropy flow. It is this term that the second law requires to be nonnegative (in a classical setting, interpretable as the amount of information irreversibly lost to the neglected microscopic degrees of freedom), no matter how much entropy flows into or out of the system.
 
  • #93


Andy Resnick said:
I looked through the table of contents on Amazon; the hydrodynamic coverage appears very similar to Chaikin and Lubensky's "Principles of Condensed Matter Physics".

Out of curiosity, what does Reichl write down as the statistical mechanical result for viscosity?
It is expressed at the end of Section 13F (in my edition) as a momentum integral involving the inverse of the Lorentz-Boltzmann collision operator - too complex to write down.
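
(For readers who want a closed formula: a different but standard route - not the expression in Reichl referred to above - is the Green-Kubo relation,

[tex]\eta = \frac{V}{k_B T}\int_0^\infty \langle P_{xy}(0)\, P_{xy}(t)\rangle \, dt,[/tex]

which expresses the shear viscosity as an equilibrium time-correlation integral of an off-diagonal component P_xy of the microscopic pressure tensor.)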
 
  • #95


A. Neumaier said:
It is expressed at the end of Section 13F (in my edition) as a momentum integral involving the inverse of the Lorentz-Boltzmann collision operator - too complex to write down.

That's what I expected- one of my criticisms of statistical mechanics (as with all mechanical theories) is that dissipative processes cannot currently be described in any useful way.
 
  • #96


Andy Resnick said:
That's what I expected- one of my criticisms of statistical mechanics (as with all mechanical theories) is that dissipative processes cannot currently be described in any useful way.
No, you misunderstood: they can be used for quantitative predictions, and Reichl gives explicit formulas (though not down to the point where the viscosity coefficient was numerically evaluated for a particular gas). With ''too complex to write down'' I only meant that I didn't want to transcribe them into latex for the post.
 
  • #97


A. Neumaier said:
No, you misunderstood: they can be used for quantitative predictions, and Reichl gives explicit formulas (though not down to the point where the viscosity coefficient was numerically evaluated for a particular gas). With ''too complex to write down'' I only meant that I didn't want to transcribe them into latex for the post.

Yeah, but that's just for linear viscosity- Chaikin and Lubensky provide calculations against experiment for liquid argon. And that's perhaps the simplest case. I'd like to see how statistical mechanics handles creep. Or fracture. Or water.

The reality is, in order to generate useful physical models, multiple conceptual approaches must be used- elements of both thermodynamics and statistical mechanics are often used to understand protein folding, for example.
 
  • #98


lalbatros said:
This doesn't prove it is a physical law.
It just tells us how we play dice.

This is my post nr 1111, for no physical reason.
One does not prove physical laws. Mathematics is about proof. Science is about disproof.

In order to debate the issue, one has to start with a definition of a physical law.

I would suggest the following definition: a physical law is:

a) a statement about some physical property or physical phenomenon that purports to be either universally true or true over a certain range of conditions,

b) that leads to predictions of physical behaviour that can be observed, so it would be falsifiable by the existence of contrary physical evidence and

c) for which no physical evidence has yet been found that falsifies it.​

I suggest that the second law of thermodynamics fits this definition.

AM
 
  • #99


I agree with you, Andrew.

However, I have often asked myself how, precisely, the second law could be disproved.
That's probably why it has been given different, apparently equivalent formulations.

When we observe entropy-decreasing fluctuations, we are supposed to "say" that the second law is irrelevant on the microscopic level.
Of course, if we wait longer (say, 1 billion years) we increase the chance of observing entropy-decreasing fluctuations on scales that we, today, would call "macroscopic".
It looks, then, like the second law is about "small-scale" and "short-time" phenomena.
Are "small" and "short" then to be understood on a human scale?
Does that fit in the definition of a physical law?
Probably yes according to your definition list.
But that doesn't wipe out my problem.
It just means that I have to re-phrase my question.

Consider now the Clausius statement of the second principle (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Clausius_statement):

No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature​
Should I conclude, based on your definition of a physical law, that any fluctuation disproves the second principle? It would then be a physical law, but a wrong one!

Further, can't we also take the point of view that the Clausius statement simply defines what "temperature" means? Even empirical temperatures would fit the Clausius definition of temperature, according to the Clausius statement of the second principle.
Defining that heat (energy) goes from hot to cold, is that a law of physics?
Can a definition become a law?

However, I would be a very ungrateful engineer if I did not recognize the value of the second principle, at least for engineering. But I can hardly consider it a law of physics. I would rather consider it a pre-law of physics: a law that still needs to be discovered.
 
  • #100


Andy Resnick said:
Yeah, but that's just for linear viscosity- Chaikin and Lubensky provide calculations against experiment for liquid argon. And that's perhaps the simplest case. I'd like to see how statistical mechanics handles creep. Or fracture. Or water.
Of course, as the models get more complex and the questions asked more detailed, calculations from first principles become too unwieldy to be actually performed. But they suffice to give a proof of concept. The same holds for equilibrium thermodynamics.

Engineers will always find it simpler to measure the thermodynamic properties of water/steam than to derive them from a partition function. But if one wants to understand water in extreme situations where you can't measure it directly, the ab initio methods are useful.

The same holds for nonequilibrium statistical mechanics. It is far easier to measure the viscosity of water than to calculate it from first principles. But, for example, people use the theory to understand inflation after the big bang, where it is impossible to make experiments.
Andy Resnick said:
The reality is, in order to generate useful physical models, multiple conceptual approaches must be used- elements of both thermodynamics and statistical mechanics are often used to understand protein folding, for example.
You might be interested in my survey paper
A. Neumaier,
Molecular modeling of proteins and mathematical prediction of protein structure,
SIAM Rev. 39 (1997), 407-460.
http://arnold-neumaier.at/papers/physpapers.html#protein
 
  • #101


lalbatros said:
When we observe entropy-decreasing fluctuations, we are supposed to "say" that the second law is irrelevant on the microscopic level.
Entropy requires a temperature. Temperature requires a sufficiently large number of molecules to have a Maxwell-Boltzmann distribution of energies. So it is not that entropy is irrelevant at the small scale. It is really that it is not defined at the small scale.
Of course, if we wait longer (say, 1 billion years) we increase the chance of observing entropy-decreasing fluctuations on scales that we, today, would call "macroscopic".
No. If such an event only required waiting a billion years, there would be a good chance of observing it happening somewhere in the universe all the time. The second law is not being violated anywhere in the universe.
It looks, then, like the second law is about "small-scale" and "short-time" phenomena.
Are "small" and "short" then to be understood on a human scale?
The second law will not be violated on any time scale with respect to any collection of molecules for which a temperature can be defined.

Should I conclude, based on your definition of a physical law, that any fluctuation disproves the second principle? It would then be a physical law, but a wrong one!
Do you mean that if you witnessed an event whose chance of occurring is one in 10^billion universe lifetimes, you would have disproved the second law?

Suppose you thought you witnessed such an event: that heat flowed spontaneously from cold to hot all by itself for a brief instant. Since you could not repeat the result, and no one else could repeat it because it never occurs again, what have you proven? You could never really determine whether your result was a mistake or a real observation. The chances are much better that it was a mistake than a real event.

Further, can't we also take the point of view that the Clausius statement simply defines what "temperature" means? Even empirical temperatures would fit the Clausius definition of temperature, according to the Clausius statement of the second principle.
Defining that heat (energy) goes from hot to cold, is that a law of physics?
Can a definition become a law?
No. The temperature of matter - a collection of molecules - is defined by the Maxwell-Boltzmann distribution. One cannot just make up a definition of temperature. It is a real law based on science, not an arbitrary definition.
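
For reference, the distribution in question is the Maxwell-Boltzmann speed distribution,

[tex]f(v) = 4\pi \left(\frac{m}{2\pi kT}\right)^{3/2} v^2 \, e^{-mv^2/2kT},[/tex]

in which T enters only as a parameter characterizing the whole ensemble, not any single molecule.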

AM
 
  • #102


Andrew Mason said:
Entropy requires a temperature. Temperature requires a sufficiently large number of molecules
The Boltzmann equation shows that entropy does not require a temperature.
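
Concretely, Boltzmann's H-theorem works with the functional

[tex]S(t) = -k_B \int f(\vec{r}, \vec{v}, t) \, \ln f(\vec{r}, \vec{v}, t) \, d^3r \, d^3v,[/tex]

which is defined for an arbitrary one-particle distribution f, in or out of equilibrium; no temperature appears in it.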

The deterministic version of thermodynamics is valid only for systems in local equilibrium, which requires a macroscopic system with enough molecules to reliably average over them.

Fluctuating thermodynamics (the version defined by statistical mechanics, which is valid for any size) has little predictive power for small systems, since for these it predicts fluctuations of a size that may erase any informative content in the mean values.
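
Quantitatively, for an extensive quantity such as the energy, the standard estimate is

[tex]\frac{\Delta E}{\langle E \rangle} \sim \frac{1}{\sqrt{N}},[/tex]

negligible for N ~ 10^23 particles but of order one for a handful of them.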
 
  • #103


Dear All,

I feel much more comfortable considering the second law an engineering heuristic.
And I even don't feel totally confident with this view!
Sorry for that!

To go back again to the Clausius formulation, I believe the full construct of thermodynamics can be derived from this statement. It can lead to the existence of an entropy state function and a thermodynamic temperature scale. But this is all a direct consequence of defining the notions of "hot" and "cold", a consequence of defining hot and cold by the direction heat flows. Heat was already known before the second principle.
Where is there real physics in the Clausius statement?
Where is there real information on our physical world?
This Clausius statement is more like instructions for engineers to construct thermodynamic tables from experiment in an organized way, and first of all it instructed engineers to build a consistent thermometer!
What could you do with the Clausius principle if you had no thermodynamic tables?
You reap the benefit of the Clausius statement only when a huge number of experiments have been tabulated (recorded) in a rational way.
That's a huge achievement, but I don't really see any physical law in the Clausius statement.

For me the real physics comes with statistical thermodynamics, with Boltzmann and Gibbs.
Thermodynamic tables can be calculated ab initio thanks to their work.
And their work acknowledges clearly the existence of fluctuations.

Michel
 
  • #104


A. Neumaier said:
Of course, as the models get more complex and the questions asked more detailed, calculations from first principles become too unwieldy to be actually performed. But they suffice to give a proof of concept.

Spoken like a true mathematician! :)
 
  • #105


A. Neumaier said:
The Boltzmann equation shows that entropy does not require a temperature.
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules per kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?
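
(For reference, the equation meant here is Boltzmann's

[tex]S = k \ln W,[/tex]

with k ≈ 1.38 × 10^-23 J/K; W is a pure number, so the units enter entirely through k.)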

AM
 
