What is the thermodynamic temperature scale?

AI Thread Summary
The thermodynamic temperature scale is theoretically based on the efficiency of a Carnot engine and is independent of physical substances, with the triple point of water (273.16K) often used as a standard. Practically, temperature is measured using various methods, including the ideal gas law, but it can also be defined through entropy and energy relationships. Discussions highlight that while the ideal gas temperature scale aligns with the thermodynamic scale in practice, it relies on specific properties of gases, whereas the thermodynamic scale is more universal. The concept of negative temperatures arises in certain systems, indicating that a comprehensive definition of temperature remains elusive. Ultimately, the International Temperature Scale (ITS-90) employs multiple definitions to accommodate different temperature ranges, reflecting the complexity of accurately defining temperature.
  • #51
I finally found some time to go through my books: a more practical procedure for determining the absolute temperature than trying to run a Carnot machine as reversibly as possible is described in Peter T. Landsberg, Thermodynamics and Statistical Mechanics, Dover, NY, 1990, Chapter 6.1, "Empirical and absolute temperature scales". With T being the absolute temperature and t an empirical temperature, the important result (his formula 6.2) is
\ln\frac{T_1}{T_0}=\int_{t_0}^{t_1}\frac{(\partial p/\partial t)_V}{p+(\partial U/\partial V)_t}\,dt
The right-hand side depends only on the empirical temperature (e.g. the length of a mercury filament in a thermometer), but neither on absolute temperature nor on entropy.
With the reference temperature T_0 fixed arbitrarily, the absolute temperature corresponding to any empirical temperature t may be calculated once the two derivatives have been measured over the intermediate temperature range.
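
As a sanity check of the formula, here is a minimal numerical sketch (my own illustration, not from Landsberg; the ideal-gas model and the Celsius-like thermometer are assumptions): for an ideal gas (\partial U/\partial V)_t=0, so the integrand reduces to 1/(t+273.15) and the integral recovers \ln(T_1/T_0) exactly.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative model: the empirical thermometer reads t = T - 273.15
# (Celsius-like) and the substance is an ideal gas, so (dU/dV)_t = 0
# and p(t, V) = n*R*(t + 273.15)/V.  All constants are assumptions.
n, R, V = 1.0, 8.314, 1.0

def integrand(t):
    dp_dt = n * R / V                    # (dp/dt)_V for this model
    p = n * R * (t + 273.15) / V         # "measured" pressure at (t, V)
    dU_dV = 0.0                          # vanishes for an ideal gas
    return dp_dt / (p + dU_dV)

t0, t1 = 0.0, 100.0                      # empirical readings
log_ratio, _ = quad(integrand, t0, t1)   # right-hand side of formula 6.2

T0 = 273.15                              # reference temperature, fixed freely
print(T0 * np.exp(log_ratio))            # -> 373.15, i.e. T(t1)
```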
 
  • #52
I have found a similar equation. One problem is that you need to be able to go from the reference state to the final state with a reversible engine (i.e. at constant entropy). That might be troublesome?!

Also note that the derivative in the denominator should be at constant pressure. So you cannot calculate it while running the Carnot engine. To make a change at constant pressure you would destroy the reversibility. Can one even use the definition in that case?
 
  • #53
No, you don't need a reversible engine!
You only have to measure the quantities in the integral over the temperature range you are integrating. But these are all standard calorimetric measurements. More importantly, all functions in the integral depend only on the state and are not path dependent.
E.g., to measure the derivative dU/dV at fixed t, you could measure the heat and work needed to expand some substance at constant temperature. While heat and work may each depend on the path taken, their sum will not.
PS: the derivative in the denominator is at fixed temperature t, not fixed pressure p.
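
To make that concrete (a sketch of the measurement, not Landsberg's wording): in a small isothermal expansion \Delta V during which the substance absorbs heat Q and does work p\,\Delta V, the first law gives

\left(\frac{\partial U}{\partial V}\right)_t \approx \frac{Q - p\,\Delta V}{\Delta V}, \qquad p+\left(\frac{\partial U}{\partial V}\right)_t \approx \frac{Q}{\Delta V}

so the whole denominator of the integrand is just the isothermal heat absorbed per unit volume change.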
 
  • #54
The whole integral depends on the path and has to be taken at constant entropy (a strictly reversible process). The derivative dU/dV needs to be taken at constant pressure and no other variable. I've done the calculation myself. Maybe you can check the book again.
Meanwhile I'll try to figure out whether your equation is also correct, but I highly doubt it.

EDIT: I have 5 equations of this type and I found you are indeed correct.
 
Last edited:
  • #55
But note that the integral has to be taken at constant volume only, i.e. your change of state must proceed along a path of constant volume.
So you can only find temperature ratios between states of equal volume.

Also, dU/dV is taken at constant temperature, which does not lie along the constant-volume curve on which the integral has to be taken. So I'm not sure how to do this in practice?!

EDIT: Actually I agree that there isn't really a path for the integral. But the derivatives, which are taken along different routes (constant volume or constant temperature), have to be handled with care. Is that doable?
 
Last edited:
  • #56
I once read something about the experiments at NIST (if I remember correctly) which led to the definition of the ITS-90 temperature scale. It seems to have been a fascinatingly complex experiment. Lamentably, such experiments do not get the same attention as, e.g., the largest short-circuit experiment at CERN.
I am not an experimentalist, but I don't see any problems in principle with evaluating the integral we were talking about experimentally. E.g. for a (real) gas, the pressure and the derivatives dp/dt and dU/dV vary smoothly with V and t. So you could measure them on a grid of points in V and t, and interpolate between these points. Then you numerically integrate the formula and you are done.
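
A sketch of that grid procedure in code (the van der Waals model and all constants below are stand-ins for real measured data; for that model (\partial U/\partial V)_T = a n^2/V^2 is a standard result):

```python
import numpy as np

# Synthetic "measurements" on a grid of empirical temperatures t at fixed V,
# generated from a van der Waals gas with t = T - 273.15 (all assumptions).
n, R, a, b = 1.0, 8.314, 0.364, 4.27e-5     # CO2-like constants, SI units
V = 1.0e-3                                  # integrate along this fixed volume

t_grid = np.linspace(0.0, 100.0, 201)       # empirical thermometer readings
T_true = t_grid + 273.15                    # hidden absolute temperature

p = n*R*T_true/(V - n*b) - a*n**2/V**2      # "measured" p(t, V)
dp_dt = np.gradient(p, t_grid)              # finite-difference (dp/dt)_V
dU_dV = np.full_like(t_grid, a*n**2/V**2)   # "measured" calorimetrically

log_ratio = np.trapz(dp_dt / (p + dU_dV), t_grid)
print(273.15 * np.exp(log_ratio))           # -> ~373.15, recovering T(t1)
```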
 
  • #57
DrDu said:
I am not an experimentalist, but I don't see any problems in principle with evaluating the integral we were talking about experimentally. E.g. for a (real) gas, the pressure and the derivatives dp/dt and dU/dV vary smoothly with V and t. So you could measure them on a grid of points in V and t, and interpolate between these points. Then you numerically integrate the formula and you are done.
Interesting. Actually, you could make constant-t steps to calculate dU/dV, followed by constant-V steps to calculate dp/dt. For the first kind, the variable t doesn't change, which does the trick.

As I said, I have several such equations (including the one I wanted first), but I wasn't aware of how practical your particular equation is. Thanks for pointing that out.

Now I can go back to the derivation and see where it comes from. Because temperature seems to pop out of nothing for a completely arbitrary system!
 
  • #58
As I already said, it is a consequence of the fact that the inverse of the absolute temperature is defined so as to be an integrating factor for dQ in a reversible process, i.e. dS=dQ/T(t). I'll sketch the derivation: for a reversible process, dQ=dU+p\,dV; now express dU=(\partial U/\partial t)_V dt +(\partial U/\partial V)_t dV. Use also that \partial/\partial t=\frac{dT}{dt}\,\partial/\partial T. Hence dS can be written as dS=A\,dt+B\,dV. For dS to be a total differential, \partial A/\partial V=\partial B/\partial t. Solve this condition for T(t).
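
Filling in the last step of that sketch (in the notation above): with A=\frac{1}{T}(\partial U/\partial t)_V and B=\frac{1}{T}\left[p+(\partial U/\partial V)_t\right], the mixed second derivatives of U cancel in the exactness condition \partial A/\partial V=\partial B/\partial t, leaving

\frac{1}{T}\frac{dT}{dt}=\frac{(\partial p/\partial t)_V}{p+(\partial U/\partial V)_t}

which integrates to exactly the formula quoted in post #51.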
 
  • #59
I mean, that's all just letters. You could take this definition and apply it to a herd of bison, with V being the number of bison and E the total age of the bison. Then you can define any social dynamics between groups of them and calculate things like "temperature"!
Two interacting groups of bison will have the same parameter "temperature".

That's what I find surprising. It's all just maths with hardly any assumptions, and in fact no assumptions that tie it to any particular physics.
 
  • #60
The modern formulation of the second law, named after Caratheodory, says that not all states in the neighbourhood of a given state can be reached from it without the exchange of heat. To reach these states, the system has to lose heat. This shows that heat, or some function of it, should be a function of state, which we name entropy. Heat itself is not a function of state, whence we need an integrating factor. That it has to be a function of the empirical temperature is easy to show; the rest is mathematics. Both the books by Buchdahl and by Landsberg, which I already cited, elaborate this view in detail (although I even prefer the original article by Caratheodory). That the efficiency of a thermodynamic cycle is limited by the absolute temperatures of the reservoirs follows as a corollary.
However, from classical thermodynamics it is not clear why, for an ideal gas, T=pV/nR.
 
  • #61
DrDu said:
To reach these states, the system has to loose heat.
What is heat? The best definition I came up with is "heat is what is left over after you subtract all known energy contributions", i.e.
dQ=dU + pdV

DrDu said:
This shows that heat or some function of it, should be a function of state, which we name entropy.
Are you sure? I think it doesn't yet show that heat is a one-dimensional variable of state. States could have different transition heats which do not add up to zero when you go around a cycle. Even a function of heat doesn't fix this. Just imagine some wacky system that has random transition heats between all pairs of states. Then no state variable is possible.
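
For what it's worth, the integrating-factor mechanism under discussion can be illustrated with a toy one-form (a standard calculus example, not tied to any physical system):

\delta\omega = dx + x\,dy\ \text{ is not exact }(\partial_y 1 = 0 \neq \partial_x x = 1),\qquad e^{y}\,\delta\omega = d(x e^{y})\ \text{ is exact}

A cycle integral of \delta\omega need not vanish, while that of e^{y}\delta\omega always does; the question being debated is whether \delta Q for an equilibrium system always admits such a factor, namely 1/T.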

DrDu said:
However, from classical thermodynamics it is not clear, why for an ideal gas T=pV/nR.
For that you need to restrict yourself to a particular system, of course.
 
  • #62
1. Yes, Caratheodory tried to avoid heat altogether in his work and always used the difference between the differential of internal energy and work in his paper instead.

2. Obviously heat itself is not a state variable, and I did say so, but some function of it, involving also temperature, namely dS=dQ/T, is the differential of a function of state. As for your wacky systems which perform random transitions between states, these don't exist in equilibrium thermodynamics.

I don't see what you are getting at. It is impossible to explain thermodynamics in five lines in a forum and at the same time cover all the loopholes. For that purpose there still exist those old-fashioned things called books. I think I already provided the relevant references.
 
  • #63
DrDu said:
2. Obviously heat itself is not a state variable, and I did say so, but some function of it, involving also temperature, namely dS=dQ/T, is the differential of a function of state. As for your wacky systems which perform random transitions between states, these don't exist in equilibrium thermodynamics.
Claiming they don't exist is easily said. You have to give a full proof that equilibrium thermodynamics requires some function of heat to be a variable of state. In general there is no mathematical reason why a function of heat should be a variable of state. You have to make many additional assumptions about the dynamics to deduce that.

DrDu said:
For that purpose there still exist those old-fashioned things called books. I think I already provided the relevant references.
You are just not aware that some of the things you believe are special cases derived by some clever people. You should think things through yourself and not just memorize book quotes.
 
  • #64
Gerenuk said:
Claiming they don't exist is easily said.

It also happens to be correct ...

You have to give a full proof that equilibrium thermodynamics requires some function of heat to be a variable of state. In general there is no mathematical reason why a function of heat should be a variable of state. You have to make many additional assumptions about the dynamics to deduce that.

No, this is taken care of by the laws of thermodynamics.

0th law: existence of the relative quantity temperature, a thermodynamic state variable that is equal for any systems in mutual thermal equilibrium.

3rd law: demonstrates that there must exist an absolute scale for the thermodynamic state variable temperature.

1st law: Conservation of the thermodynamic state variable "energy" --> since work is not a state variable, this implies the existence of another quantity that also has units of energy and accounts for "transferable energy not expressible as work". That is what we call "heat" in thermodynamics (as you basically said in an earlier post).

2nd law: The entropy of the universe must increase for any spontaneous process in nature. Many different equivalent statements of the 2nd law can be made, one of which is, "heat cannot spontaneously flow from a body at low temperature into a body at high temperature". Since temperature is a state function, and the 2nd law is not path-dependent, this means that there MUST BE a state function that is expressible as a function of heat.

Thus, these fundamental laws of nature provide support for DrDu's statements.

You are just not aware that some of the things you believe are special cases derived by some clever people. You should think things through yourself and not just memorize book quotes.

Hmmm ... you may not care, but I for one have found DrDu's statements to be much more well-formed, cogent, and convincing than any of your own arguments. You need to learn what is in the books before you can critically analyze it. There is no general conspiracy to only teach or write about "special cases" in scientific texts ... most authors are quite good about stating their assumptions up front. As far as I can tell, the derivations DrDu has provided are *general*, and require only the assumptions he noted in his posts (i.e. reversibility).
 
  • #65
What is heat? The best definition I came up with is "heat is what is left over after you subtract all known energy contributions",

Yes, that is correct. Thermodynamics has to be understood as a phenomenological description within which you cannot gain a deeper understanding. It merely postulates the existence of heat, entropy, temperature etc.

The only way to really understand this topic is to adopt the information-theoretical point of view. I.e. you have a physical system that can only be described exactly by specifying an astronomical amount of information. You then want to describe the system in an effective way by keeping only a very limited amount of information. This can, of course, only be done by making some assumptions about the statistical behavior of the system (e.g. ergodicity etc.)

The fact that this is the only correct way to understand the topic can readily be seen from Maxwell's Demon thought experiment.
 
  • #66
SpectraCat said:
1st law: Conservation of thermodynamic state variable "energy" --> since work is not a state variable, this implies the existence of another quantity that also has units of energy, and accounts for "transferrable energy not expressible as work". That is what we call "heat" in thermodynamics (as you basically said in an earlier post).
Again, here are many assumptions that might be true for the most ordinary thermodynamic gas, but are not part of the general definition. Equal energy doesn't have to correspond to identical states. Just because there is work and energy, it doesn't yet follow that energy minus work is a variable of state. Special conditions are needed to ensure that. Or do you have a mathematical proof in your notes?

SpectraCat said:
2nd law: The entropy of the universe must increase for any spontaneous process in nature.
I suppose you have never tried to define "spontaneous"? And here you see why I claim that some people are stuck with formulations from books, and do not really know what they mean.
The word "spontaneous" contains many different assumptions that make the 2nd law valid for only a very small class of homogeneous physical objects like gases or spin systems.

SpectraCat said:
Hmmm ... you may not care, but I for one have found DrDu's statements to be much more well-formed, cogent, and convincing that any of your own arguments.
Maybe that's because whenever someone recites what you have seen in your own books, you find it well-formed. I can copy a sentence from Weinberg and you will honor me. But if something is not a sentence from an undergrad book you might find it confusing.
For example, some people here have confirmed that the entropy of a system will for sure decrease at some point in the very far future. That's not written in introductory undergrad books. If you believe that entropy always strictly increases, then you might not have put much thought into its origins.

SpectraCat said:
There is not a general conspiracy to only teach or write about "special cases" in scientific texts ... most authors are quite good about stating their assumptions out front.
That doesn't imply that most students are good at picking them up.

SpectraCat said:
As far as I can tell, the derivations DrDu has provided are *general*, and require only the assumptions he noted in his posts (i.e. reversibility).
Surely not. For example, the system might have additional parameters, so that equal V and equal E do not have to correspond to identical states. There are many hidden assumptions he is missing.
 
  • #67
Gerenuk said:
Again, here are many assumptions that might be true for the most ordinary thermodynamic gas, but are not part of the general definition. Equal energy doesn't have to correspond to identical states. Just because there is work and energy, it doesn't yet follow that energy minus work is a variable of state. Special conditions are needed to ensure that. Or do you have a mathematical proof in your notes?

I never said energy minus work is a variable of state ... that is clearly incorrect. Heat is not a state function, so there is no inconsistency in my statements. As far as I can see, there is only one "special condition" that I neglected to mention, which is that the "energy = work + heat" formulation requires that there be no exchange of mass (i.e. particles) with the surroundings. If mass is exchanged, you would need to allow for changes in the energy due to the chemical potential, as well as those due to exchanges of work and heat.

I suppose you have never tried to define "spontaneous"? And here you see why I claim that some people are stuck with formulations from books, and do not really know what they mean.

A workable definition of "spontaneous" in the current context (i.e. thermodynamics of macroscopic systems) is: a spontaneous process is one that moves the system and surroundings closer to thermodynamic equilibrium. That is a completely general definition that is free of any assumptions outside of the zeroth and first laws of thermodynamics. If you have a different definition in this context, I would like to hear it.

I agree that in the context of the fluctuation theorem, there really is no good definition of spontaneous ... all possible changes in the system can be represented as fluctuations, with various weights representing the probability of observing such a change. However, the fluctuation theorem is also consistent with the macroscopic version of the second law that I mentioned earlier, because for any macroscopic system, the integrated probability of entropy-increasing fluctuations is always higher than for entropy-decreasing fluctuations. Thus the net evolution of an isolated system integrated over time will tend towards higher entropy. Furthermore, the probability of observing spontaneous entropy-decreasing processes decreases exponentially with the size of the system, so for macroscopic systems, the probability of observing entropy-decreasing processes is negligible, leading to the common statement of the 2nd law I gave in my earlier post.

The word "spontaneous" contains many different assumption that make the 2nd law valid for only a very small class of homogeneous physical objects like gases or spin state.

Wait, did you just claim that the 2nd law of thermodynamics is restricted only to a few "special cases"? That is certainly not a mainstream view, and requires some detailed support from you. I am unaware of any reputable scientific work that makes such a claim.

Maybe that's because whenever someone recites what you have seen in your own books, you find it well-formed. I can copy a sentence from Weinberg and you will honor me. But if something is not a sentence from an undergrad book you might find it confusing.

Ok, you seem to be deliberately trying to provoke those of us debating with you by using statements like the one above. What purpose does that serve? You have made many unsupported (and apparently unsupportable) statements, using the excuse that others "wouldn't understand your explanations". Please. Get off your high horse and support your positions ... I will let you know if I have questions, and I am sure others will do the same.

For example, some people here have confirmed that the entropy of a system will for sure decrease at some point in the very far future.

Can you please provide a link to support that? I am aware of the Loschmidt paradox, which seems to imply that the entropy of the universe must have been higher in the past than it is now, but I have not seen any statements that say the entropy will definitely decrease in the future. Unless you are talking about the fluctuation theorem? If so, then I believe your statement above is not generally held to be correct for macroscopic systems.

Surely not. For example, the system might have additional parameters, so that equal V and equal E do not have to correspond to identical states. There are many hidden assumptions he is missing.

That statement is vague, unhelpful, and ultimately unconvincing. To which of DrDu's posts are you referring in the above statement? Please provide an example of how "the system might have additional parameters, so that equal V and equal E do not have to correspond to identical states", so I can understand what you are trying to say.
 
  • #68
Here is an explanation for some of the things you asked:
Tell me how to apply thermodynamics to the moon orbiting the earth! You might say it doesn't make sense since it is a different field of physics. I agree, it doesn't make sense, because the numerous presuppositions of thermodynamics don't apply to this physical system. Now can you say, mathematically rigorously, which ones? Have you ever thought about such a question, one which is not in introductory books?
That's what I mean when I say you should start thinking for yourself. I do not need to hear about the Loschmidt paradox; I'd rather hear the "SpectraCat Idea", which is your own argumentation from known physical laws.
I think you agreed that increasing entropy is only a very, very likely probabilistic statement? So a sharp decrease in entropy is not impossible and will occur at some point. In fact it is bound to:
https://www.physicsforums.com/showthread.php?t=387938
Maybe the people who answered there can explain it better. No one else objected in that thread, because it is trivially known to physicists that at an unimaginably distant future time all gas molecules will gather in one corner.
No good thermodynamics book claims that the laws are applicable to just anything. It is only the popular press that transfigures the second law and says that "disorder of any kind must increase". That's why thermodynamics is valid for only very few special problems with enough homogeneity and randomness. Luckily, engineering consists mostly of these special cases (gases, ...). Think back to the orbiting moon.
Or do you know a derivation of S=-\sum p_i\ln p_i? You apply entropy to a system where the microscopic laws are already known. Therefore it is not admissible to postulate a new law, like this definition or even the second law of TD. It has to be derived with logic. There is a quite simple derivation, which also points out when you can apply this definition.
Now if DrDu were to start mathematically deriving all the laws from some minimal axioms, then he would notice where the assumptions come in. At some point, for example, he would say that there are variables of state because the closed contour integral vanishes. This he can only do if he has included all the parameters of the system. That's what I mean by "maybe E and V are not the only parameters". It could be something like a chemical potential, or a physical property of the gas which is changed by an external magnetic field.
Here is another example where the assumptions of thermodynamics reveal themselves:
http://www.aip.org/png/html/maxwell.html

And as long as there are surprises like this, or difficulties with knowing how to apply it to something as trivial as two orbiting point particles, you do not know all the assumptions of thermodynamics.
 
Last edited by a moderator:
  • #69
Gerenuk said:
<snip>
Tell me how to apply thermodynamics to the moon orbiting the earth! <snip>

I had to take up this challenge...

Assuming the earth-moon system is in equilibrium, we apply the first law of thermostatics:

\Delta E=\Delta Q-\Delta W

We then assume that the earth-moon system is adiabatic and closed, and that the moon moves at right angles to the gravitational force. The first law of thermostatics then becomes

\Delta E=0

The energy is given by:

E=\frac{1}{2}mv^2-\frac{G m m_e}{r}

where we assume the Earth does not move (m \ll m_e).

Thus, \frac{1}{2}mv^2-\frac{G m m_e}{r} is constant, and you will find that this is the usual result from orbital mechanics.

Here, the assumptions are laid bare: equilibrium, closed, adiabatic.
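
For anyone who wants to check the bookkeeping numerically, here is a minimal sketch (the circular-orbit initial conditions, step size, and step count are illustrative choices, not from the post):

```python
import numpy as np

# Velocity-Verlet integration of a test mass around a fixed Earth,
# verifying that E = 0.5*m*v^2 - G*m*m_e/r stays constant, as claimed.
G, m_e, m = 6.674e-11, 5.972e24, 7.35e22    # SI; moon treated as test mass
r0 = 3.844e8                                 # ~ Earth-moon distance (m)
v0 = np.sqrt(G * m_e / r0)                   # circular-orbit speed

pos = np.array([r0, 0.0])
vel = np.array([0.0, v0])
dt = 600.0                                   # 10-minute steps

def accel(p):
    return -G * m_e * p / np.linalg.norm(p)**3

def energy(p, v):
    return 0.5 * m * (v @ v) - G * m * m_e / np.linalg.norm(p)

E0 = energy(pos, vel)
for _ in range(5000):                        # ~35 days, more than one orbit
    a = accel(pos)
    pos = pos + vel * dt + 0.5 * a * dt**2
    vel = vel + 0.5 * (a + accel(pos)) * dt
print((energy(pos, vel) - E0) / E0)          # tiny relative drift, ~0
```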
 
  • #70
Andy Resnick said:
Here, the assumptions are laid bare: equilibrium, closed, adiabatic.
How do you justify adiabatic and equilibrium? For that you need a notion of entropy. So how do you define entropy here?
 
  • #71
I justify adiabatic because there is no flow of heat from the universe to the earth-moon system (or vice versa). I justify equilibrium because the earth-moon system is observed to be stable and unchanging for a long period of time. I justify closed because there is no mass flow into or out of the earth-moon system. Any of those assumptions can be relaxed very easily in the context of thermodynamics, and if you wish to solve that problem, I encourage you to do so - you may learn something useful.
 
  • #72
You are missing out on a whole lot of applications of thermodynamics if, in your view, heat comes only in its simplest version. Of course, if you define Q=0 for any process where you don't have a more abstract idea of heat, then thermodynamics is trivial but useless.

However, scientists can extend thermodynamics in an abstract way. A first little step is to examine bouncing balls:
http://www.aip.org/png/html/maxwell.html
They don't really care about the actual temperature of the sand, right? They extend the concepts of heat and entropy. You, however, would say: "The sand does not exchange real heat and the energy is constant. What's the deal?"
But you can also do calculations with a deck of cards or just any system imaginable. For cards, of course, no one wants to speak about the actual temperature that you would feel with your fingers.
So it requires some more thought to make a useful definition of entropy and heat for arbitrary systems.
It's easy to do with S=\ln\Omega. But you have to think about what a useful form of \Omega is.
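
For instance, a toy version for the marbles mentioned in this thread (my own illustration; taking \Omega to be the number of distinguishable orderings of r red and k black marbles is a modelling choice about what counts as a microstate):

```python
from math import comb, log

# Toy entropy S = ln(Omega) for a row of r red and k black marbles,
# where Omega counts the distinguishable arrangements.
def entropy(r, k):
    return log(comb(r + k, r))

print(entropy(50, 50))   # ~66.8: mixed colors, many arrangements
print(entropy(100, 0))   # 0.0: all one color, a single arrangement
```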
 
Last edited by a moderator:
  • #73
Gerenuk said:
You are missing out on a whole lot of applications of thermodynamics if, in your view, heat comes only in its simplest version. Of course, if you define Q=0 for any process where you don't have a more abstract idea of heat, then thermodynamics is trivial but useless.

However, scientists can extend thermodynamics in an abstract way. A first little step is to examine bouncing balls:
http://www.aip.org/png/html/maxwell.html
They don't really care about the actual temperature of the sand, right? They extend the concepts of heat and entropy. You, however, would say: "The sand does not exchange real heat and the energy is constant. What's the deal?"

I honestly have no idea what you are trying to say. Eggers didn't "extend" anything ... he used a medium (sand ... actually 1 mm spherical plastic beads) which gives a reasonable approximation to the hard-sphere collisions of a gas (in fact his system is called a "granular gas"), in order to visualize the partitioning of energy between translational and internal degrees of freedom. The sand on one side has more translational energy, but is internally cooler, whereas on the other side, inelastic collisions have caused some of the translational energy to be transferred to the internal degrees of freedom of the sand grains ... thus those sand grains have absorbed energy in the form of heat, and have less translational energy. Once this process starts, it will tend to continue, because the collection of slower grains on one side is more effective at randomizing the translational energy of a faster grain that may come through the hole. Energy *certainly* is not constant in this case ... they are always shaking the box during the experiments, which means energy is transferred into the system (box and sand grains) from the surroundings. Everything in Eggers' experiment and simulation is completely consistent with the normal definitions of heat, temperature and entropy according to macroscopic thermodynamics, and the observed phenomena are consistent with the second law.

But you can also do calculations with a deck of cards or just any system imaginable. For cards, of course, no one wants to speak about the actual temperature that you would feel with your fingers.
So it requires some more thought to make a useful definition of entropy and heat for arbitrary systems.
It's easy to do with S=\ln\Omega. But you have to think about what a useful form of \Omega is.

The definitions of heat and entropy are consistently applied across all of thermodynamics; there is no need to "define" them for each system. The statistical models describing the partitioning of internal energy among various degrees of freedom that you seem to be so fond of are useful for understanding *why* the observed laws of thermodynamics are correct for a given system. However, contrary to what you have been claiming, one can certainly apply those observational laws successfully without knowing the detailed microscopic behavior of a given system. That is the beauty of statistical thermodynamics: it converges smoothly to macroscopic thermodynamics in the limit of large systems. Thus the statistical 2nd law (i.e. the fluctuation theorem) is completely analogous to the thermodynamic second law for macroscopic systems.

Finally, despite your claims to the contrary, I do not believe that there are any credible scientific sources who have demonstrated thermodynamic second-law violations in macroscopic systems. The examples you have mentioned so far involve spontaneous decreases of entropy for isolated systems *of small size*. These systems are therefore subject to the entropy-decreasing fluctuations predicted by the fluctuation theorem; however, the second law in such contexts has a more general form. It simply says that the time-integrated probability of entropy-increasing fluctuations is always greater than the time-integrated probability of entropy-decreasing fluctuations. Thus, even if all the gas molecules in your earlier example congregated momentarily in the corner of the box, in the next instant they would disperse again, and the time-integrated entropy would increase or remain constant (if it had already reached its maximum value).
 
Last edited by a moderator:
  • #74
SpectraCat said:
I honestly have no idea what you are trying to say. Eggers didn't "extend" anything ... he used a medium
I wasn't trying to discuss that particular effect. The point is that thermodynamics can be applied to anything if you know how. Do you know how to apply thermodynamics to a set of red and black marbles? You are contradicting yourself if you don't know that, but claim that all of the universe obeys the second law.

SpectraCat said:
The definitions of heat and entropy are consistently applied across all of thermodynamics; there is no need to "define" them for each system.
You are probably talking about your narrow notion of thermodynamics, where it's about heat that can be measured by a mercury thermometer. Do you know how to deal with the marbles, then?

SpectraCat said:
Finally, despite your claims to the contrary, I do not believe that there are any credible scientific sources who have demonstrated thermodynamic second law violations in macroscopic systems.
Did you understand the thread link I posted above?
Of course there is no scientific work showing that TD is violated for tram schedules, decks of cards, ... That's because no one has claimed that the second law applies strictly to anything beyond the realm of "common sense heat" measurable by mercury thermometers. Likewise, there is no scientific work proving that strawberries cannot be used as rocket fuel.

Here is a question to you:
Do you know how to derive
S=-\sum p_i\ln p_i
? If so, then you can check for the assumptions made in this proof. These assumptions hardly apply to anything in the real world.

SpectraCat said:
Thus, even if all the gas molecules in your earlier example congregated momentarily in the corner of the box, in the next instant they would disperse again, and the time-integrated entropy would increase or remain constant (if it had already reached its maximum value).
How long do you want to time-integrate? If the particles gather in one corner over and over again, what's the point of saying they don't?
 
  • #75
Gerenuk said:
I wasn't trying to discuss that particular effect.

Then why did you bring up that example and link to the article?

The point is that thermodynamics can be applied to anything if you know how. Do you know how to apply thermodynamics to a set of red and black marbles?

I assume here that you are talking about statistics rather than thermodynamics; they are not quite the same thing, but you don't seem to realize that. I certainly understand how to apply statistics to an ensemble of red and black marbles.

You are contradicting yourself if you don't know that, but claim that all of the universe obeys the second law.

What on Earth are you talking about?

You are probably talking about your narrow notion of thermodynamics, where it's about heat that can be measured by a mercury thermometer. Do you know how to deal with the marbles, then?

Oh goody .. more insults from you ... and another incorrect statement. Heat is not what thermometers measure.

Did you understand the thread link I posted above?

Yes, I understand it just fine .. I have referred to it several times, and explained in some detail why it does not have the significance you are ascribing to it.

Of course there is no scientific work showing that TD is violated for tram schedules, decks of cards, ... That's because no one has claimed that the second law applies strictly to anything beyond the realm of "common sense heat" measurable by mercury thermometers.

Again you say you think thermometers are used to measure heat ... hmmm. Oh, and people have certainly claimed that the second law applies to everything in the universe ... it is widely held to be one of the most fundamental physical laws, and the least likely to be broken. The fact that you don't realize this is rather strange ... if you could find or construct a system that reliably violated the second law, you could make yourself very rich providing free energy to the world.

Likewise, there is no scientific work proving that strawberries cannot be used as rocket fuel.

:eek:

Here is a question to you:
Do you know how to derive
S=-\sum p_i\ln p_i
? If so, then you can check for the assumptions made in this proof. These assumptions hardly apply to anything in the real world.

The above is an equation from information theory, and provides the definition of information entropy. As written, it has little to do with physics in particular, just the statistics of abstract systems, which I guess is what you are trying to say. However, if you put the Boltzmann constant out front and sum over the energy states of a physical system, with the p_i's as the occupation probabilities of those states, then you have the Gibbs entropy from statistical mechanics. I agree that the proof is purely mathematical, but most proofs are, so I don't understand your point. What gives these equations their significance is the correlation between the variables in the equation and "real entities" that have meaning in the physical world. Yes, the same statistics work for a bag containing marbles of two colors and a vial containing two atomic gases (assuming the temperature is low enough that only one electronic energy state is populated). So what?

How long do you want to time-integrate? If the particles gather in one corner over and over again, what's the point of saying they don't?

The time interval needs to be long enough so that the event in question has a reasonable chance of occurring. For your gas-molecule example, if you are talking about a macroscopic sample (say 10^23 molecules), the probability that they will collect spontaneously in the corner of a box is infinitesimal, so you would need to integrate for an awfully long time, much longer than the lifetime of the universe, in order to have any chance of observing it once, let alone multiple times. Again, this whole exercise is an example of applying the fluctuation theorem in a situation where it doesn't really apply, since we are talking about a macroscopic sample. Furthermore, if you are using the FT, then you should use the version of the second law that is consistent with the FT, as I already explained. So there is no contradiction, and the second law is never violated.
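
The order of magnitude is easy to make explicit (a back-of-envelope estimate, assuming independent molecules and asking only that they all sit in one half of the box, which is already far weaker than "in the corner"):

```python
import math

# Probability that all N independent molecules are in one half of the box
# at a given instant is 2**(-N); print its base-10 logarithm.
N = 1e23
print(-N * math.log10(2))   # ~ -3.0e22, i.e. P ~ 10**(-3e22)
```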
 
Last edited:
  • #76
SpectraCat said:
I assume here that you are talking about statistics rather than thermodynamics; they are not quite the same thing, but you don't seem to realize that. I certainly understand how to apply statistics to an ensemble of red and black marbles.
That's what I was saying all along. You have no clue that thermodynamics can be applied to completely arbitrary abstract systems. Some approach similar to
http://arxiv.org/abs/math-ph/0003028
where it is left completely open whether the theory is applied to a gas or to a set of marbles.
But as I do not get the impression that you are ready to acquire new knowledge, this case is closed for me.
 
  • #77
Gerenuk said:
That's what I was saying all along. You have no clue that thermodynamics can be applied to completely arbitrary abstract systems.

Nothing I have written is consistent with the statement you make above. I am perfectly aware that this is the case. Read my posts carefully, and you will see that this is true.

Some approach similar to
http://arxiv.org/abs/math-ph/0003028
where it is left completely open whether the theory is applied to a gas or to a set of marbles.
But as I do not get the impression that you are ready to acquire new knowledge, this case is closed for me.

I have chosen science as a career precisely because I want to acquire new knowledge. I started debating here with you (and put up with your steady stream of arrogant, insulting statements) because I thought you might have something interesting to say, even if you seemed a bit confused at times. In fact, I do appreciate the link above; that paper is very interesting, and it seems that I will certainly learn something new from it once I have time to analyze it in detail. However, I do want to call your attention to a sentence from the second page:
No exception has ever been found to the second law of thermodynamics—not even a tiny one. Like conservation of energy (the "first" law) the existence of a law so precise and so independent of details of models must have a logical foundation that is independent of the fact that matter is composed of interacting particles.

That passage is basically what I, and Andy Resnick, and DrDu, and perhaps others have tried to tell you. The second law holds for all cases, even though we may not understand all of the details about why it holds.

You seem to have the bizarre point of view that the standard notions of classical and statistical thermodynamics learned from books are somehow useless, or at least less valuable than the "new" approaches you have been advocating here. To the contrary, the reason that I can read and appreciate the article you posted is because I have a decently thorough understanding of classical and statistical thermodynamics, as it is taught in textbooks.
 
  • #78
Why do cosmologists say that the universe was hotter in the past? How do they know?
What about the entropy of the universe?
 
  • #79
We have to watch out for circular reasoning - that's why the laws of thermodynamics are ordered the way they are. You can't define temperature in terms of entropy unless entropy is operationally defined (i.e. a measurement process is specified) in a way that is independent of the concept of temperature - and that's not likely.

Some things have to be taken as given a priori. We have to assume that we know the mechanical parameters that fully describe the state of a system: P, V, and n in the case of a gas. We have to assume that we can thermally, mechanically, and substantially isolate a system, and that we can thermally, mechanically, and substantially connect two or more systems. Let's ignore the transfer of substance and just deal with thermal and mechanical aspects (i.e. assume all systems are substantially isolated). We have to assume that we know when a totally isolated system is in equilibrium: i.e. when its mechanical parameters stop changing after a long period of time. We have to assume we can make state transitions reversibly, i.e. very slowly.

The zeroth law says that you can label every set of systems in thermal equilibrium with each other with a unique label: if the labels match, they are in thermal equilibrium; if not, they are not. These labels can be totally arbitrary, they don't even have to be numbers, as long as they are unique for each set of systems in equilibrium.

The first law states that if you mechanically connect a system to something that does work (-PdV) reversibly on the system, but thermally isolate it, then you can assign an energy change (dU) to the system, and the total energy is conserved. You can use the mechanical parameters to come up with an energy equation of state, U(P,V), with dU=-PdV along such adiabatic paths in the case of a gas. If you now thermally connect the system, you define heat as the difference between the change in internal energy as measured by U(P,V) and the (reversible) work done on the system. So now heat \delta Q is defined: \delta Q=dU+PdV

The second law is then used to define absolute thermodynamic temperature by using e.g. a Carnot cycle. A Carnot cycle can be described without the use of temperature or entropy, needing only the concepts of work, heat, and energy from the first law. It needs no ideal gas; it works for any system. The absolute thermodynamic temperature is then defined as a function of the labels provided by the zeroth law, and its properties are demonstrated using the second law as applied to the Carnot cycle. Now, having defined temperature, you can get to work on the definition of entropy. You can also experimentally observe that in the limit of low density, gases obey the law that PV/n always takes on the same value at a given temperature, and that this value is proportional to the thermodynamic temperature, thus defining an ideal gas. Now you can use the ideal gas as a thermometer to measure temperature and to calibrate other thermometers.
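
For reference, the Carnot-based definition sketched above is usually written as (standard textbook form, with Q_1 and Q_2 the heats exchanged reversibly with the hot and cold reservoirs):

\frac{T_1}{T_2}\equiv\frac{Q_1}{Q_2},\qquad \eta_{Carnot}=1-\frac{T_2}{T_1}

The ratio of heats fixes only temperature ratios; one fixed point (e.g. the triple point of water at 273.16 K, mentioned at the top of this thread) then pins down the scale.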
 
Last edited:
  • #80
Gerenuk said:
How do you justify adiabatic and equilibrium? For that you need a notion of entropy. So how do you define entropy here?

So this thread is open again?
A system is adiabatically isolated if it is not influenced by its surroundings (other than by work done on it). So it doesn't matter whether the system is brought to the equator or to the north pole. This definition doesn't presuppose any notion of entropy.
The system is in equilibrium if the macro variables which describe it don't change in time.
I agree that it is a difficult question to decide which variables are the thermodynamically relevant ones for a system. There are also systems where one eventually has to modify the definitions, e.g. the thermodynamics of a star, i.e. a big gravitating object, or even non-equilibrium phenomena.
 
  • #81
Rap said:
The first law states that if you mechanically connect a system to something that does work (-PdV) reversibly on the system, but thermally isolate it, then you can assign an energy change (dU) to the system, and the total energy is conserved. You can use the mechanical parameters to come up with an energy equation of state, U(P,V), with dU=-PdV along such adiabatic paths in the case of a gas. If you now thermally connect the system, you define heat as the difference between the change in internal energy as measured by U(P,V) and the (reversible) work done on the system. So now heat \delta Q is defined: \delta Q=dU+PdV

For the definition of internal energy it is decisive that the process, while adiabatic, does not need to be reversible. For the same reason I would also generalize the definition of heat to Q=\Delta U-W, which holds also if the process is irreversible, and not only for infinitesimally neighbouring equilibrium states.
The definition of U also hardly needs a theorem of its own today, as we can in principle measure it directly: the internal energy of a system equals mc^2, so we could in principle measure it using a balance.
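
An order-of-magnitude illustration of why that is only "in principle" (my numbers, not DrDu's): the mass equivalent of the heat needed to warm a kilogram of water by 100 K.

```python
# Mass change Delta m = Q / c^2 for the heat warming 1 kg of water by 100 K.
c = 2.998e8                  # speed of light, m/s
Q = 4186.0 * 1.0 * 100.0     # J, with specific heat ~4186 J/(kg K)
print(Q / c**2)              # ~4.7e-12 kg -- far below any balance's resolution
```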
 
Last edited:
  • #82
I was just working on thermodynamics and decided to respond without checking the date. Anyway, I agree with everything you said, but I think that to begin with, you have to be able to specify the state without using the quantities defined in the laws. Heat, internal energy, temperature, entropy, etc. That means the mechanical variables only, the ones used to define work: P and V in the case of a gas.
 
  • #83
Rap said:
I was just working on thermodynamics and decided to respond without checking the date. Anyway, I agree with everything you said, but I think that to begin with, you have to be able to specify the state without using the quantities defined in the laws.
I agree on that.
Heat, internal energy, temperature, entropy, etc. That means the mechanical variables only, the ones used to define work: P and V in the case of a gas.
You could in principle also use e.g. the refractive index as one of the coordinates, which is not directly a mechanical coordinate. So there is still considerable freedom.
 