Sackur-Tetrode - why Planck's constant?

In summary, the Sackur-Tetrode equation gives the absolute entropy of an idealized monatomic gas in terms of its macroscopic parameters (volume, particle number, and temperature). The value of Planck's constant, h, enters as an additive term per particle, and it matters when comparing the entropy of free particles against other possible configurations, such as how readily particles end up free after spontaneous reactions. However, in thermodynamics only differences in entropy between two states are measurable and defined, while in statistical physics there are various, not always compatible, definitions of entropy.
  • #1
Rap
Looking at the Sackur-Tetrode equation for the entropy of a mono-atomic ideal gas:

[tex]S=kN\left[\log\left(\frac{V (2\pi m k T)^{3/2}}{N h^3}\right)+\frac{5}{2}\right][/tex]

S is entropy, V is volume, N is number of particles, T is temperature, m is mass of a particle, k is Boltzmann's constant and h is Planck's constant.

What is the point of Planck's constant h? Why couldn't it be just any constant with the same units as h? I mean, you cannot measure absolute entropy, only differences in entropy, and when you do that, the value of h does not matter.
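For instance, taking the same N particles between two states [itex](V_1,T_1)[/itex] and [itex](V_2,T_2)[/itex], the equation above gives

[tex]\Delta S = S_2-S_1=kN\left[\log\left(\frac{V_2}{V_1}\right)+\frac{3}{2}\log\left(\frac{T_2}{T_1}\right)\right][/tex]

and the h (along with the [itex]2\pi m k[/itex] and the 5/2) has dropped out completely.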
 
  • #2
The reason we have Planck's constant here is the uncertainty principle [itex]\Delta x\,\Delta p \ge \hbar/2[/itex].

Rap said:
I mean, you cannot measure absolute entropy, only differences in entropy, and when you do that, the value of h does not matter.

This is not true for idealized gases. For an idealized monatomic gas (in which the particles are indistinguishable and have zero internal degrees of freedom) you can calculate absolute entropy relative to the macroscopic parameters of velocity and temperature. That is what the Sackur-Tetrode entropy measures.
 
  • #3
The role played by h is in adjudicating the competition between the two terms in the parentheses. Hence you could equally ask what the 5/2 is doing in there. There's a T-dependent term, a V/N dependent term, and a constant term, all multiplied by N, so they are each a contribution to the entropy per particle. That matters, because if each particle has access to fewer states (say by raising the value of h), they are less likely to end up as free particles from some spontaneous reaction. For example, if the particles were electrons, they'd be less likely to ionize if h was larger.
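Spelled out per particle (formally splitting the logarithm),

[tex]\frac{S}{kN}=\log\frac{V}{N}+\frac{3}{2}\log\left(2\pi mkT\right)-3\log h+\frac{5}{2}[/tex]

so h enters only through the additive constant [itex]-3k\log h[/itex] per particle, on the same footing as the 5/2: it fixes where the zero point of the per-particle entropy sits when you actually count states.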
 
  • #5
Rap said:
I mean, you cannot measure absolute entropy, only differences in entropy, and when you do that, the value of h does not matter.
No, entropy can be measured absolutely. By third law, S=0 at absolute zero. Absolute values of S at other temperatures can be obtained by integrating C_v/T over T.
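Explicitly, heating at constant volume from T = 0,

[tex]S(T)=\int_0^{T}\frac{C_v(T')}{T'}dT'[/tex]

with an extra [itex]\Delta H_{tr}/T_{tr}[/itex] added for each phase transition crossed along the way.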
 
  • #6
IttyBittyBit said:
The reason we have Planck's constant here is the uncertainty principle [itex]\Delta x\,\Delta p \ge \hbar/2[/itex].

This is not true for idealized gases. For an idealized monatomic gas (in which the particles are indistinguishable and have zero internal degrees of freedom) you can calculate absolute entropy relative to the macroscopic parameters of velocity and temperature. That is what the Sackur-Tetrode entropy measures.

You can calculate the absolute entropy, but can you experimentally measure it?

Ken G said:
The role played by h is in adjudicating the competition between the two terms in the parentheses. Hence you could equally ask what the 5/2 is doing in there. There's a T-dependent term, a V/N dependent term, and a constant term, all multiplied by N, so they are each a contribution to the entropy per particle. That matters, because if each particle has access to fewer states (say by raising the value of h), they are less likely to end up as free particles from some spontaneous reaction. For example, if the particles were electrons, they'd be less likely to ionize if h was larger.

Yes, the constants [itex]2\pi[/itex], k, and 5/2 would be superfluous under the same argument, so my question extends to them as well. I had not been thinking about anything but a single-species gas and the Sackur-Tetrode equation, but you have introduced multiple types; I will have to think about that. I don't immediately see how the h in the Sackur-Tetrode equation relates to the h for ionizing atoms.

I think another way of asking my question is "can you (in principle) measure h thermodynamically, by thermodynamic transformations and measurements of a massive ideal gas confined to a temperature regime far from quantum effects (e.g. room temperature)?" I can't think of a way, hence my question. Quantum effects occur when that term in the logarithm is of order 1 or less; that is when the Sackur-Tetrode equation screws up. Maybe it's just a handy way of telling you when things screw up?
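Just to put rough numbers on that criterion (my own quick estimate, assuming helium at about 300 K and 1 atm):

[code]
# Rough estimate of the Sackur-Tetrode logarithm argument V(2*pi*m*k*T)^(3/2)/(N*h^3)
# for helium near room temperature and atmospheric pressure (approximate values).
import math

h = 6.626e-34    # Planck's constant, J s
k = 1.381e-23    # Boltzmann's constant, J/K
m = 6.646e-27    # mass of a helium atom, kg
T = 300.0        # temperature, K
P = 101325.0     # pressure, Pa (1 atm)

lam = h / math.sqrt(2 * math.pi * m * k * T)  # thermal de Broglie wavelength, ~5e-11 m
n = P / (k * T)                               # number density N/V, ~2.4e25 m^-3
arg = 1.0 / (n * lam ** 3)                    # argument of the logarithm, ~3e5
S_per_particle = math.log(arg) + 2.5          # entropy per particle in units of k, ~15
print(lam, n, arg, S_per_particle)
[/code]

So at room temperature the argument is of order [itex]10^5[/itex], nowhere near 1, which is consistent with quantum effects being negligible there.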
 
  • #7
DrDu said:
No, entropy can be measured absolutely. By third law, S=0 at absolute zero. Absolute values of S at other temperatures can be obtained by integrating C_v/T over T.

In order to measure entropy absolutely, we have to first define it absolutely.

In thermodynamics, only differences of entropy between two states are measurable and defined.
So in thermodynamics, there is no point in looking for the "correct" value of the entropy of a state.

In statistical physics, statistical entropy is a different thing. There are many incompatible definitions. We can define it for some model and fix it to be zero at T = 0 and then calculate it for some T > 0. The calculation gives definite value of statistical entropy for the equilibrium state of the model. But we can define statistical entropy otherwise as well and get a different number. It is just a convention.
 
  • #8
DrDu said:
No, entropy can be measured absolutely. By third law, S=0 at absolute zero. Absolute values of S at other temperatures can be obtained by integrating C_v/T over T.

Ok, but you cannot measure anything at absolute zero, so the best you can do is get close and then extrapolate to T=0K, hoping that the absolute entropy is in fact practically linear over the regime of extrapolation.

Jano L. said:
In order to measure entropy absolutely, we have to first define it absolutely.

In thermodynamics, only differences of entropy between two states are measurable and defined.
So in thermodynamics, there is no point in looking for the "correct" value of the entropy of a state.

In statistical physics, statistical entropy is a different thing. There are many incompatible definitions. We can define it for some model and fix it to be zero at T = 0 and then calculate it for some T > 0. The calculation gives definite value of statistical entropy for the equilibrium state of the model. But we can define statistical entropy otherwise as well and get a different number. It is just a convention.

I think that amounts to saying the existence of h in the Sackur-Tetrode equation is just a convention.

The Sackur-Tetrode equation does not obey the third law: it gives zero entropy at a certain small T>0 K and negative infinity at T=0 K. However, it predicts entropy differences accurately for temperatures much larger than this critical temperature. But then, if its only validity is in predicting entropy differences, why is h necessary? The logarithm of h subtracts out when calculating entropy differences, so any constant with units of action would give the same answer.

I'm starting to think that the existence of h in the Sackur-Tetrode equation serves only as a convenient way of expressing when the Sackur-Tetrode equation goes awry. It's when the term in the logarithm is of order unity or less. If you use 10,000 h as the constant, then (since h enters cubed) it goes awry when the term in the logarithm is about [itex]10^{-12}[/itex] or less. Otherwise, it gives identical results over its region of applicability.
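Concretely, replacing h by Ch in the equation just shifts the result by a constant,

[tex]S_{Ch}=S_{h}-3kN\log C[/tex]

so every entropy difference at fixed N is unchanged; only the point at which the logarithm's argument reaches order one gets relabeled.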

I think entropy is defined in statistical mechanics unambiguously as k times the number of microstates that could give rise to the thermodynamic macrostate. Boltzmann's famous [itex]S=k\ln(W)[/itex]. It's the definition of the microstate where all the trouble occurs. I expect this means that if I do a quantum mechanical calculation of the number of microstates in an ideal gas, I will get some very accurate equation, and then if I take the limit as T approaches infinity, I will get the Sackur-Tetrode equation, which will fail when T is not large. Maybe that is the answer.
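A sketch of that limit, using the usual canonical-ensemble machinery: the single-particle partition function [itex]Z_1=\sum e^{-E/kT}[/itex] over the box levels goes to [itex]V(2\pi mkT)^{3/2}/h^3[/itex] at high temperature, and with [itex]Z=Z_1^N/N![/itex] for indistinguishable particles, Stirling's approximation gives

[tex]S=k\log Z+\frac{U}{T}=kN\left[\log\left(\frac{V(2\pi mkT)^{3/2}}{Nh^3}\right)+\frac{5}{2}\right][/tex]

which is exactly the Sackur-Tetrode form; the step that fails at low temperature is replacing the sum over levels by an integral (and ignoring multiple occupancy of states).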
 
  • #9
Rap said:
I think entropy is defined in statistical mechanics unambiguously as k times the number of microstates that could give rise to the thermodynamic macrostate.

It is the logarithm of the number of microstates. This distinction is very important, as it makes entropy a measure of information. Specifically, a measure of the missing information after macroscopic variables have been taken into account. Also, it is far from unambiguous, as it depends on your definition of microstate (let's not even get into that).

Here's the reason why the Sackur-Tetrode equation gives negative infinity at very low temperatures.

The Sackur-Tetrode equation gives the level of entropy after macroscopic properties, such as temperature, position, and velocity, have been taken into account to infinite precision. At high temperatures, this is a fair assumption actually, and it works. However, at very low temperatures, what happens is that quantum effects kick in and prevent you from being able to measure the macroscopic position and temperature simultaneously with high accuracy (uncertainty principle). In other words, an accurate measurement of those quantities would constitute having more information than is even present in the system, giving negative entropy. Another way of saying this is that the number of microstates is less than the number of macrostates (which is absurd of course, but is due to our definition of macrostate).
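Quantitatively, from the equation in the first post, S reaches zero when

[tex]\log\left(\frac{V(2\pi mkT)^{3/2}}{Nh^3}\right)=-\frac{5}{2}[/tex]

i.e. when [itex]N\lambda^3/V\approx e^{5/2}\approx 12[/itex] with [itex]\lambda=h/\sqrt{2\pi mkT}[/itex], and it goes negative at lower temperatures or higher densities.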

So the root of the problem is not in the mathematics, it is in the definition of entropy, which has to be modified in order to account for the quantum world.

You will see that this answers your question as to whether we can use the Sackur-Tetrode entropy to calculate h.
 
  • #10
Rap said:
I had not been thinking about anything but a single-species gas and the Sackur-Tetrode equation, but you have introduced multiple types; I will have to think about that. I don't immediately see how the h in the Sackur-Tetrode equation relates to the h for ionizing atoms.
It seems to me the purpose of entropy is to characterize the propensity for the particles to be in the kinds of states associated with that entropy. So it is important, when considering the entropy of an ideal gas, to ask what entropy those particles could have in some other state, like if the ideal gas is made of electrons and protons, it could also be in the state of neutral hydrogen. The entropy of free electrons and protons is a crucial factor in determining when hydrogen ionizes, and the value of h is important there. I'm not saying this always makes h important, it was merely one example. If all your gas can do is be in a free state with definite number of particles N, and all you want to know is how its entropy depends on density and T (say for adiabatic expansion problems), then yes, h would be superfluous in that situation, which is why that situation was understood by Boltzmann prior to Planck. But Boltzmann had no way to know when H would ionize, because that does depend on h.
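To make that concrete, the ionization balance [itex]\mathrm{H}\rightleftharpoons p+e[/itex] is governed by a Saha-type relation, which (dropping the spin and degeneracy factors) reads

[tex]\frac{n_e n_p}{n_H}=\frac{(2\pi m_e kT)^{3/2}}{h^3}\,e^{-\chi/kT}[/tex]

where [itex]\chi[/itex] is the ionization energy; the same [itex]h^3[/itex] that sits inside the Sackur-Tetrode logarithm shows up here as an absolute density scale.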
Rap said:
I think another way of asking my question is "can you (in principle) measure h thermodynamically, by thermodynamic transformations and measurements of a massive ideal gas confined to a temperature regime far from quantum effects (e.g. room temperature)?" I can't think of a way, hence my question. Quantum effects occur when that term in the logarithm is of order 1 or less; that is when the Sackur-Tetrode equation screws up. Maybe it's just a handy way of telling you when things screw up?
It's not just when things screw up. In phase space, h characterizes the volume of each state, so is crucial for counting states. It tells you (or [itex]h^3[/itex] does) the ratio of the phase space volume to the number of states, so if you have an application where you don't need the number of states, only the ratio of phase space volumes, then the phase space volumes suffice and you don't need h. Boltzmann knew the phase-space volumes, so he could do that. But if the ratio of states involves some kind of change in the gas such that it's not just a ratio of phase space volumes any more, then you need to count the actual states. You need h for that, even in situations very far from conditions where every state is occupied (and the ideal gas assumptions screw up).
 
  • #11
Jano L. said:
In order to measure entropy absolutely, we have to first define it absolutely.

In thermodynamics, only differences of entropy between two states are measurable and defined.

But there are states whose absolute entropy you know to be zero, namely the states at T=0. And the entropy difference between any other state and these states gives the absolute value of entropy at any temperature.
 
  • #12
Right, and you need absolute entropy when you can count the states of the competing class of configurations that you want to use entropy to assess. The OP question relates to situations where you can make entropy comparisons without counting states because you have some other handle on relative entropy, like phase-space volume or some such thing, but then you have to compare apples to other apples. If you can count states, you have absolute entropy, and can assess the likelihood of all kinds of different situations.
 
  • #13
DrDu, how do you "know" in thermodynamics that there are states that have zero entropy?

It is known that in thermodynamics, the entropy of a state [itex]b[/itex] is defined as an integral of dQ/T over any path [itex]\Gamma_{a,b}[/itex] connecting the two points in the state space of the system:

[tex]
S(b) = S(a) + \int_{\Gamma_{a,b}} \frac{dQ}{T}
[/tex]

You say that for a, the state with T = 0, it has to be S(a) = 0.

But experimentally, only the integral is measured. Both entropies are arbitrary; only their difference is absolute. It is the same as with electric potential: there is no one "correct" electric potential anywhere. Only differences between two points are absolute.

Ken G, you are right if you talk about discrete models and Boltzmann-Planck definition S = k ln W, where W is the number of microstates corresponding to the macrostate. Then for the non-degenerate ground state ln 1 = 0. But these are only toy models with discrete definition of entropy. There exist frustrated systems and other entropies. These need not give S = 0.

Also, there is no telling whether a system which seems to be very close to T = 0 is indeed in the lowest minimum of energy. Consider a mixture of ortho- and para-hydrogen with no catalyst to provide for their interconversion. The liquid may be close to T = 0 and you would assign S = 0 in a model that neglects the magnetic moments. However, it turns out that over a few days additional heat is extracted from the system. Then the entropy will decrease, but it was already close to 0, so now it has to be negative.

Insisting that the state _we know_ is the final ground state seems very dangerous. There may always be another state with lower energy, because of so-far undiscovered constraints (like the absence of a magnetic catalyst above) that prevent the system from losing more energy.

If we do not define S(a) absolutely but leave it undetermined and keep only differences, no problem will arise.
 
  • #14
Jano L. said:
Ken G, you are right if you talk about discrete models and Boltzmann-Planck definition S = k ln W, where W is the number of microstates corresponding to the macrostate. Then for the non-degenerate ground state ln 1 = 0. But these are only toy models with discrete definition of entropy. There exist frustrated systems and other entropies. These need not give S = 0.
I'm not sure that we are disagreeing, or with DrDu-- entropy is a rather general concept, and comes in many guises. Sometimes it is possible to talk about it in an absolute way, which tends to be the more statistical-mechanical case because we can actually count states; other times we are stuck (as Boltzmann was) with satisfying ourselves with relative entropies. Thermodynamic entropy, as you say, is an example of the latter, but Sackur-Tetrode is an example of the former.
Jano L. said:
Insisting that the state _we know_ is the final ground state seems very dangerous. There may always be another state with lower energy, because of so-far undiscovered constraints (like the absence of a magnetic catalyst above) that prevent the system from losing more energy.
Actually, DrDu didn't insist on knowing the ground state, he insisted that the system was at T=0, so it must reach its ground state whatever that is. The way we know a system is at T=0 is by placing it in contact with another system known to be at T=0. If we cannot know any system is at T=0, then you are right that we cannot have absolute entropy, but that doesn't contradict DrDu-- he is merely acting under the assumption that T=0 has a currently accepted meaning (even if it turns out later to be wrong somehow). Even relative entropies could turn out to be wrong if we don't have our T scale right.

Jano L. said:
If we do not define S(a) absolutely but leave it undetermined and keep only differences, no problem will arise.
But we still need a reliable T scale if we want to define entropy thermodynamically by tracking the heat.
 
  • #15
Jano L. said:
You say that for a, the state with T = 0, it has to be S(a) = 0.
The point is that S(a) is independent of a (e.g. of volume) as long as T=0. So yes, there is a constant for every chemical element (or conserved charge, if you allow for nuclear reactions), but this constant is completely arbitrary and is therefore set to 0 in the third law. Its value is also arbitrary in statistical mechanics (all the defining relations for statistical entropy allow for an additional constant).
For the same reason I also don't agree that we can leave S(a) open in cases where we aren't sure whether we have reached the true ground state. Compare e.g. hydrogen, oxygen and water. I can obtain the entropy of hydrogen from that of water and oxygen, given that I know the reaction entropy at some temperature (which can easily be inferred from the temperature dependence of the equilibrium constant via van 't Hoff's law). Obviously it has to be compatible with the one determined from measurements on elemental hydrogen. So there is no room for an unknown constant.
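Spelled out: from [itex]\Delta G^\circ=-RT\log K[/itex] and van 't Hoff's [itex]d\log K/dT=\Delta H^\circ/RT^2[/itex],

[tex]\Delta S^\circ=-\frac{\partial\Delta G^\circ}{\partial T}=R\log K+\frac{\Delta H^\circ}{T}[/tex]

so the reaction entropy follows from equilibrium measurements alone.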
 
  • #16
Ken G said:
It seems to me the purpose of entropy is to characterize the propensity for the particles to be in the kinds of states associated with that entropy. So it is important, when considering the entropy of an ideal gas, to ask what entropy those particles could have in some other state, like if the ideal gas is made of electrons and protons, it could also be in the state of neutral hydrogen. The entropy of free electrons and protons is a crucial factor in determining when hydrogen ionizes, and the value of h is important there. I'm not saying this always makes h important, it was merely one example. If all your gas can do is be in a free state with definite number of particles N, and all you want to know is how its entropy depends on density and T (say for adiabatic expansion problems), then yes, h would be superfluous in that situation, which is why that situation was understood by Boltzmann prior to Planck. But Boltzmann had no way to know when H would ionize, because that does depend on h.
It's not just when things screw up. In phase space, h characterizes the volume of each state, so is crucial for counting states. It tells you (or [itex]h^3[/itex] does) the ratio of the phase space volume to the number of states, so if you have an application where you don't need the number of states, only the ratio of phase space volumes, then the phase space volumes suffice and you don't need h. Boltzmann knew the phase-space volumes, so he could do that. But if the ratio of states involves some kind of change in the gas such that it's not just a ratio of phase space volumes any more, then you need to count the actual states. You need h for that, even in situations very far from conditions where every state is occupied (and the ideal gas assumptions screw up).

Ok, then the bottom line is that you don't really need h in the ST equation for a single-species gas, but when dealing with reacting multi-species gases, you will need it. That means that h can be measured thermodynamically, by carrying out transformations and measurements on the system, even far from the critical temperature where things go awry.

That would also mean that when deriving the ST equation for a single-species gas, any argument that introduces h will be somewhat hand-waving. The reason I ask this is because I am trying to derive the ST equation for a single-species gas. Every derivation I have seen seems to make sense until the h term is introduced, and then it starts to look very hand-wavy to me. Things like "we need a term in the denominator with dimensions of action that is, like, graininess in phase space, so, of course, h is the number we are looking for". Ok, but why not [itex]\hbar/2[/itex], since [itex]\Delta x \Delta p\ge \hbar/2[/itex]? And the equality only holds for Gaussians, and the wave function in a box is not Gaussian, so why not something a little larger?

I will have to look into multi-species gases to get a real understanding, I guess.
 
  • #17
I don't think I've ever seen an argument for why the phase-space volume of a state is exactly h-cubed, it does seem rather hand-wavy. I'll accept that more rigorous demonstrations are possible, or that experiment has verified this (say in hydrogen ionization situations), but I haven't seen a formal argument either. It doesn't matter if the actual wavefunctions are Gaussian or not, because the particles are indistinguishable so there really aren't any "particle wavefunctions" anyway, that's all just a kind of fiction that gets the right answer. A correct derivation must write the full wavefunction of the entire ensemble, and I'll bet all that is going to matter is N/V and T, nothing about the individual wavefunctions because those don't exist anyway, not in a gas in equilibrium. Or put differently, if the gas is in equilibrium, it will have reached the Heisenberg uncertainty limit, since that is a maximum entropy situation.
 
  • #18
Ken G said:
I don't think I've ever seen an argument for why the phase-space volume of a state is exactly h-cubed, it does seem rather hand-wavy.

There exists a semi-classical expansion of the density matrix (the Wigner-Weyl formalism) which allows one to justify this. See, e.g., Franz Schwabl, Statistical Mechanics.
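A more elementary counting sketch, using nothing but the particle-in-a-box levels: in a hard-wall cube of side L the allowed momenta per axis are [itex]p_i=n_ih/2L[/itex] with [itex]n_i=1,2,\dots[/itex], so the number of states with [itex]|p|<p[/itex] is asymptotically

[tex]\frac{1}{8}\cdot\frac{4\pi}{3}\left(\frac{2Lp}{h}\right)^3=\frac{4\pi}{3}\,\frac{Vp^3}{h^3}[/tex]

which is the classical phase-space volume [itex]\frac{4}{3}\pi p^3 V[/itex] divided by exactly [itex]h^3[/itex], not by [itex](\hbar/2)^3[/itex] or anything else.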
 
  • #19
Ken G said:
I don't think I've ever seen an argument for why the phase-space volume of a state is exactly h-cubed, it does seem rather hand-wavy. I'll accept that more rigorous demonstrations are possible, or that experiment has verified this (say in hydrogen ionization situations), but I haven't seen a formal argument either. It doesn't matter if the actual wavefunctions are Gaussian or not, because the particles are indistinguishable so there really aren't any "particle wavefunctions" anyway, that's all just a kind of fiction that gets the right answer. A correct derivation must write the full wavefunction of the entire ensemble, and I'll bet all that is going to matter is N/V and T, nothing about the individual wavefunctions because those don't exist anyway, not in a gas in equilibrium. Or put differently, if the gas is in equilibrium, it will have reached the Heisenberg uncertainty limit, since that is a maximum entropy situation.

I think that talking about particle wavefunctions in an equilibrium gas is the same kind of fiction as talking about the position and momentum of a particle in a classical gas. The full wavefunction of the entire ensemble is, I think, the sum of the individual particle wavefunctions times the Boltzmann factor, because the particles in an ideal gas are non-interacting, and the set of wavefunctions are orthogonal. For example, I think that makes the probability density for a particle's position in an equilibrium ideal gas along one axis in a cube of side L equal to (without normalization):

[tex]F(x)\sim\sum_{n=1}^\infty e^{-n^2\epsilon_0/kT}\,|\psi_n(x)|^2[/tex]

where [itex]\epsilon_0=h^2/8mL^2[/itex] is the ground state energy and

[tex]\psi_n(x) \sim \sin\left(n\pi x/L\right)[/tex]

is the wave function of a particle with fixed energy in the cube. When the Boltzmann factor remains appreciable out to large n, you get a uniform distribution F(x)=1/L. I'm wondering how far I can go with this until I hit a brick wall, or derive the ST equation.
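One possible next step with these same levels (valid when [itex]kT\gg\epsilon_0[/itex], so the sum can be replaced by an integral): the single-particle partition function per axis is

[tex]\sum_{n=1}^\infty e^{-n^2\epsilon_0/kT}\approx\int_0^\infty e^{-n^2\epsilon_0/kT}\,dn=\frac{1}{2}\sqrt{\frac{\pi kT}{\epsilon_0}}=\frac{L\sqrt{2\pi mkT}}{h}[/tex]

so the three axes together give [itex]V(2\pi mkT)^{3/2}/h^3[/itex], which is exactly the factor inside the Sackur-Tetrode logarithm; the remaining N-dependence comes from dividing by N! for indistinguishable particles.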
 

1. What is the Sackur-Tetrode equation and how is it related to Planck's constant?

The Sackur-Tetrode equation is a formula for the absolute entropy of a monatomic ideal gas in terms of its volume, particle number, and temperature. It is related to Planck's constant because counting the microstates of the gas requires quantum mechanics, in which h sets the scale of a single state in phase space.

2. How does the Sackur-Tetrode equation support the idea of discrete energy levels in atoms?

The Sackur-Tetrode equation is derived from statistical mechanics, which is based on the idea that particles at the atomic level have discrete energy levels. This equation takes into account the number of energy levels available to the particles, which supports the idea of discreteness.

3. Can the Sackur-Tetrode equation be used to calculate the entropy of non-ideal gases?

No, the Sackur-Tetrode equation is only applicable to ideal gases, which are assumed to have no intermolecular forces and occupy a large volume compared to the size of the molecules. In non-ideal gases, these assumptions do not hold, and the equation cannot accurately calculate the entropy.

4. What is the significance of Planck's constant in the Sackur-Tetrode equation?

Planck's constant, denoted by h, is a fundamental constant in quantum mechanics that relates the energy of a particle to its frequency. It appears in the Sackur-Tetrode equation as a way to incorporate the concept of discreteness and quantum mechanics into the calculation of entropy for ideal gases.

5. How does the Sackur-Tetrode equation provide insight into the behavior of gases at the atomic level?

The Sackur-Tetrode equation counts the number of quantum states available to the particles in an ideal gas (the particles themselves are assumed not to interact). This provides insight into how particles behave at the atomic level, and how the available states contribute to the overall entropy of the system.
