Deriving entropy change equation from Boltzmann's principle

In summary: starting from Boltzmann's principle S = k ln W with W = T^{C/k} V^N, the entropy is S = C ln T + kN ln V, so the change in entropy with respect to changes in V and T is ##dS = kN\frac{dV}{V} + \frac{C\,dT}{T}##.
  • #1
tombrown9g2

Homework Statement



Using Boltzmann's principle (S = k ln W), show that, with respect to changes in V and T:

[itex]dS = kN\,\frac{dV}{V} + \frac{C\,dT}{V}\,T[/itex]

Where [itex]W = T^{C/k}\,V^{N}[/itex]

The Attempt at a Solution



[itex]S = k\ln\left(T^{C/k}V^{N}\right) = k\ln T^{C/k} + k\ln V^{N}[/itex]

[itex]S = C\ln T + N\ln V[/itex]

Now I know that the derivative of ln V is 1/V and of ln T is 1/T, but I'm unsure how to do the final step needed to get to that equation. My maths is a bit rusty; I've tried taking partial derivatives but couldn't get to the right answer.
 
  • #2
[itex]S = C\ln T + k\,N\ln V[/itex] (don't forget the k). So you already have the first term if you fill in ##d(\ln V) = {dV\over V}##. The other term looks a bit strange: C has the same dimension as k and S, so the rest of that term should be dimensionless like N ln V, not have a dimension of K²/m³?
 
  • #3
BvU said:
[itex]S = C\ln T + k\,N\ln V[/itex] (don't forget the k). So you already have the first term if you fill in ##d(\ln V) = {dV\over V}##. The other term looks a bit strange: C has the same dimension as k and S, so the rest of that term should be dimensionless like N ln V, not have a dimension of K²/m³?

Ahh yes, forgot the k. I probably should have mentioned that C is the constant-volume heat capacity, k Boltzmann's constant and N the number of particles of ideal gas. I haven't really looked at the units of the terms, but surely if you differentiate then you have to differentiate with respect to something? i.e. to get to ##d(\ln V) = {dV\over V}##, surely you've got to differentiate with respect to something first?
 
  • #4
Differentials are limits of small differences. I am speaking physics here; if you want the maths side, it all becomes a lot more verbose. For daily use, d(f(x)) = f'(x) dx is a way of avoiding having to continuously state that it really is a limit:$$f'(x) \equiv \lim_{h \downarrow 0} {f(x+h) - f(x) \over h}\ \Rightarrow\ \lim_{\Delta x \downarrow 0} \left[f(x+\Delta x) - f(x)\right] = f'(x)\ \lim_{\Delta x \downarrow 0} \Delta x\ \Rightarrow\ d\left(f(x)\right) = f'(x)\,dx$$ So: no, differentiating with respect to something first is not the idea. If you want to differentiate with respect to something (not x, but, say, ##\alpha##), you simply write ##{d(\ln V)\over d\alpha} = {1\over V}{dV\over d\alpha}##, a haphazard way of using the chain rule.

But you can only add or subtract things with the same dimension, and you can only take logarithms of pure numbers, not of e.g. volumes.
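
For anyone who wants to check this step symbolically, here is a minimal SymPy sketch, assuming C, k and N are constants and using the W given in the problem statement:

[code=python]
# Symbolic check of S = k*ln(W) with W = T**(C/k) * V**N,
# treating C, k and N as constants.
import sympy as sp

T, V, C, k, N = sp.symbols('T V C k N', positive=True)

S = k * sp.log(T**(C/k) * V**N)

# Expand the logarithm: S = C*ln(T) + k*N*ln(V)
print(sp.expand(sp.expand_log(S)))

# Coefficients of dT and dV in dS = (dS/dT) dT + (dS/dV) dV
print(sp.simplify(sp.diff(S, T)))   # C/T
print(sp.simplify(sp.diff(S, V)))   # N*k/V
[/code]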
 
  • #5
BvU said:
Differentials are limits of small differences. I am speaking physics here; if you want the maths side, it all becomes a lot more verbose. For daily use, d(f(x)) = f'(x) dx is a way of avoiding having to continuously state that it really is a limit:$$f'(x) \equiv \lim_{h \downarrow 0} {f(x+h) - f(x) \over h}\ \Rightarrow\ \lim_{\Delta x \downarrow 0} \left[f(x+\Delta x) - f(x)\right] = f'(x)\ \lim_{\Delta x \downarrow 0} \Delta x\ \Rightarrow\ d\left(f(x)\right) = f'(x)\,dx$$ So: no, differentiating with respect to something first is not the idea. If you want to differentiate with respect to something (not x, but, say, ##\alpha##), you simply write ##{d(\ln V)\over d\alpha} = {1\over V}{dV\over d\alpha}##, a haphazard way of using the chain rule.

But you can only add or subtract things with the same dimension, and you can only take logarithms of pure numbers, not of e.g. volumes.

Ahh thanks, I see now! So maybe another approach is more appropriate; any ideas? Because I really don't have any.
 
  • #7
Yes, you are right. I have to do some reading up to find out. Originally Boltzmann's W (##\Omega##) was a number, so dimensionless, and you could take the logarithm. But the given W somehow got a complicated dimension.

Anyway, the kN dV/V term can be seen to appear, and it has the dimension of k. But now what about the temperature term? C dT/T is not what the original wording of the exercise asks us to find, right? C has the right dimension (same as k) and dT/T is dimensionless, so that would be OK.

But C dT/V times T does not have the right dimension, and I don't see a way to force it to have the dimension of k, so I'm stuck there...

In the meantime I find dS/dT = C/T in several places, so I am inclined to think this C dT/V times T isn't right.
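
For completeness, a sketch of the derivation under that reading (i.e. taking the temperature term to be C dT/T, consistent with dS/dT = C/T):

$$S = k\ln W = k\ln\left(T^{C/k}V^{N}\right) = C\ln T + kN\ln V$$
$$dS = \left(\frac{\partial S}{\partial T}\right)_{V} dT + \left(\frac{\partial S}{\partial V}\right)_{T} dV = \frac{C}{T}\,dT + kN\,\frac{dV}{V}$$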
 

FAQ: Deriving entropy change equation from Boltzmann's principle

1. What is Boltzmann's principle?

Boltzmann's principle is a fundamental result in statistical mechanics that relates the behavior of the individual particles in a system to its macroscopic properties: the entropy of a macrostate is S = k ln W, where W is the number of microstates consistent with that macrostate. The most probable state of a system is the one with the largest number of microstates, i.e. the highest entropy.

2. How is entropy defined?

Entropy is a measure of the disorder or randomness of a system. In thermodynamics, its change is defined as the reversible heat transfer divided by the temperature at which it occurs, dS = δQ_rev/T; informally, it measures how much of a system's energy is unavailable for doing work.

3. How do you derive the entropy change equation from Boltzmann's principle?

The entropy change equation is derived by applying Boltzmann's principle S = k ln W to a system whose number of accessible microstates W depends on the macroscopic variables (such as V and T), and then differentiating. The change in entropy is thereby expressed in terms of the change in the number of microstates, or equivalently in the probabilities of the individual microstates.
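
As a compact illustration (a sketch, using the W from the thread above and treating C, k and N as constants):

$$dS = k\,d(\ln W); \qquad W = T^{C/k}V^{N} \;\Rightarrow\; dS = \frac{C}{T}\,dT + kN\,\frac{dV}{V}$$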

4. What factors affect the entropy change in a system?

The entropy change in a system is affected by several factors, including the number of particles, the volume, the energy, and the temperature. For a gas, entropy increases with an increase in the number of particles, the volume, or the temperature, as the dV/V and dT/T terms derived above show.

5. How is the entropy change equation used in practical applications?

The entropy change equation is used in various fields, such as thermodynamics, chemistry, and information theory, to calculate the change in entropy of a system. It is also used to determine the efficiency of a process and to understand the behavior of systems at the microscopic level.
