
What does the relation for temperature dE/dS=T mean physically?

by Polyrhythmic
Polyrhythmic
#1
Apr1-12, 06:46 PM
P: 342
[itex]\frac{dS}{dE}=\frac{1}{T(E,...)}\,[/itex] states that temperature is a measure of the increase in entropy when some energy is added to a system. Is there a physical interpretation of this quantity other than the above?

The only reason I can think of is: by defining [itex]S=k_B \log \Omega[/itex], it just works out, for an ideal gas as well as for other systems (that I can't come up with right now).
Jolb
#2
Apr1-12, 08:50 PM
P: 419
dS/dE=1/T is not correct if you mean a total derivative. Generally,

[tex]dE = T\,dS + \mathbf{J}\,d\mathbf{x} + \boldsymbol{\mu}\,d\mathbf{N}[/tex]

[where the bold symbols are column matrices: [itex]\mathbf{J}[/itex] and [itex]\mathbf{x}[/itex] are the generalized forces and displacements (respectively); [itex]\boldsymbol{\mu}[/itex] and [itex]\mathbf{N}[/itex] are the various chemical potentials and the numbers of molecules of each chemical (respectively). Two vectors written next to each other denote the dot product.]

So really the equation you're wondering about is

[tex]T = \frac{\partial E}{\partial S}\bigg|_{\mathbf{x, N}} [/tex]
or
[tex]\frac{1}{T} = \frac{\partial S}{\partial E}\bigg|_{\mathbf{x, N}} [/tex]
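As a quick numerical sanity check of that second expression (a sketch in Python, not from this thread; it assumes the standard Sackur-Tetrode entropy of a monatomic ideal gas, with illustrative values for the atomic mass, N and V), one can finite-difference S(E) at fixed V and N and compare against 1/T obtained from E = (3/2)NkT:

[code]
# Sketch (assumed model: monatomic ideal gas with the Sackur-Tetrode entropy).
# Checks numerically that (dS/dE) at fixed V, N equals 1/T, where E = (3/2) N k T.
from math import log, pi

k = 1.380649e-23     # Boltzmann constant [J/K]
h = 6.62607015e-34   # Planck constant [J s]
m = 6.6335209e-27    # mass of a helium atom [kg] (illustrative choice)
N = 1e23             # number of atoms (illustrative)
V = 1e-3             # volume [m^3] (illustrative)

def S(E):
    """Sackur-Tetrode entropy S(E, V, N) of a monatomic ideal gas."""
    return N * k * (log((V / N) * (4 * pi * m * E / (3 * N * h**2))**1.5) + 2.5)

T = 300.0            # pick a temperature [K]
E = 1.5 * N * k * T  # the corresponding internal energy
dE = 1e-6 * E        # small energy step for the finite difference

dSdE = (S(E + dE) - S(E - dE)) / (2 * dE)  # numerical (dS/dE) at fixed V, N
print(dSdE, 1 / T)   # the two numbers agree to the accuracy of the finite difference
[/code]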

Anyway, interpreting any equation involving entropy can be confusing. For many purposes you can think of S as "a measure of disorder."

What those equations mean kind of falls right out: temperature is given by the change in energy that results from an infinitesimal change in entropy divided by that change in entropy, keeping the number of particles and displacements (e.g. volume) constant. [Not very satisfying, right?]



One possible example to help make this intuitive uses the Third Law, which states that entropy approaches zero as temperature approaches zero. As such, we'll look at limiting cases.

As an example, let's look at a microcanonical ensemble of N "impurities," each with two energy levels (for example, N stationary spin-1/2 particles in a magnetic field). At a finite temperature, there will be a mixture of excited impurities and ground-state impurities, and their distribution will be given by the usual Boltzmann factor [itex]e^{-E/(k_B T)}[/itex]. So as temperature goes to ∞, half will be excited and half will be in the ground state, whereas as temperature goes to 0 they will all be in the ground state.
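To put numbers on those two limits, here is a small Python sketch (not part of the argument above; the level spacing eps = 1 and k = 1 are arbitrary illustrative units) of the excited-state fraction that follows from that Boltzmann factor:

[code]
# Sketch: excited-state fraction for independent two-level impurities,
# p_excited(T) = exp(-eps/(k*T)) / (1 + exp(-eps/(k*T))).
# Units chosen so that eps = 1 and k = 1 (purely illustrative).
from math import exp

eps = 1.0
k = 1.0

def p_excited(T):
    """Boltzmann probability that a single impurity sits in its excited level."""
    x = exp(-eps / (k * T))
    return x / (1.0 + x)

for T in [0.01, 0.1, 1.0, 10.0, 1000.0]:
    print(f"T = {T:8.2f}   excited fraction = {p_excited(T):.6f}")
# The fraction tends to 0 as T -> 0 and to 0.5 as T -> infinity, as stated above.
[/code]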

Holding x and N fixed, let's make a small (positive) change in the energy of this microcanonical ensemble and observe what happens to the entropy in two limiting cases, T→0 and T→∞. This corresponds to the expression for 1/T. To make this small positive change in energy, we would take a ground state impurity and flip it into the excited state. Also remember that we are considering "the thermodynamic limit" N→∞.

In the T→∞ limit
Roughly half of the impurities are excited and the other half are in ground state. Taking one impurity and exciting it wouldn't make that much of a change in how "disordered" the system is... If we looked at a picture of N/2 excited impurities and N/2 ground-state impurities, you probably couldn't tell it apart from a picture of N/2+1 excited impurities and N/2-1 ground-state impurities--they're equally disorderly.

This is the intuitive way of talking about entropy. In terms of microstates, there are roughly the same number of microstates with N/2 excited impurities as N/2+1 excited impurities.

So the change in "disorder" and hence the change in entropy for an infinitesimal change in energy is negligible in the limit T→∞, consistent with the equation you asked about.
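Just to see how negligible that change is, here is a small sketch (illustrative N; microstates counted as Ω(n) = C(N, n), entropy in units of k):

[code]
# Sketch: entropy change (in units of k) when one extra impurity is excited
# near the half-filled point, with S(n) = log C(N, n).
from math import comb, log

N = 10**4                                # illustrative number of impurities
S_half      = log(comb(N, N // 2))       # entropy with N/2 excited
S_half_plus = log(comb(N, N // 2 + 1))   # entropy with N/2 + 1 excited

print(S_half)                  # about N*log(2), i.e. roughly 6.9e3
print(S_half_plus - S_half)    # about -2e-4: essentially no change in "disorder"
[/code]

(The tiny difference even comes out slightly negative, which is the same negative-temperature technicality mentioned at the end of this post.)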

In the T→0 limit
All the impurities are in ground state. There is only one microstate such that all the particles are in the ground state! If we excite one impurity, there will be N possible microstates. Hence the change in entropy is on the order of log(N), which approaches ∞ in the thermodynamic limit. Intuitively, even for a huge number of particles, you'd easily be able to see the difference between a picture with N ground state impurities and 0 excited impurities and a picture with N-1 ground state impurities and 1 excited impurity. If all of them are ground state, it is much more orderly. So the change in entropy is huge for an infinitesimal change in energy in the limit T→0, again consistent with the equation you asked about.
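And the corresponding count in this limit (again just a sketch, with illustrative values of N and entropy in units of k):

[code]
# Sketch: entropy change (in units of k) when the first impurity is excited
# out of the all-ground-state configuration: Omega goes from 1 to N.
from math import comb, log

for N in [10**3, 10**6, 10**9, 10**23]:
    dS = log(comb(N, 1)) - log(comb(N, 0))   # = log(N) - log(1) = log(N)
    print(f"N = {N}   dS/k = {dS:.2f}")
# dS keeps growing with N, so dS/dE (and hence 1/T) blows up in this limit.
[/code]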


Similar arguments hold for intermediate values of T, and for more complicated/less idealized systems.
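For completeness, one more sketch (illustrative N, units with k = eps = 1, again not from the original posts) showing that at intermediate fillings the microcanonical [itex]1/T = \partial S/\partial E[/itex] reproduces the familiar Boltzmann occupation:

[code]
# Sketch: at intermediate fillings n, recover T from the microcanonical definition
# 1/T = dS/dE (with S(n) = log C(N, n), E = n*eps, k = 1) and check that the
# excited fraction n/N matches the canonical Boltzmann result 1/(1 + exp(eps/(k*T))).
from math import comb, log, exp

N = 10**4
eps = 1.0

def S(n):
    """Microcanonical entropy (in units of k) with n impurities excited."""
    return log(comb(N, n))

for n in [100, 1000, 2500, 4000, 4900]:
    inv_T = (S(n + 1) - S(n - 1)) / (2 * eps)       # centered finite difference dS/dE
    boltzmann_fraction = 1.0 / (1.0 + exp(eps * inv_T))
    print(f"n/N = {n/N:.4f}   canonical prediction = {boltzmann_fraction:.4f}")
# The two columns agree, so 1/T = dS/dE reproduces the usual Boltzmann
# distribution at intermediate temperatures as well.
[/code]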


(If you wanted to make my explanation a little more rigorous, you would want to avoid moving to the microstate with N/2+1 excited impurities, since going past half excited actually implies a negative value for T [headscratcher alert]. So in that case, instead of exciting one ground-state impurity, it would be better to relax one excited impurity--you'd get a minus sign on top and bottom of the partial derivative and everything else would still hold. Just a technical point.)

