sam_bell
I was reading about the thermodynamics of the free-electron gas last night, and my mind veered to fundamental concepts of statistical mechanics. I was able to reorganize my knowledge in a way that made everything clearer. In this picture, energy plays no more privileged a role than any other conserved quantity. I got excited and thought I would share. These ideas are an extension of what is in Kittel & Kroemer.
The fundamental principle of statistical mechanics is that all microscopic states of an isolated system are equally likely. A macroscopic state of a system is one which fully characterizes the system at equilibrium, without specifying the exact configuration of all the microscopic degrees of freedom. One of the macroscopic states will have a very large number of corresponding microscopic states, far more than any other macroscopic state. By the fundamental postulate, this is the macroscopic state that will be observed. It has the largest entropy S, as measured by S = log(# of microscopic states). It is from this consideration that we derive the basic laws of statistical mechanics.
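A quick numerical sketch of this point (a toy model of my own, not from Kittel & Kroemer): take N two-state spins, let the macrostate be the number n of up-spins, and count the microstates g(n) = C(N,n) behind each macrostate. Even at N = 100 the central macrostate already dominates.

```python
from math import comb, log

# Toy model: N two-state spins; a "macroscopic state" is the number n of up-spins.
# The multiplicity g(n) = C(N, n) counts the microscopic states behind each macrostate.
N = 100
g = [comb(N, n) for n in range(N + 1)]

# The macrostate n = N/2 holds the largest share of all 2^N microstates.
n_star = max(range(N + 1), key=lambda n: g[n])
print(n_star)                    # 50
print(g[n_star] / sum(g))        # ~0.08, vs 1/101 ~ 0.01 for a uniform spread over macrostates

# Entropy of a macrostate: S = log(# of microscopic states).
# The observed macrostate is exactly the one that maximizes S.
S = [log(gn) for gn in g]
print(max(S) == log(g[n_star]))  # True
```

The dominance sharpens rapidly with N; for macroscopic N (~10^23) the peak is so narrow that fluctuations away from the maximum-entropy macrostate are unobservable.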
The first law we are familiar with is that two systems in contact must have the same temperature. If they don't, energy will flow until the temperatures are equal. We can extend this concept to any conserved quantity, such as particle number N, volume V, momentum Q, and angular momentum L. Consider two systems, denoted '1' and '2'. The two systems may exchange particles, volume, energy, and momentum, so long as dN2 = -dN1, dV2 = -dV1, ..., dL2 = -dL1. The combined entropy is S = S1 + S2 = S1(N1,V1,U1,Q1,L1) + S2(N2,V2,U2,Q2,L2). We are looking for the exchange of particles, energy, etc. that makes the combined entropy a maximum. This happens when the total differential dS vanishes, i.e.
0 = dS
= (dS1/dN1)dN1 + (dS1/dV1)dV1 + (dS1/dU1)dU1 + dot((dS1/dQ1),dQ1) + dot((dS1/dL1),dL1) + (dS2/dN2)dN2 + (dS2/dV2)dV2 + (dS2/dU2)dU2 + dot((dS2/dQ2),dQ2) + dot((dS2/dL2),dL2)
= [(dS1/dN1)-(dS2/dN2)]dN1 + [(dS1/dV1)-(dS2/dV2)]dV1 + [(dS1/dU1)-(dS2/dU2)]dU1 + dot([(dS1/dQ1)-(dS2/dQ2)],dQ1) + dot([(dS1/dL1)-(dS2/dL2)],dL1) .
The 'dot' product is present because Q and L are vector quantities. For convenience, we define A = (dS/dN)_V,U,Q,L; B = (dS/dV)_N,U,Q,L; C = (dS/dU)_N,V,Q,L; D = (dS/dQ)_N,V,U,L; and E = (dS/dL)_N,V,U,Q. Then maximum entropy requires A1 = A2, ..., E1 = E2. If we look up the definitions of chemical potential, pressure, and temperature, we find A = -mu/T, B = P/T, and C = 1/T. The conditions A1 = A2, B1 = B2, and C1 = C2 then translate into these quantities being equal at equilibrium. The vector quantities D and E are momentum potentials: if they are not equal, momentum will flow. Now consider the limit in which the second system is so large that the transfer of particles, volume, energy, and momentum has virtually no effect on its macroscopic state. It acts as a reservoir with fixed A2, B2, C2, D2, and E2, forcing the first system to come to equilibrium at a fixed temperature, chemical potential, pressure, etc. For terrestrial thermodynamics, we take D and E fixed to zero; in a spinning neutron star, though, E would be non-zero.
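The equilibrium condition C1 = C2 can be checked numerically. Here is a minimal sketch for two monatomic ideal gases exchanging only energy, using the toy entropy S(N,U) = (3N/2) ln U (constants dropped, k_B = 1; the particle numbers and total energy are made-up parameters). Maximizing S1 + S2 over the energy split should equalize T = 2U/(3N).

```python
import numpy as np

# Two monatomic ideal gases exchanging only energy; U0 = U1 + U2 is fixed.
# Toy entropy up to additive constants: S(N, U) = (3N/2) * ln(U).
N1, N2, U0 = 100.0, 300.0, 4.0

# Scan all ways of splitting the total energy between the two systems.
U1 = np.linspace(0.01, U0 - 0.01, 100_000)
S_total = 1.5 * N1 * np.log(U1) + 1.5 * N2 * np.log(U0 - U1)

# Maximum entropy predicts C1 = C2, i.e. 1/T1 = 1/T2 with T = 2U/(3N),
# so U1/N1 = U2/N2  =>  U1* = U0 * N1/(N1+N2) = 1.0 here.
U1_star = U1[np.argmax(S_total)]
print(U1_star)               # ~1.0

T1 = 2 * U1_star / (3 * N1)
T2 = 2 * (U0 - U1_star) / (3 * N2)
print(abs(T1 - T2) < 1e-4)   # True: temperatures match at the entropy maximum
```

The same scan with S depending on V or N instead of U would equalize P/T or mu/T, by the identical argument.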
The second law we are familiar with is the Boltzmann probability distribution. We can derive a generalized Boltzmann law in the presence of exchanges other than energy, such as momentum and volume. Again we consider the system ('1') in contact with a reservoir ('2'). The combined system conserves N0 = N1 + N2, V0 = V1 + V2, ..., and L0 = L1 + L2. Because the reservoir is very large, we have N1 << N2, V1 << V2, ..., L1 << L2. By the fundamental postulate, the likelihood that the first system is in a given microscopic state 'i' is proportional to the number of states available to the reservoir. This is given by
# states = exp[S2(N2,V2,U2,Q2,L2)] = exp[ S2(N0-N1,V0-V1,U0-U1,Q0-Q1,L0-L1) ]
≈ exp[ S2(N0,V0,U0,Q0,L0) - (dS2/dN2) N1 - (dS2/dV2) V1 - (dS2/dU2) U1 - dot(dS2/dQ2,Q1) - dot(dS2/dL2,L1) ] (first-order Taylor expansion in the small quantities)
= exp[ S2(N0,V0,U0,Q0,L0) - A N1 - B V1 - C U1 - dot(D,Q1) - dot(E,L1) ]
= exp[ S2(N0,V0,U0,Q0,L0) + mu/T N1 - P/T V1 - 1/T U1 - dot(D,Q1) - dot(E,L1) ] .
Because we are looking for a proportionality, we can drop the fixed constant exp[ S2(N0,V0,U0,Q0,L0) ]. The general probability is then (omitting the label '1')
prob(i) = exp[ -( U_i + P*V_i - mu*N_i )/T - dot(D,Q_i) - dot(E,L_i) ]/Z ,
where Z is a normalization constant, and an explicit dependence of the macroscopic variables on the microscopic state 'i' has been included. Taking D = E = 0, and constraining the available microscopic states to V_i = V, reduces this to the canonical Boltzmann distribution.
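To see what a non-zero D does, here is a small sketch for a single free particle on a 1-D momentum grid (m = 1, k_B = 1; the grid and parameter values are illustrative choices of mine). With N_i and V_i fixed, the mu and P terms drop out and the weight is exp[-U_i/T - D*Q_i].

```python
import numpy as np

# Microstates labeled by momentum q on a 1-D grid; U_i = q^2/2 with m = 1.
q = np.linspace(-10, 10, 2001)
U = q**2 / 2

def probs(T, D):
    # Generalized Boltzmann weight: prob(i) ∝ exp[-U_i/T - D*q_i].
    w = np.exp(-U / T - D * q)
    return w / w.sum()          # Z = w.sum() normalizes

# D = 0 recovers the canonical (Maxwell) distribution: zero mean momentum.
p0 = probs(T=1.0, D=0.0)
print(abs((p0 * q).sum()) < 1e-10)   # True

# A non-zero momentum potential D biases the system toward a drift <q> = -D*T:
# the analogue of equilibrating with a reservoir in uniform motion.
p1 = probs(T=1.0, D=0.5)
print((p1 * q).sum())                # ~ -0.5
```

Completing the square in the exponent shows why: exp[-q^2/(2T) - Dq] is a Gaussian centered at -DT, so D plays the role of (minus) a drift velocity over temperature, just as E would encode a rotation rate for angular momentum.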
I hope that helps make statistical mechanics clearer for you, as it did for me.
Sam Bell