Thermodynamics - calculate entropy

In summary: Third-law (absolute) entropies are obtained by integrating Cp/T from 0 K and adding the entropy change of each phase transition, but empirical Cp polynomials break down near absolute zero, so a low-temperature model such as the Debye heat-capacity law is needed for that part of the integral. Standard entropies of formation of elements are defined to be zero at standard conditions; compounds have non-zero entropies of formation, but these have been measured only for a limited set of cases. If you want a third-law entropy, you may need another method at the lowest temperatures.
  • #1
alpha_wolf
Hi.

I need to calculate the entropy of some material at a certain temperature, given Cp for each phase and the enthalpy change and temperature of each phase transition. I'm supposed to use thermodynamic considerations (i.e. the statistical definition of entropy is not applicable/allowed).

I know how to translate the enthalpy change of a transition into an entropy change, so I was thinking of doing an integral of (Cp/T)dT across each of the temperature ranges, and then summing up the integrals and the entropy changes of the phase transitions. The problem is that the first integral runs from 0 K to the first transition temperature, so it produces ln(0) as one of its terms. This is obviously not usable... I suppose this is because the polynomial expression for Cp breaks down at extremely low temperatures.
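Written out explicitly, the scheme is just (with $T_0 = 0$ K, $T_1,\dots,T_n$ the transition temperatures below the target temperature, the final upper limit equal to $T$ itself, and $C_p^{(k)}$ the heat capacity of the $k$-th phase):

$$S(T) = \sum_{k} \int_{T_{k-1}}^{T_k} \frac{C_p^{(k)}(T')}{T'}\,dT' \;+\; \sum_{\text{transitions}} \frac{\Delta H_{\text{trans}}}{T_{\text{trans}}},$$

and the trouble sits entirely in the lower limit of the first integral.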

How can I overcome the problem? Maybe assume the entropy change near absolute zero is negligible and integrate from 0.1 K instead of 0 K? The numbers don't quite agree with that assumption...
 
  • #2
Yeah --- if you're doing third-law entropies. Standard entropies give you a little wiggle room: standard entropies of formation of elements are defined to be zero at standard conditions; compounds have non-zero entropies of formation, but they have been measured only for a limited set of cases. Yours may be among them.

Third law? Have you run into the Einstein and Debye models for heat capacity near absolute zero? With one of those, there's no problem integrating from zero.
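For concreteness, in the Debye low-temperature limit the lattice heat capacity goes as $C_p \approx aT^3$ (a metal adds an electronic term $\gamma T$, which behaves the same way here), so

$$\int_0^{T_1} \frac{aT'^3}{T'}\,dT' = \frac{a\,T_1^3}{3}, \qquad \int_0^{T_1} \frac{\gamma T'}{T'}\,dT' = \gamma T_1,$$

both of which are finite: integrating from 0 K causes no trouble once $C_p$ vanishes fast enough there.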
 
  • #3
The material in my case is pure zinc. The question doesn't state anything about standard anything, and the chapter is about the third law, so I'm assuming they want third law entropy. Perhaps there's a way to combine third law and standard entropies somehow?

Cp for the first phase is given as A + BT, where A and B are constants. An integral of Cp/T thus gives A·ln(T) as one of its terms, and evaluating that at the lower limit T = 0 is the problem. I don't think we have encountered Einstein and Debye, but perhaps I just don't recognise the name of the model... Can you remind me please?
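To make the difficulty explicit, write the first integral with a nonzero lower limit $T_0$:

$$\int_{T_0}^{T_1} \frac{A + BT}{T}\,dT = A\ln\frac{T_1}{T_0} + B\,(T_1 - T_0) \;\longrightarrow\; \infty \quad \text{as } T_0 \to 0,$$

so the $A + BT$ fit cannot be carried all the way down to 0 K; some other form of $C_p(T)$ has to take over at the lowest temperatures.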
 
  • #4
Third law S: 0 at 0 K. Standard state: 0 at 298 K. No combination. Hit the library for Ch. 6 in Lewis & Randall, or beat up your text index for Debye, Born and von Karman, Dewar, Einstein, Dulong & Petit (the failure at low T), Nernst.

No stat allowed? This really gets into a gray area --- D. and B. & vK. are not exactly "classical" derivations of the functional form of heat capacity at low T.
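Putting the thread's recipe together as a minimal numerical sketch (Python with SciPy; every constant below is a placeholder chosen only so the script runs, not tabulated data for zinc, and the simple $aT^3$ stitching below T_LOW stands in for a proper Debye-type extrapolation fitted to low-temperature measurements):

```python
from scipy.integrate import quad

# Illustrative sketch only: all constants are placeholders, NOT data for zinc.
A, B = 22.4, 0.010               # hypothetical solid-phase fit Cp = A + B*T  [J/(mol K)]
T_LOW = 15.0                     # temperature below which the fit is not trusted [K]
T_FUS, DH_FUS = 692.7, 7300.0    # hypothetical fusion temperature [K] and enthalpy [J/mol]
CP_LIQ = 31.4                    # hypothetical constant Cp of the liquid [J/(mol K)]

def cp_solid(T):
    """Empirical heat-capacity fit, valid only above T_LOW."""
    return A + B * T

# Fix a Debye-like Cp = a*T**3 by matching the empirical fit at T_LOW,
# so the two expressions join continuously.
a = cp_solid(T_LOW) / T_LOW**3

def entropy_solid(T):
    """Third-law entropy of the solid phase at T (T_LOW <= T <= T_FUS)."""
    # 0 -> T_LOW with Cp = a*T^3: the integrand a*T^2 is finite, integral = a*T_LOW^3/3
    s = a * T_LOW**3 / 3.0
    # T_LOW -> T with the empirical fit
    s += quad(lambda t: cp_solid(t) / t, T_LOW, T)[0]
    return s

def entropy_liquid(T):
    """Entropy above the melting point: add the fusion term, then the liquid integral."""
    s = entropy_solid(T_FUS) + DH_FUS / T_FUS
    s += quad(lambda t: CP_LIQ / t, T_FUS, T)[0]
    return s

print(f"S(298.15 K) ~ {entropy_solid(298.15):.1f} J/(mol K)   (placeholder inputs)")
print(f"S(800 K)    ~ {entropy_liquid(800.0):.1f} J/(mol K)   (placeholder inputs)")
```

The stitching temperature and the way the low-temperature coefficient is fixed are modelling choices, not physics; in practice the Debye (or Debye-plus-electronic) form is fitted to measured low-temperature heat capacities rather than forced to match the high-temperature polynomial.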
 

FAQ: Thermodynamics - calculate entropy

1. What is entropy in thermodynamics?

Entropy is a measure of the disorder or randomness of a system. In classical thermodynamics it is often described as the portion of a system's energy that is unavailable to do useful work.

2. How do you calculate entropy?

To calculate an entropy change, you need to know the heat transferred reversibly (q_rev) and the temperature (T) at which it is transferred. For a process at constant temperature, ΔS = q_rev/T; more generally, ΔS = ∫ dq_rev/T. Entropy is expressed in joules per kelvin (J/K).
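For example, using the well-known enthalpy of fusion of ice, about 6.01 kJ/mol at its normal melting point of 273.15 K:

$$\Delta S_{\text{fus}} = \frac{q_{\text{rev}}}{T} = \frac{6010\ \text{J/mol}}{273.15\ \text{K}} \approx 22\ \text{J/(mol K)}.$$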

3. What is the unit of measurement for entropy?

The SI unit of entropy is the joule per kelvin (J/K). Molar entropies, such as those tabulated for substances, are usually quoted in joules per mole per kelvin, J/(mol·K).

4. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases, and it increases in any spontaneous (irreversible) process. In any real energy transformation, the amount of energy unavailable to do work grows and the system as a whole becomes more disordered.

5. Can entropy be negative?

Absolute (third-law) entropies are never negative, because they are measured from a value of zero at 0 K. The entropy change of a system can be negative, however, as when a liquid freezes into an ordered crystal; in that case the entropy of the surroundings increases by at least as much, so the total entropy does not decrease.
