Entropy of a continuous system

Discussion Overview

The discussion centers on defining the entropy of continuous systems, particularly the electromagnetic field. Participants explore various approaches and challenges related to statistical mechanics and thermodynamics in this context.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants question whether the log of the phase-space volume applies to continuous systems, suggesting that the infinite number of Fourier variables leads to complications.
  • Others propose that if a temperature can be assigned, as with blackbody radiation, entropy can be defined, although some seek a more statistical approach.
  • A participant suggests using entropy density, noting that it is not conserved except in reversible situations and that there is a rate of entropy creation due to irreversible processes.
  • Concerns are raised about the infinite dimensionality of phase space for fields and the difficulties in defining volume in this context.
  • References to external papers are provided to support various viewpoints, indicating a search for deeper statistical mechanics insights.

Areas of Agreement / Disagreement

Participants express differing opinions on the validity of using phase space volume and the definition of entropy in continuous systems. No consensus is reached, and multiple competing views remain throughout the discussion.

Contextual Notes

Limitations include the unresolved nature of defining phase space volume for infinite dimensions and the dependence on specific assumptions regarding temperature and system behavior.

dEdt
How could the entropy of a continuous system, like the electromagnetic field, be defined? Obviously you can't use something like the log of the phase space volume, but I can't think of anything that would work.
 
dEdt said:
Obviously you can't use something like the log of the phase space volume,
Why do you think so? I think you may be right, since the number of Fourier variables (harmonic oscillators) is infinite, which makes the energy infinite.
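To spell out the divergence (a standard argument, added here as a sketch): in a box, the free field separates into independent Fourier-mode oscillators, and classical equipartition assigns each one average energy [itex]k_B T[/itex], so

[tex]H=\sum_{\mathbf{k},\lambda}\tfrac{1}{2}\left(p_{\mathbf{k}\lambda}^2+\omega_k^2 q_{\mathbf{k}\lambda}^2\right),\qquad \langle H\rangle=\sum_{\mathbf{k},\lambda}k_B T\to\infty,[/tex]

since there are infinitely many modes (the classical ultraviolet catastrophe).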
 
What's wrong with phase space volume? You can still write the Hamiltonian for a continuous system - it would just be a field theory now.
 
Andy Resnick said:
If you can assign a temperature (which you can for blackbody radiation), you can define the entropy:

http://128.113.2.9/dept/phys/courses/PHYS4420/BlackBodyThermo.pdf

Thanks for the paper, but I'm looking for a more statistical approach.

Jano L. said:
Why do you think so? I think you may be right, since the number of Fourier variables (harmonic oscillators) is infinite, which makes the energy infinite.

Jorriss said:
What's wrong with phase space volume? You can still write the Hamiltonian for a continuous system - it would just be a field theory now.

Well, the phase space of a field is infinite dimensional. I wouldn't even know how to define volume, and if I could I'd think the volume of basically any region would be infinite.
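(A standard way around this, sketched here rather than taken from the thread: for a free field in thermal equilibrium the density operator factorizes over modes, so the von Neumann entropy becomes a sum of finite per-mode Bose-Einstein terms,

[tex]S=-k_B\,\mathrm{Tr}\left(\hat\rho\ln\hat\rho\right)=\sum_{\mathbf{k},\lambda}k_B\left[(1+\bar n_{\mathbf{k}})\ln(1+\bar n_{\mathbf{k}})-\bar n_{\mathbf{k}}\ln\bar n_{\mathbf{k}}\right][/tex]

with [itex]\bar n_{\mathbf{k}}=1/(e^{\hbar\omega_k/k_B T}-1)[/itex]. High-frequency modes are essentially unoccupied at finite temperature, so the sum converges and no infinite-dimensional phase-space volume is ever needed.)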
 
dEdt said:
Thanks for the paper, but I'm looking for a more statistical approach.

Google is your friend:

http://home.comcast.net/~szemengtan/StatisticalMechanics/QuantumStatisticalMechanics.pdf

Section 5.3
 
dEdt said:
How could the entropy of a continuous system, like the electromagnetic field, be defined? Obviously you can't use something like the log of the phase space volume, but I can't think of anything that would work.

Use the entropy density (entropy per unit volume). It will, of course, not be conserved except in reversible situations. In general, there will be a rate of entropy creation per unit volume due to irreversible processes. If s is the entropy density, then [tex]\frac{\partial s}{\partial t}+\nabla \cdot \mathbf{J}_s=\frac{\partial s_c}{\partial t}[/tex] where [itex]\mathbf{J}_s[/itex] is the entropy flux and [itex]\partial s_c/\partial t[/itex] is the rate of creation of entropy density (always non-negative).

For example, for a simple fluid, the fundamental law says [itex]dU=T dS-P dV+\mu dN[/itex] where U is internal energy, T is temperature, S is entropy, P pressure, V volume, [itex]\mu[/itex] chemical potential, and N the number of particles. It follows that [itex]dS=(1/T)dU+(P/T)dV-(\mu/T)dN[/itex], and in terms of densities (the volume term drops, since each volume element is held fixed): [tex]\frac{\partial s}{\partial t}=\frac{1}{T}\frac{\partial u}{\partial t}-\frac{\mu}{T}\frac{\partial n}{\partial t}[/tex] where u is the internal energy density and n the particle density. And so forth.

In statistical mechanics terms, you are considering each infinitesimal volume element to be an open equilibrated system. To find the total entropy, integrate the entropy density over the total volume.
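For the blackbody case raised earlier in the thread, the mode-by-mode statistical entropy and the thermodynamic entropy density can be checked against each other numerically. A minimal sketch (not from the thread; the function name and integration grid are my own choices), summing the Bose-Einstein per-mode entropy over the photon modes and comparing with the thermodynamic result [itex]s=\tfrac{4}{3}aT^3[/itex], [itex]a=4\sigma/c[/itex]:

```python
import numpy as np

# Physical constants (SI, CODATA values)
hbar  = 1.054571817e-34   # reduced Planck constant, J s
kB    = 1.380649e-23      # Boltzmann constant, J / K
c     = 2.99792458e8      # speed of light, m / s
sigma = 5.670374419e-8    # Stefan-Boltzmann constant, W m^-2 K^-4

def photon_entropy_density(T, n_pts=200_000):
    """Blackbody entropy per unit volume at temperature T, obtained by
    integrating the Bose-Einstein per-mode entropy over all field modes."""
    # Dimensionless frequency x = hbar*omega/(kB*T); integrand is negligible past x ~ 50
    x = np.linspace(1e-6, 50.0, n_pts)
    n = 1.0 / np.expm1(x)                                # mean photon number per mode
    per_mode = (1.0 + n) * np.log1p(n) - n * np.log(n)   # entropy of one mode, in units of kB
    f = x**2 * per_mode
    # Trapezoid rule; the integral is analytically 4*pi**4/45
    integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))
    return kB * (kB * T / (hbar * c))**3 * integral / np.pi**2

T = 300.0
s_modes  = photon_entropy_density(T)          # statistical, mode-by-mode
s_thermo = (16.0 * sigma / (3.0 * c)) * T**3  # thermodynamic: (4/3) a T^3 with a = 4 sigma / c
print(s_modes, s_thermo)                      # both about 2.7e-8 J K^-1 m^-3
```

The agreement illustrates the point of the last post: each volume element is treated as an equilibrated open system, and the statistically computed entropy density matches the one obtained from temperature alone.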
 
