# Entropy of a continuous system

by dEdt
Tags: continuous, entropy
P: 186
How could the entropy of a continuous system, like the electromagnetic field, be defined? Obviously you can't use something like the log of the phase space volume, but I can't think of anything that would work.
Sci Advisor
P: 5,403
If you can assign a temperature (which you can for blackbody radiation), you can define the entropy: http://128.113.2.9/dept/phys/courses...BodyThermo.pdf
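For the blackbody case specifically, the temperature route gives a closed-form answer: a photon gas in a cavity has energy density $u = aT^4$ and entropy density $s = \frac{4}{3}aT^3$, with radiation constant $a = 4\sigma/c$, so $S = \frac{4}{3}aT^3V$. A minimal numerical sketch (temperature and volume values are illustrative):

```python
# Sketch: entropy of blackbody radiation in a cavity.
# Uses the standard photon-gas results u = a T^4 and s = (4/3) a T^3.
sigma = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
c = 2.99792458e8         # speed of light, m/s
a = 4 * sigma / c        # radiation constant, J m^-3 K^-4

T = 300.0                # cavity temperature in K (illustrative)
V = 1.0                  # cavity volume in m^3 (illustrative)

U = a * T**4 * V                  # total radiation energy
S = (4.0 / 3.0) * a * T**3 * V    # total radiation entropy

# Consistency check: for a photon gas, S = (4/3) U / T
assert abs(S - (4.0 / 3.0) * U / T) < 1e-9 * S
```

The $4/3$ ratio between $S$ and $U/T$ is what distinguishes the photon gas from, say, a classical ideal gas, and is a quick sanity check on any blackbody entropy calculation.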
P: 970
 Quote by dEdt Obviously you can't use something like the log of the phase space volume,
Why do you think so? I think you may be right, since the number of Fourier variables (harmonic oscillators) is infinite, which makes the energy infinite.

PF Patron
P: 990


What's wrong with phase space volume? You can still write the Hamiltonian for a continuous system - it would just be a field theory now.
P: 186
 Quote by Andy Resnick If you can assign a temperature (which you can for blackbody radiation), you can define the entropy: http://128.113.2.9/dept/phys/courses...BodyThermo.pdf
Thanks for the paper, but I'm looking for a more statistical approach.

 Quote by Jano L. Why do you think so? I think you may be right, since the number of Fourier variables (harmonic oscillators) is infinite, which makes the energy infinite.
 Quote by Jorriss What's wrong with phase space volume? You can still write the Hamiltonian for a continuous system - it would just be a field theory now.
Well, the phase space of a field is infinite-dimensional. I wouldn't even know how to define a volume there, and even if I could, I'd expect the volume of essentially any region to be infinite.
P: 5,403
 Quote by dEdt Thanks for the paper, but I'm looking for a more statistical approach.

http://home.comcast.net/~szemengtan/...lMechanics.pdf

Section 5.3
P: 786
 Quote by dEdt How could the entropy of a continuous system, like the electromagnetic field, be defined? Obviously you can't use something like the log of the phase space volume, but I can't think of anything that would work.
Use the entropy density (entropy per unit volume). It will, of course, not be conserved except in reversible situations. In general, there will be a rate of entropy creation per unit volume due to irreversible processes. If s is the entropy density, then $$\frac{\partial s}{\partial t}+\nabla\cdot \mathbf{J}_s=\frac{\partial s_c}{\partial t}$$ where $\mathbf{J}_s$ is the entropy flux and $\partial s_c/\partial t$ is the rate of creation of entropy density (always non-negative).
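A concrete instance of this balance: for steady 1D heat conduction with Fourier's law $J_q = -k\,dT/dx$, the entropy flux is $J_s = J_q/T$ and the local production rate works out to $k\,(dT/dx)^2/T^2 \ge 0$. A sketch with illustrative parameter values:

```python
import numpy as np

# Sketch: entropy production in steady 1D heat conduction (illustrative units).
# Fourier's law gives the heat flux J_q = -k dT/dx; the entropy flux is
# J_s = J_q / T, and the local entropy production rate is k (dT/dx)^2 / T^2.
k = 1.0                        # thermal conductivity (arbitrary units)
x = np.linspace(0.0, 1.0, 201)
T = 300.0 + 50.0 * x           # a linear (steady-state) temperature profile

dTdx = np.gradient(T, x)
J_q = -k * dTdx                # heat flux
J_s = J_q / T                  # entropy flux
sigma_c = k * dTdx**2 / T**2   # entropy production density

# Second law, locally: production is non-negative everywhere
assert np.all(sigma_c >= 0.0)
```

In this steady state $\partial s/\partial t = 0$, so the balance reduces to $\nabla\cdot\mathbf{J}_s = \partial s_c/\partial t$: numerically differentiating `J_s` reproduces `sigma_c`, which is a useful check that the signs in the balance equation are right.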

For example, for a simple fluid, the fundamental law says $dU=T dS-P dV+\mu dN$ where U is internal energy, T is temperature, S is entropy, P pressure, V volume, $\mu$ chemical potential, and N the number of particles. It follows that $dS=(1/T)dU+(P/T)dV-(\mu/T)dN$, and in terms of densities (for a fixed volume element, so the $dV$ term drops out): $$\frac{\partial s}{\partial t}=\frac{1}{T}\frac{\partial u}{\partial t}-\frac{\mu}{T}\frac{\partial n}{\partial t}$$ where u is the internal energy density and n is the particle density. And so forth.
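One way to see the density form at work: for a monatomic ideal gas, the Sackur-Tetrode formula gives $s(u,n)$ explicitly, and the relation $(\partial s/\partial u)_n = 1/T$ can be checked numerically. A sketch (all parameter values are illustrative):

```python
import math

# Sketch: checking the Gibbs relation in density form,
# ds = (1/T) du - (mu/T) dn, for a monatomic ideal gas via the
# Sackur-Tetrode entropy density. Values below are illustrative.
kB = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J s
m = 6.6e-27          # particle mass (roughly helium), kg

def s(u, n):
    """Entropy per unit volume as a function of energy density u and number density n."""
    return n * kB * (math.log((1.0 / n) * (4 * math.pi * m * u / (3 * n * h**2))**1.5) + 2.5)

n = 2.5e25            # particles per m^3 (roughly atmospheric)
T = 300.0             # temperature in K
u = 1.5 * n * kB * T  # ideal-gas energy density at this T

# Numerical partial derivative (ds/du) at fixed n should equal 1/T
du = u * 1e-6
dsdu = (s(u + du, n) - s(u - du, n)) / (2 * du)
assert abs(dsdu - 1.0 / T) < 1e-6 / T
```

The same finite-difference trick applied to $n$ at fixed $u$ recovers $-\mu/T$, which is the other coefficient in the density form above.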

In statistical mechanics terms, you are considering each infinitesimal volume element to be an open equilibrated system. To find the total entropy, integrate the entropy density over the total volume.
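A sketch of that last step — integrating an entropy density field over the volume. Here the field is blackbody radiation, $s = \frac{4}{3}aT^3$, with an illustrative 1D temperature profile over a slab of unit cross-section:

```python
import numpy as np

# Sketch: total entropy as the volume integral of an entropy density field.
# The field here is blackbody radiation with a position-dependent temperature
# (illustrative 1D profile, unit cross-sectional area).
sigma = 5.670374419e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
c = 2.99792458e8                # speed of light, m/s
a = 4 * sigma / c               # radiation constant, J m^-3 K^-4

x = np.linspace(0.0, 1.0, 1001)               # 1 m slab
T = 300.0 + 100.0 * np.sin(np.pi * x)         # illustrative temperature profile

s = (4.0 / 3.0) * a * T**3                    # local entropy density

# Trapezoidal rule: integrate the density over the (unit cross-section) volume
S_total = float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(x)))

assert S_total > 0.0
```

Since the slab has unit length and cross-section, `S_total` must land between the minimum and maximum of the local density — a quick bound to sanity-check the integration.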
