dEdt said: How could the entropy of a continuous system, like the electromagnetic field, be defined? Obviously you can't use something like the log of the phase space volume, but I can't think of anything that would work.
dEdt said: Obviously you can't use something like the log of the phase space volume
Jano L. said: Why do you think so? I think you may be right, since the number of Fourier variables (harmonic oscillators) is infinite, which makes the energy infinite.
Andy Resnick said: If you can assign a temperature (which you can for blackbody radiation), you can define the entropy:
http://128.113.2.9/dept/phys/courses/PHYS4420/BlackBodyThermo.pdf
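That thermodynamic route can be sketched numerically. This is my own illustration, not taken from the linked notes; it uses the standard blackbody results U = aVT^4 with a = 4σ/c, so that integrating dS = dU/T at constant volume gives S = (4/3)aVT^3:

```python
# Sketch: entropy of blackbody radiation once a temperature is assigned.
# For photons in a box of volume V, U = a*V*T^4 with a = 4*sigma/c, and
# integrating dS = dU/T at constant V gives S = (4/3)*a*V*T^3.

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
C     = 2.99792458e8     # speed of light, m/s
A     = 4 * SIGMA / C    # radiation constant a, J m^-3 K^-4

def blackbody_entropy(T, V=1.0):
    """Entropy (J/K) of blackbody radiation at temperature T (K) in volume V (m^3)."""
    return (4.0 / 3.0) * A * V * T**3

# Entropy scales as T^3: doubling the temperature multiplies S by 8.
print(blackbody_entropy(300.0))                             # J/K per m^3 at room temperature
print(blackbody_entropy(600.0) / blackbody_entropy(300.0))  # ratio 8 (T^3 scaling)
```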
Jano L. said: Why do you think so? I think you may be right, since the number of Fourier variables (harmonic oscillators) is infinite, which makes the energy infinite.
Jorriss said: What's wrong with phase space volume? You can still write the Hamiltonian for a continuous system - it would just be a field theory now.
dEdt said: Thanks for the paper, but I'm looking for a more statistical approach.
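One way to make the statistical approach concrete (my own sketch, not a solution posted in the thread): treat each Fourier mode of the field as a quantum harmonic oscillator in a thermal state. Each mode contributes the standard Planck-oscillator entropy, and although there are infinitely many modes, the high-frequency ones are frozen out, so the mode sum converges to the blackbody value:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
KB   = 1.380649e-23      # Boltzmann constant, J/K
C    = 2.99792458e8      # speed of light, m/s

def mode_entropy(x):
    """Entropy (in units of k_B) of one photon mode with x = hbar*omega/(k_B*T)."""
    n = 1.0 / math.expm1(x)   # Bose-Einstein occupation number
    return (1 + n) * math.log(1 + n) - n * math.log(n)

def field_entropy(T, V=1.0, steps=200_000, x_max=50.0):
    """Sum the per-mode entropy over the photon density of states
    g(omega) = V*omega^2 / (pi^2 c^3)  (both polarizations included)."""
    dx = x_max / steps
    total = 0.0
    for i in range(1, steps + 1):
        x = i * dx
        omega = x * KB * T / HBAR
        g = V * omega**2 / (math.pi**2 * C**3)       # modes per unit omega
        total += g * mode_entropy(x) * (KB * T / HBAR) * dx
    return KB * total

# Infinitely many oscillators, but mode_entropy(x) -> 0 as x -> infinity,
# so the integral converges to the blackbody result S = (4/3)*(4*sigma/c)*V*T^3.
```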
The entropy of a continuous system, as of any system, measures the disorder or randomness of its state. Quantitatively, it is proportional to the logarithm of the number of accessible microstates (configurations) of the system.
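The "number of configurations" picture can be illustrated with Boltzmann's formula S = k ln Ω; the toy spin count and Gaussian below are my own examples, not from the thread. The continuous case also shows why this thread's question is subtle: differential entropy has no absolute zero point without a reference measure (quantum mechanics supplies one via Planck's constant):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """S = k * ln(Omega): entropy from a count of microstates."""
    return KB * math.log(omega)

# Toy example: 100 two-state spins with 40 'up'.  The number of
# configurations is the binomial coefficient C(100, 40).
S_spins = boltzmann_entropy(math.comb(100, 40))

# For a continuous variable there is no literal count of states; the
# analogue is the differential entropy h = -integral p(x) ln p(x) dx,
# e.g. h = 0.5*ln(2*pi*e*sigma^2) for a Gaussian.  It can be negative
# and shifts under a change of variables, which is one reason the naive
# "log of phase-space volume" needs a reference cell (e.g. Planck's h).
def gaussian_differential_entropy(sigma):
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)
```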
The second law of thermodynamics states that the total entropy of an isolated system never decreases over time; it stays constant only for ideal, reversible processes. Entropy thus sets the direction of natural processes, which evolve toward states of maximum disorder.
Entropy carries units of joules per kelvin (J/K), because a reversible transfer of heat dQ at absolute temperature T changes a system's entropy by dS = dQ/T.
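A minimal numerical illustration of those units (the example values are my own):

```python
# dS = dQ_rev / T: reversible heat transfer changes entropy by
# energy divided by temperature, which is why S carries units of J/K.
def entropy_change(Q_joules, T_kelvin):
    """Entropy change (J/K) for reversible heat Q (J) at temperature T (K)."""
    return Q_joules / T_kelvin

# Example: 1000 J absorbed reversibly at 300 K raises S by about 3.33 J/K.
dS = entropy_change(1000.0, 300.0)
```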
An increase in entropy is often associated with reduced order and less predictable behavior. It does not always reduce stability, however: in some systems (entropic stabilization of colloids, for example) the higher-entropy state is the thermodynamically more stable one.
According to the second law of thermodynamics, the total entropy of an isolated system cannot decrease over time. A subsystem within it can, however, experience a decrease in entropy when energy flows in from outside; that local decrease is offset by an at least equal increase in the entropy of the surroundings.
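That bookkeeping can be sketched with illustrative numbers (my own, not from the thread): a refrigerator-like process pumps heat out of a cold subsystem, and the entropy dumped into the warmer environment more than compensates for the subsystem's loss:

```python
# Entropy bookkeeping for a refrigerator-like process: heat Q is pumped
# out of a cold subsystem at T_cold and dumped, together with the work W
# driving the pump, into a warm environment at T_hot.  The subsystem's
# entropy falls, but the environment's rises by at least as much.
def total_entropy_change(Q, W, T_cold, T_hot):
    dS_cold = -Q / T_cold        # subsystem: entropy decreases
    dS_hot = (Q + W) / T_hot     # environment: entropy increases
    return dS_cold, dS_hot, dS_cold + dS_hot

cold, hot, total = total_entropy_change(Q=100.0, W=30.0, T_cold=250.0, T_hot=300.0)
# cold = -0.4 J/K, hot ~ +0.433 J/K, total ~ +0.033 J/K: net increase,
# consistent with the second law (W exceeds the Carnot minimum of 20 J).
```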