- #1

- Thread starter Monsterboy

Why should entropy depend on the observer? I have often heard people say "entropy depends on the observer."

- #2

Lord Jestocost

Gold Member

From the abstract of Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241:

"The second law of thermodynamics says that energy of all kinds in our material world disperses or spreads out if it is not hindered from doing so. Entropy is the quantitative measure of that kind of spontaneous process: how much energy has flowed from being localized to becoming more widely spread out (at a specific temperature)."


- #4


A system of many particles in classical physics is completely described by its location in "phase space". If you give the position and momentum of every single particle, that specifies the phase-space location. For N particles this is a point in 6N-dimensional space, because you have to specify:

- ##x_1, y_1, z_1, p_{x1}, p_{y1}, p_{z1}##
- ##x_2, y_2, z_2, p_{x2}, p_{y2}, p_{z2}##

- etc.

Now, if you don't know precisely which 6N values give the system's location in phase space, you can quantify your ignorance by giving a "volume" in phase space, meaning the system is located somewhere in that volume. Boltzmann defined the entropy of a system as ##S = k \log(W)##, where ##k## is Boltzmann's constant, ##W## is the volume the system might occupy in phase space, and ##\log## is the natural logarithm. The bigger that number, the more uncertain you are about the system's precise location in phase space.
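As a concrete illustration of ##S = k \log(W)## (my example, not from the thread): for an ideal gas the positional part of the phase-space volume scales as ##V^N##, so letting the gas expand freely into twice the volume multiplies ##W## by ##2^N##. A minimal Python sketch:

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann's constant

# Sketch (assumption: ideal gas, so the accessible phase-space volume
# scales as V**N in the positions). Doubling the container volume
# multiplies W by 2**N, so Delta S = k ln(W2/W1) = N k ln 2.
def delta_S_expansion(N, volume_ratio):
    # Delta S = k * ln(volume_ratio**N) = N * k * ln(volume_ratio)
    return N * k_B * math.log(volume_ratio)

N_A = 6.02214076e23  # one mole of particles
print(delta_S_expansion(N_A, 2.0))  # ≈ 5.76 J/K, i.e. R ln 2
```

For a mole of gas the result is ##R \ln 2 \approx 5.76## J/K, the familiar entropy of mixing/expansion value.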

This notion of entropy is subjective, because different people might have different amounts of information about the system, and might use a different volume in phase space.

- #5

Lord Jestocost

Gold Member

Monsterboy said:
"Why should entropy depend on the observer? I have often heard people say "entropy depends on the observer.""

"The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫C_p/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)
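Lambert's calorimetric definition can be made concrete numerically. A sketch (the Debye-like ##C_p = aT^3## form and the coefficient are illustrative assumptions, not from the thread): integrate ##C_p/T## from near 0 K and compare with the closed form, since for ##C_p = aT^3## the integral is ##aT^3/3##.

```python
import numpy as np

# Hypothetical low-temperature solid with Debye-like heat capacity
# Cp = a*T**3 (assumed form); then S(T) = integral_0^T Cp/T' dT' = a*T**3/3.
a = 1.94e-3                           # J/(mol K^4), illustrative coefficient
T = np.linspace(1e-6, 20.0, 200_001)  # start just above 0 K to avoid 0/0
integrand = a * T**3 / T              # Cp/T = a*T**2
# trapezoidal rule written out by hand, to keep the sketch self-contained
S_numeric = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T)))
S_exact = a * 20.0**3 / 3
print(S_numeric, S_exact)  # both ≈ 5.17 J/(mol K)
```

In practice the same trapezoidal sum works on tabulated ##C_p(T)## data, which is how calorimetric "third-law" entropies are actually obtained.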

- #6


Lord Jestocost said:
""The entropy of a substance, its entropy change from 0 K to any T, is a measure of the energy that can be dispersed within the substance at T: integration from 0 K to T of ∫C_p/T dT (+ q/T for any phase change)." (Frank L. Lambert, "Entropy Is Simple, Qualitatively", J. Chem. Educ., 2002, 79 (10), p 1241)"

Well, that definition of entropy is a little circular, because ##T## is in turn defined via ##1/T = \frac{\partial S}{\partial U}|_{V}##.

- #7

Lord Jestocost

Gold Member

##dS = dQ/T##, where "d" connotes a very small change.

- #8

Lord Jestocost said:
"##dS = dQ/T##, where "d" connotes a very small change."

The question is: how is ##T## defined?

In statistical mechanics, entropy is the primary quantity, and temperature is defined in terms of how entropy changes when you add a small amount of energy.
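That statistical definition can be checked numerically. A sketch (assuming the standard Sackur-Tetrode entropy of a monatomic ideal gas, with helium's atomic mass as an arbitrary choice): differentiate ##S(U, V, N)## with respect to ##U## and invert to recover ##T##.

```python
import math

k_B = 1.380649e-23   # J/K
h = 6.62607015e-34   # J s, Planck's constant
m = 6.6464731e-27    # kg, helium-4 atom (arbitrary illustrative choice)

# Sackur-Tetrode entropy of a monatomic ideal gas, used here only as a
# concrete S(U, V, N) to differentiate.
def S(U, V, N):
    return N * k_B * (math.log(V / N * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

N = 1e22
V = 1e-3                      # m^3
U = 1.5 * N * k_B * 300.0     # internal energy of the gas at 300 K
dU = U * 1e-6
dS_dU = (S(U + dU, V, N) - S(U - dU, V, N)) / (2 * dU)  # 1/T = (∂S/∂U)_V
T = 1.0 / dS_dU
print(T)  # ≈ 300 K, consistent with U = (3/2) N k T
```

The finite difference reproduces ##T = 2U/(3Nk)##, i.e. temperature really is just a derived property of how ##S## responds to added energy.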

- #9

Lord Jestocost

Gold Member


When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process to define the entropy in state 1:

##S_1 = S_0 + \int_0^1 \frac{\delta Q_{rev}}{T}##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles, etc.).
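The reversible-path construction can be checked numerically. A sketch (heating a monatomic ideal gas at constant volume is my assumed choice of path, not a detail from the post): then ##\delta Q_{rev} = C_V\,dT## and the integral reduces to ##C_V \ln(T_1/T_0)##.

```python
import math

R = 8.314462618      # J/(mol K), gas constant
n, T0, T1 = 1.0, 300.0, 600.0
C_V = 1.5 * n * R    # monatomic ideal gas at constant volume (assumption)

# S1 - S0 = integral of dQ_rev / T along the path, with dQ_rev = C_V dT,
# evaluated by the midpoint rule:
steps = 100_000
dT = (T1 - T0) / steps
dS_numeric = sum(C_V / (T0 + (i + 0.5) * dT) * dT for i in range(steps))
dS_exact = C_V * math.log(T1 / T0)
print(dS_numeric, dS_exact)  # both ≈ 8.64 J/K
```

Any other reversible path between the same two states gives the same ##\Delta S##, which is what makes the definition consistent.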

- #10


Lord Jestocost said:
"When transferring a system from state 0 to state 1 (both characterized by a set of selected macroscopic observables), you can in principle use any reversible process to define the entropy in state 1:

##S_1 = S_0 + \int_0^1 \frac{\delta Q_{rev}}{T}##

The "subjective" part is merely the definition of the macroscopic observables you want to keep track of for the given system (temperature, pressure, volume, number of particles etc.)."

Point taken, but there's another issue even after you've chosen the macroscopic variables. Given macroscopic variables ##E, V, N## (total energy, volume and number of particles), there are many (infinitely many in the classical case, and astronomically many in the quantum case) microstates consistent with that macrostate. But are they all equally likely? If not, what's the probability distribution?

You can just define "equilibrium" so that equal likelihood is part of the definition, I suppose. Then your claims about entropy are objectively true for a system in equilibrium.
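The counting picture can be made explicit with a toy model (my example, not from the thread): for N two-level spins, the macrostate "exactly n spins excited" fixes the energy, and if every compatible microstate is equally likely, their number is ##\binom{N}{n}##, giving ##S = k \ln \binom{N}{n}##.

```python
import math

k_B = 1.380649e-23  # J/K

# Toy model: N two-level spins, macrostate = "exactly n_up spins excited".
# Under the equal-likelihood assumption, Omega = C(N, n_up) and the
# Boltzmann entropy of the macrostate is S = k ln Omega.
def spin_entropy(N, n_up):
    omega = math.comb(N, n_up)
    return k_B * math.log(omega)

# The half-excited macrostate is compatible with vastly more microstates
# than a nearly ordered one, hence it carries the larger entropy.
print(spin_entropy(100, 50) > spin_entropy(100, 10))  # True
```

This is exactly the "many microstates per macrostate" point: the entropy assigned to ##(E, V, N)## depends on how many microstates you count as compatible and on the probability distribution you assume over them.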

