- #1

Thank you!

KiltedEngineer


- Thread starter KiltedEngineer


- #2


One way to mathematically define entropy S is to relate it to the probability distribution describing a system: [tex]S=-k \sum_{i=1}^{N} p_i \log(p_i)[/tex]

The index i labels the different possible configurations of the system, of which there are N total, k is Boltzmann's constant, and p_{i} is the probability that the system is in state i. [Each p_{i} is a number from 0 to 1, so its logarithm lies between -∞ and 0; every term -p_{i} log(p_{i}) is therefore non-negative, and the entropy is always greater than or equal to zero.]
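To see the formula in action, here is a short Python sketch (my own illustration, not from the thread) that evaluates S for two simple distributions, taking k = 1 for convenience:

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum_i p_i * log(p_i); a term with p_i = 0 contributes nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# One certain state: N = 1, p_1 = 1, so S = 0.
certain = gibbs_entropy([1.0])

# Four equally likely states: S = k * log(4), strictly greater than zero.
spread = gibbs_entropy([0.25, 0.25, 0.25, 0.25])
```

The guard `p > 0` handles the conventional limit p log(p) → 0 as p → 0, so states with zero probability simply drop out of the sum.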

If a system has only one possible state, then N=1 and p_{1}=1, and thus S=0. So if we know the configuration of a system exactly, it has zero entropy, and we can describe this state as "ordered." For example, if we have a crystal so cold that all the atoms sit exactly in their lattice sites, then we know the state of the crystal exactly, so its entropy is zero. The regular pattern of the atoms' locations is what we refer to as the crystal's order.

On the other hand, if the system could be in many different states (i.e. N is big and many of the p_{i} are non-negligible compared to the rest), then the entropy will be large. As an example, when we heat up a crystal, the atoms start wiggling about their lattice sites, so if we were to look at the crystal at various times, the atoms could be in many different positions. It therefore has a higher entropy than the cold crystal. There is somewhat less order because the atoms no longer conform exactly to the lattice pattern.
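To make the link between "many accessible states" and "large entropy" concrete, consider the special case of a uniform distribution, p_{i} = 1/N for every i. The sum then collapses to

[tex]S = -k \sum_{i=1}^{N} \frac{1}{N} \log\!\left(\frac{1}{N}\right) = k \log N,[/tex]

so the entropy grows with the number of accessible states, and this uniform case is in fact the maximum the formula allows for a given N.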



- #3


An example of how microstates and macrostates are related is rolling a pair of dice. There are 11 macrostates of the system, the possible sums 2 through 12 shown on the dice. There are 36 microstates, the combinations of ways the two dice can land. Six microstates correspond to the macrostate "7", but only one microstate corresponds to the macrostate "2".
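The dice counting above is easy to verify by brute force. A quick Python check (my own, not part of the original post):

```python
from collections import Counter

# Every equally likely way the two dice can land is one microstate;
# the sum they show is the macrostate.
microstates = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
macrostate_counts = Counter(d1 + d2 for d1, d2 in microstates)

# 36 microstates total; 6 of them give the macrostate "7"
# (1+6, 2+5, 3+4, 4+3, 5+2, 6+1), while only 2+2... no: only 1+1 gives "2".
```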

We can talk about the entropy of a substance, like a gas, without referring to a thermodynamic process. You can think of it as a property, like temperature or pressure. When we measure the temperature of a system, it's really a kind of average of the energy. Leaving units aside for a while, say the average energy of the system is 10 and we have 3 things in our system. All 3 things can have an energy of 10, making the average 10. Alternatively, 1 of them can have energy 28 and the other 2 can have energy 1, making the average 10 again. There are a number of ways we can distribute the energy among the particles (microstates) and still get the same average (macrostate), just like in the dice example. Classically, the change in entropy is defined as the heat transferred divided by the temperature at which the transfer takes place.
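The toy model of 3 things sharing energy can also be enumerated directly. In the sketch below (my own illustration) I treat the energies as nonnegative integers summing to a fixed total of 30, so the average is 10; every such split is one microstate of the macrostate "average energy 10":

```python
# Three "things" share a fixed total energy of 30 (average 10).
# Each ordered nonnegative-integer split (a, b, c) with a+b+c = 30
# is one microstate of the macrostate "average energy 10".
microstates = [(a, b, 30 - a - b)
               for a in range(31)
               for b in range(31 - a)]
```

Both distributions mentioned in the post, (10, 10, 10) and (28, 1, 1), show up among the many microstates, just as every pair of dice faces showed up in the dice example.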

Now, we also talk about entropy when we refer to a thermodynamic process, like heat transfer. Heat always flows from a higher temperature to a lower temperature, and as the energy is transferred, the entropy of the hotter source decreases while the entropy of the colder source increases. This happens because the number of accessible states is tied to the temperature of the substance. The second law of thermodynamics states that the combined entropy of the total system (the hot and cold sources together) can never decrease as a result of the heat transfer. Since "disorder" is linked to the number of microstates that correspond to the macrostate, the "disorder" either stays the same or increases.
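Using the classical definition (entropy change = heat transferred / temperature), the second-law bookkeeping for a heat transfer can be written out in a few lines. The reservoir temperatures and heat value below are made-up numbers for illustration:

```python
def total_entropy_change(q, t_hot, t_cold):
    """Net entropy change of both reservoirs when heat q (J) flows
    from a reservoir at t_hot to one at t_cold (temperatures in K):
    the hot source loses q/t_hot, the cold source gains q/t_cold."""
    return q / t_cold - q / t_hot

# Example: 100 J flowing from a 400 K reservoir to a 300 K reservoir.
delta_s = total_entropy_change(100.0, 400.0, 300.0)
```

Because t_cold < t_hot, the gain q/t_cold always outweighs the loss q/t_hot, so the total entropy change is positive, exactly as the second law requires.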
