Understanding Entropy: Definition, Relationship to S=Q/T, and Impact of dS=dQ/T

  • Thread starter BoyangQin
  • Start date
  • Tags
    Entropy
In summary, entropy is a measure of the number of microscopic states available to a system; it increases when, for example, the volume accessible to the particles grows, or when heating broadens the distribution of molecular speeds.
  • #1
BoyangQin
A line about entropy on wikipedia says:

"The entropy of a molecule is defined as S=K*ln(Ω) , where Ω is the number of states available to the molecule."

What does "states available" mean? And how does it relate to the other definition of S, S = Q/T?

BTW,
Does the equation dS = dQ/T mean that if an infinitesimal amount of heat dQ is passed into a system from the surroundings, then the entropy of the system MUST rise by dQ/T?
 
  • #2
First, recall that dS = dQ/T is only valid for a reversible (quasistatic) process. That said, Wikipedia's page "Entropy (statistical thermodynamics)" gives a derivation in the box on the right side of the page, for a canonical system. It uses the first law to establish the result.

http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

Note that the formula used for that derivation,

[tex]S = -k_B\sum p_i \ln p_i[/tex]

reduces to [itex]S = k_B\ln \Omega[/itex] when the system is microcanonical, as then [itex]p_i = 1/\Omega[/itex].
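That reduction is easy to check numerically. The sketch below (my own illustration, not from the thread) evaluates the Gibbs formula for a uniform distribution over Ω states and compares it with k·ln(Ω):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p_i * ln p_i), skipping zero-probability terms."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Microcanonical case: Omega equally likely states, each with p_i = 1/Omega
omega = 1000
uniform = [1.0 / omega] * omega

s_gibbs = gibbs_entropy(uniform)
s_boltzmann = k_B * math.log(omega)
# The two formulas agree for a uniform distribution
assert abs(s_gibbs - s_boltzmann) < 1e-30
```

For any non-uniform distribution over the same Ω states, the Gibbs formula gives a strictly smaller value, which is why the uniform (microcanonical) case is the maximum-entropy one.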
 
  • #3
As for your question "What does that 'states available' mean?" --

Textbooks give the example of the free expansion of a gas into a vacuum. They say, due to the volume increase, there are now "more points in space" available for particles to occupy, therefore there are more possible states which are all equally probable, therefore entropy has been increased. Another example the textbooks give, after you add heat to a system, so that the peak of the distribution of particle speeds has moved up, there are now more possible values of speed that molecules can have.

I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. Any space has an infinite number of points, and any continuous distribution of speeds has an infinite number of possible values. Okay, one infinity can be larger than another infinity, and I don't have any trouble with that. What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.
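One standard way out of the "logarithm of infinity" problem is coarse-graining: divide space into cells of some fixed (arbitrary) volume, so Ω becomes finite. The cell size then drops out of any entropy *difference*. The sketch below is a minimal illustration of that cancellation for the free-expansion example, assuming Ω ∝ (V/cell)^N for N non-interacting particles; the function names are my own:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_omega(n, volume, cell):
    """ln of the number of spatial configurations: Omega = (V/cell)^n,
    so ln Omega = n * ln(V/cell). 'cell' is an arbitrary coarse-graining volume."""
    return n * math.log(volume / cell)

def delta_S(n, v1, v2, cell):
    """Entropy change on free expansion from volume v1 to v2."""
    return k_B * (ln_omega(n, v2, cell) - ln_omega(n, v1, cell))

n = 6.022e23                             # one mole of particles
dS_a = delta_S(n, 1.0, 2.0, cell=1e-30)  # doubling the volume, fine grid
dS_b = delta_S(n, 1.0, 2.0, cell=1e-27)  # same expansion, coarser grid
# Both equal N * k_B * ln 2, about 5.76 J/K: the arbitrary cell size
# cancels in the difference, so only entropy *changes* are unambiguous here.
```

The same trick works for the velocity distribution, and quantum mechanics ultimately supplies a natural cell size (of order h³ per particle in phase space).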
 
  • #4
mikelepore said:
I find it tough to reconcile those explanations with the mathematical form that has the logarithm in it. ... What I have trouble with is visualizing the act of taking the logarithm of that infinity and multiplying it by the Boltzmann constant.

There's a problem with taking logs of probability densities. P(x)dx is a probability and so it is unitless, which means P(x) itself carries units, and taking the log of a quantity with units is ill-defined. One way to get round that is to say that only entropy differences make sense; then you get the log of the ratio P(x)/Q(x) and the units disappear. More generally, the mutual information is invariant under an invertible change of variables. Or you can insist that quantum mechanics is true, so the states really are discrete.
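Closed-form Gaussian results make this unit-dependence concrete. Below is a small sketch of my own (not from the thread): the differential entropy of a Gaussian shifts by ln(1000) when you switch from metres to millimetres, while a log-of-ratio quantity (here the KL divergence between two equal-variance Gaussians) is unchanged:

```python
import math

def diff_entropy_gaussian(sigma):
    """Differential entropy of N(mu, sigma^2): h = 0.5 * ln(2*pi*e*sigma^2).
    sigma carries units, so h shifts when the units change."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def kl_gaussians(mu1, mu2, sigma):
    """KL divergence between two equal-variance Gaussians:
    D = (mu1 - mu2)^2 / (2 sigma^2) -- a log of a *ratio* of densities,
    so the units cancel."""
    return (mu1 - mu2) ** 2 / (2 * sigma**2)

# The same physical distribution, measured in metres vs millimetres:
h_m  = diff_entropy_gaussian(0.01)    # sigma = 0.01 m
h_mm = diff_entropy_gaussian(10.0)    # sigma = 10 mm
# h_mm - h_m = ln(1000): differential entropy depends on the units chosen.

# The KL divergence is invariant under the same rescaling:
d_m  = kl_gaussians(0.0, 0.02, 0.01)  # in metres
d_mm = kl_gaussians(0.0, 20.0, 10.0)  # in millimetres
# d_m == d_mm: the ratio form removes the unit dependence.
```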
 

1. What is entropy and how is it defined?

Entropy is a measure of the disorder or randomness in a system. Thermodynamically, a change in entropy is defined as the heat energy reversibly absorbed by a system divided by the absolute temperature at which it is absorbed.

2. What is the relationship between entropy and the equation S=Q/T?

The equation S = Q/T relates entropy to heat energy (Q) and absolute temperature (T) for heat transferred reversibly at constant temperature. In differential form, dS = dQ/T: the change in entropy equals the heat reversibly added divided by the temperature at which it is added.

3. How does the concept of dS=dQ/T impact entropy?

The concept of dS = dQ/T shows that the change in entropy is directly proportional to the heat added and inversely proportional to the temperature at which it is added. This means that adding heat increases entropy, and that a given amount of heat produces a larger entropy change when it is transferred at a lower temperature.
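A standard worked example of ΔS = Q/T at (nearly) constant temperature is melting ice; the sketch below uses an approximate latent heat value:

```python
# Entropy change for reversible heat transfer at constant T: Delta S = Q / T.
# Example: melting 1 kg of ice at 0 C (273.15 K).
latent_heat = 334e3       # J/kg, latent heat of fusion of water (approximate)
T = 273.15                # K, melting point at atmospheric pressure
Q = 1.0 * latent_heat     # heat absorbed by 1 kg of ice
dS = Q / T                # about 1223 J/K
# The same Q delivered at a higher temperature would give a smaller Delta S.
```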

4. How does entropy impact the behavior of a system?

Entropy impacts the behavior of a system by determining the direction in which processes occur. In general, processes tend to occur in the direction that increases the overall entropy of the system. This means that systems tend to become more disordered and random over time.

5. Can entropy be reversed or decreased?

In most cases, entropy cannot be reversed or decreased. According to the Second Law of Thermodynamics, the total entropy of an isolated system will always increase over time. However, it is possible to decrease entropy in a local system by expending energy, but this will result in an overall increase in entropy elsewhere.
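The trade-off between a local decrease and a larger increase elsewhere can be checked with a one-line calculation. The sketch below considers heat Q flowing from a hot reservoir to a cold one (values chosen for illustration):

```python
# Heat Q leaving a hot reservoir lowers its entropy by Q/T_hot, while the
# cold reservoir gains Q/T_cold. Because T_cold < T_hot, the total change
# is positive, consistent with the second law.
Q = 1000.0                      # J transferred
T_hot, T_cold = 400.0, 300.0    # K
dS_hot = -Q / T_hot             # -2.5 J/K: a local entropy decrease
dS_cold = +Q / T_cold           # about +3.33 J/K: the gain elsewhere
dS_total = dS_hot + dS_cold     # about +0.83 J/K, greater than zero
```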
