How can I calculate entropy using different methods?

  • Thread starter weiss_tal
  • Start date
  • Tags
    Entropy
In summary, entropy is a measure of disorder or randomness in a system and is calculated from the number of possible arrangements or configurations a system can have. It is inversely related to the probability of any single microstate and, in the statistical definition S = k * ln(W), it can never be negative. Entropy has various applications in fields such as thermodynamics, information theory, and statistical mechanics.
  • #1
weiss_tal
Hello all,

I have a question regarding entropy which I'm sure you guys will have no problem with. I can't get it though... :)

In the literature there are several ways of calculating the entropy, and that really confuses me, for example in the canonical ensemble. One way is to define the multiplicity as a function of the populated states, i.e. g(N,n1,n2,n3,...) = N!/(n1!n2!n3!...), and then take log(g). The populations n1, n2, ... are of course constrained so that the total energy is fixed, and also N = n1 + n2 + n3 + ... (Stanford Uni. lectures)
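Just to make that first definition concrete, here is a minimal Python sketch (my own illustration, not from the lectures) that evaluates log(g) for a given set of occupation numbers:

    import math

    def log_multiplicity(occupations):
        # log g(N, n1, n2, ...) = log(N!) - sum_i log(n_i!)
        N = sum(occupations)
        return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in occupations)

    # Example: N = 6 particles with occupations n1 = 3, n2 = 2, n3 = 1
    print(log_multiplicity([3, 2, 1]))    # log(6!/(3!*2!*1!)) = log(60) ≈ 4.09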

Another way is to define the multiplicity g as a function of E_i (the possible energies of the system) and again take the log of it (Kittel).

Both ways define the entropy, and from both one can derive the temperature via dS/dE.

My problem with this is that they are not the same function. The first one, for a given n1, n2, ..., is bound to have some energy E_i, but there are other sets of n1, n2, ... which give the same energy E_i. Therefore g(E_i), which is used in the second method, is bigger than g(n1,n2,...) of the first method.
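To see the size of the difference, I put together a quick Python check on a made-up toy system (three levels with energies 0, 1, 2 in arbitrary units, N = 10 particles, total energy E = 8; the levels and numbers are invented just for illustration):

    import math
    from itertools import product

    def g(occ):
        # multinomial multiplicity N!/(n1! n2! ...) for one set of occupation numbers
        return math.factorial(sum(occ)) // math.prod(math.factorial(n) for n in occ)

    # Made-up toy system: 3 levels with energies 0, 1, 2; N = 10 particles; total energy E = 8
    levels = [0, 1, 2]
    N, E = 10, 8

    configs = [occ for occ in product(range(N + 1), repeat=len(levels))
               if sum(occ) == N and sum(n * e for n, e in zip(occ, levels)) == E]

    g_E = sum(g(occ) for occ in configs)      # second method: all occupation sets with energy E
    g_max = max(g(occ) for occ in configs)    # first method: the largest single occupation set

    print(g_E, g_max)                         # 6765 vs 3150
    print(math.log(g_E), math.log(g_max))     # ≈ 8.82 vs ≈ 8.06

So g(E) is indeed bigger than any single g(n1,n2,...), though after taking the log the two are already fairly close even for this tiny N.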

Which one is correct? What should I do? Or maybe the differences, after taking the log, are just negligible?

Thank you all very very very much, and forgive me for my awful English,

Tal.
 
  • #2


Dear Tal,

Thank you for your question regarding entropy and the different ways of calculating it. Entropy is a concept that can be defined in various ways, depending on the context and the system being studied. Both methods you mentioned are valid ways of calculating the entropy, but they may apply to different systems or situations.

In the first method, the multiplicity is defined as a function of the populated states, which is useful for systems with a fixed number of particles and a fixed total energy. This method is commonly used in statistical mechanics and is known as the Boltzmann entropy. It takes into account the different ways that the particles can be arranged in the system while still maintaining the same total energy. This method is often used in the canonical ensemble, as you mentioned.
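Written out, the entropy in this first picture is S = k_B * log g(N, n1, n2, ...) = k_B * [log N! - log n1! - log n2! - ...]. With Stirling's approximation this becomes S ≈ -k_B * N * sum_i (n_i/N) * log(n_i/N), and maximizing it subject to fixed N and fixed total energy is exactly the step that produces the familiar Boltzmann occupation numbers.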

In the second method, the multiplicity is defined as a function of the possible energies of the system. This method is closely related to the Gibbs entropy and is commonly used in thermodynamics and statistical mechanics. It takes into account all of the different ways that the particles can be distributed among the possible energy levels while still producing the same total energy. Because it depends only on the total energy, this is the form Kittel uses to define the entropy as a function of E.
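For reference, the Gibbs form of the entropy is S = -k_B * sum_i p_i * log(p_i), where p_i is the probability of microstate i. If all g(E) microstates with energy E are equally likely, then p_i = 1/g(E) and this reduces to S = k_B * log g(E), which is the energy-based multiplicity definition you quoted from Kittel.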

In both cases, taking the logarithm of the multiplicity is what defines the entropy. The multiplicity itself grows astronomically with system size, and, more importantly, the multiplicities of independent subsystems multiply (g_total = g_1 * g_2), so taking the logarithm makes the entropy additive, log(g_total) = log(g_1) + log(g_2), as a thermodynamic entropy should be.

In summary, both methods are valid ways of calculating entropy, but they apply to different systems and situations. It is important to use the appropriate method for the system you are studying. I hope this helps to clarify the differences between the two methods. If you have any further questions, please don't hesitate to ask.
 

Related to How can I calculate entropy using different methods?

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is commonly used in physics and thermodynamics to quantify the amount of energy in a system that is unavailable for work.

2. How is entropy calculated?

The formula for calculating entropy is S = k * ln(W), where S is entropy, k is the Boltzmann constant, and W is the number of microstates of a system. In simpler terms, entropy is calculated by determining the number of possible ways a system can be arranged or organized.
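As a simple illustration (assuming, purely for the sake of example, a system of N independent two-state spins so that W = 2^N), the formula can be evaluated directly in Python:

    import math

    k_B = 1.380649e-23          # Boltzmann constant in J/K

    # Illustrative assumption: N independent two-state spins, so W = 2**N microstates
    N = 100
    W = 2 ** N
    S = k_B * math.log(W)       # S = k * ln(W) = N * k_B * ln(2)
    print(S)                    # ≈ 9.57e-22 J/K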

3. What is the relationship between entropy and probability?

Entropy and the probability of any one particular microstate are inversely related. In an isolated system at equilibrium each of the W microstates is equally likely, so each has probability p = 1/W and S = k * ln(W) = -k * ln(p). As the probability of finding the system in any one particular arrangement decreases, there must be more possible arrangements available, and the entropy is correspondingly higher.
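For example, in a sequence of 10 fair coin tosses every particular outcome has probability p = (1/2)^10 = 1/1024, so there are W = 1024 equally likely arrangements and S = k * ln(1024) = -k * ln(1/1024) ≈ 6.93 k. The smaller the probability of each individual arrangement, the larger the entropy.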

4. Can entropy be negative?

No, entropy cannot be negative in the statistical definition S = k * ln(W): a system always has at least one accessible microstate, so W >= 1 and ln(W) >= 0, with S = 0 only when there is a single unique microstate. In addition, the second law of thermodynamics states that the entropy of a closed system will never decrease over time; it can only remain constant or increase.

5. How is entropy used in different fields of science?

Entropy has applications in a variety of fields, such as thermodynamics, information theory, and statistical mechanics. It is used to understand the behavior of physical systems, measure the amount of information in a system, and describe the distribution of energy among particles in a system.
