Calculating the entropy

  • Thread starter weiss_tal
Hello all,

I have a question regarding entropy which I'm sure you guys will have no problem with; I just can't get it. :)

In the literature there are several ways of calculating the entropy, and that really confuses me, for example in the canonical ensemble. One way is to define the multiplicity as a function of the populated states, i.e. g(N, n1, n2, n3, ...) = N!/(n1! n2! n3! ...), and then take log(g). The populations n1, n2, ... are of course constrained so that the total energy is fixed, and also N = n1 + n2 + n3 + ... (Stanford Uni. lectures).
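To make the first bookkeeping concrete, here is a minimal sketch of that counting (my own illustration, not from the lectures); the three level energies in eps and the occupation (3, 2, 1) are hypothetical choices, just to show both constraints.

```python
from math import factorial, log

def log_multiplicity(ns):
    """ln g(N, n1, n2, ...) = ln[ N! / (n1! n2! ...) ] for one occupation set."""
    N = sum(ns)
    return log(factorial(N)) - sum(log(factorial(n)) for n in ns)

# Hypothetical example: N = 6 particles spread over three levels as (3, 2, 1).
ns = (3, 2, 1)
eps = (0.0, 1.0, 2.0)                      # assumed single-particle energies
E = sum(n * e for n, e in zip(ns, eps))    # fixed total energy constraint
N = sum(ns)                                # fixed particle number constraint
print(N, E, log_multiplicity(ns))          # 6, 4.0, ln 60 = about 4.09 for this one configuration
```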

Another way is to define the multiplicity g as a function of E_i (a possible energy of the system) and again take the log of it (Kittel).
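This is how I read the second bookkeeping computationally (again my own sketch, with the same made-up levels): g(E) adds up the multinomial weight of every occupation set that shares the same N and the same total energy E.

```python
from itertools import product
from math import factorial

def g_of_E(N, eps, E, tol=1e-9):
    """Count every microstate with N particles on levels eps and total energy E."""
    total = 0
    for ns in product(range(N + 1), repeat=len(eps)):       # every occupation tuple
        if sum(ns) != N:
            continue                                         # enforce N = n1 + n2 + ...
        if abs(sum(n * e for n, e in zip(ns, eps)) - E) > tol:
            continue                                         # enforce the fixed total energy
        w = factorial(N)
        for n in ns:
            w //= factorial(n)                               # N!/(n1! n2! ...) for this tuple
        total += w
    return total

eps = (0.0, 1.0, 2.0)        # same assumed levels as above
print(g_of_E(6, eps, 4.0))   # 90, versus 60 for the single (3, 2, 1) configuration above
```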

Both ways define the entropy, and from both one can derive the temperature via 1/T = dS/dE.

My problem with this is that they are not the same function. The first one, for a given set n1, n2, ..., is bound to have some definite energy E_i, but there are more permutations of n1, n2, ... that give the same energy E_i. Therefore the g(E_i) of the second method is bigger than the g(N, n1, n2, ...) of the first method.

Which one is correct? What should I do? Or maybe the differences, once you take the log, are just negligible?
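As a rough check of that last guess, here is a comparison I tried on an assumed toy model (three equally spaced levels, total energy E = N): the entropy per particle from the single largest term, log of g(n1, n2, ...), against that from the full sum, log of g(E).

```python
from itertools import product
from math import factorial, log

def terms_at_E(N, eps, E, tol=1e-9):
    """Yield the multinomial weight of every occupation set with this N and E."""
    for ns in product(range(N + 1), repeat=len(eps)):
        if sum(ns) != N:
            continue
        if abs(sum(n * e for n, e in zip(ns, eps)) - E) > tol:
            continue
        w = factorial(N)
        for n in ns:
            w //= factorial(n)
        yield w

eps = (0.0, 1.0, 2.0)                                # assumed toy spectrum
for N in (10, 30, 60):
    ws = list(terms_at_E(N, eps, float(N)))          # all configurations with E = N
    print(N, log(max(ws)) / N, log(sum(ws)) / N)     # ln(largest term)/N vs ln g(E)/N
```

On this toy model the two per-particle values do move closer together as N grows, but I'm not sure whether that carries over in general, hence the question.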

Thank you all very, very much, and forgive me for my awful English,

Tal.
 