- #1
gemma786
Hi,
I want to know a concrete qualitative definition of entropy.
If we define it as a measure of randomness (disorder) in a system, then intuitively a system in a less probable microstate should have greater entropy. But according to statistical mechanics,
S = -k ∑ [Pi log Pi]
This seems to say that a system with lower probability has lower entropy, doesn't it?
And how is it possible to express the above equation as
S = -k log N?
Please help me out of this confusion.
Thanks.
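For what it's worth, the two formulas in the question can be compared numerically in the special case where all N microstates are equally likely (Pi = 1/N), which is the case the log N form usually refers to. A minimal sketch in Python, with the constant k set to 1 for simplicity:

```python
import math

def gibbs_entropy(probs, k=1.0):
    # Gibbs entropy S = -k * sum(p_i * log p_i); terms with p = 0 contribute nothing.
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# For N equally likely microstates, p_i = 1/N, and the sum collapses to k*log(N).
N = 8
uniform = [1.0 / N] * N
print(gibbs_entropy(uniform))  # equals math.log(8), i.e. +k log N
print(math.log(N))
```

Note that the reduced result comes out as +k log N (positive), since each log(1/N) term is negative and the leading minus sign cancels it.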