CraigH
On the atomic level, what is entropy? How can I visualise it?
Thanks
Naty1 said:Entropy can be viewed as a subset of information theory: when one goes up the other goes down. If a gas is uniformly distributed in a container, it's kind of 'uninteresting'... doesn't hold much information... one part is pretty much the same as all the others.
Believe it or not, information content is the same thing as Shannon entropy, so they both increase together. The idea that the more ordered state contains more information seems intuitive, but it's contrary to the formal meaning of information. One way to think of the "information" is the number of yes/no questions you would need to answer in order to specify the complete state of the system. If the particles are cloistered in one corner, that reduces the number of questions you need to ask to locate them all. The same goes for a series of binary digits: a series that repeats contains less information, because once you see the pattern, you don't need to ask any more questions about it, but a random series of n digits requires the maximum number, n, of questions to specify.
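To make the question-counting picture concrete, here is a minimal Python sketch. It is only an illustration (the helper name and the choice of compressed size as a proxy are mine, not from the posts above): a string you can pin down with few yes/no questions has a short description and compresses well, while a random string barely compresses at all.

```python
import random
import zlib

def approx_information_bits(bits: str) -> int:
    # Compressed size in bits, used here as a rough stand-in for
    # information content (number of yes/no questions needed).
    return 8 * len(zlib.compress(bits.encode("ascii")))

n = 10_000
repeating = "01" * (n // 2)                                   # obvious pattern
random_bits = "".join(random.choice("01") for _ in range(n))  # no pattern

print(approx_information_bits(repeating))    # small: once you see the pattern, no more questions
print(approx_information_bits(random_bits))  # roughly n bits: each digit is a fresh surprise
```

Running it shows the repeating string shrinking to a handful of bytes while the random string stays near one bit per digit, which is exactly the "maximum number, n, of questions" point.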
Studiot said:Do you find the concept of randomness any clearer than that of entropy?
Randomness is also a much misunderstood concept.
And how about the concept of information or the nature of 'states'?
I did like atyy's summary. It fitted a great deal into a few words.
I would seriously advise getting a good grasp of the thermodynamic version of entropy first, as it is much easier to understand. This corresponds to atyy's macroscopic or coarse-grained entropy.
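To connect that coarse-grained picture to the microscopic one, here is a small sketch. The two-halves toy model is my own illustration, not anything stated in the thread: count the microstates compatible with a macrostate and take Boltzmann's S = k_B ln(Omega). The "all particles in one corner" macrostate has a single microstate and zero entropy, while the even split has overwhelmingly many.

```python
from math import comb, log

# Toy coarse-graining: N distinguishable particles, each in the left or
# right half of a box. The macrostate "k particles on the left" is
# realised by Omega = C(N, k) microstates; in units of k_B, S = ln(Omega).
N = 100
for k in (0, 25, 50):
    omega = comb(N, k)
    print(f"k={k:3d}  Omega={omega:.3e}  S/k_B={log(omega):.2f}")
```

With N = 100, the even split (k = 50) is realised by about 1e29 microstates, so the overwhelmingly likely macrostate is also the highest-entropy one.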