CraigH
On the atomic level, what is entropy? how can I visualise it?
Thanks
Naty1 said:
Entropy can be viewed as a subset of information theory: when one goes up the other goes down. If a gas is uniformly distributed in a container, it's kind of 'uninteresting'... doesn't hold much information... one part is pretty much the same as all the others.

Believe it or not, information content is the same thing as Shannon entropy, so they both increase together. The idea that the more ordered state contains more information seems intuitive, but it's contrary to the formal meaning of information. One way to think of the "information" is the number of yes/no questions you would need to answer in order to specify the complete state of the system. If the particles are cloistered in one corner, that reduces the number of questions you need to ask to locate them all. The same goes for a series of binary digits: a repeating series contains less information, because once you see the pattern you don't need to ask any more questions about it, but a random series of n digits requires the maximum number of questions, n, to specify.
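The "yes/no questions" picture above can be made concrete with a short sketch (my own illustration, not from the thread; the function names are hypothetical). It computes the empirical Shannon entropy of a bit string, counted over pairs of bits so that the repeating pattern's structure shows up:

```python
import math
import random
from collections import Counter

def shannon_entropy_bits(seq):
    """Empirical Shannon entropy in bits per symbol: H = -sum p_i * log2(p_i)."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def block_entropy(seq, k):
    """Entropy per non-overlapping block of k symbols."""
    blocks = [seq[i:i + k] for i in range(0, len(seq) - k + 1, k)]
    return shannon_entropy_bits(blocks)

repeating = "01" * 500                                      # pure pattern
random_bits = "".join(random.choice("01") for _ in range(1000))  # no pattern

# The repeating string has only one possible pair ("01"), so ~0 bits per
# pair: once you see the pattern, no more questions are needed. The random
# string uses all four pairs roughly equally, so ~2 bits per pair.
print(block_entropy(repeating, 2))
print(block_entropy(random_bits, 2))
```

This matches the point in the reply: the "ordered" string needs almost no yes/no questions to specify, while the random one needs the maximum.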
Studiot said:Do you find the concept of randomness any clearer than that of entropy?
Randomness is also a much misunderstood concept.
And how about the concept of information or the nature of 'states'?
I did like atyy's summary. It fitted a great deal into few words.
I would seriously advise getting a good hold of the thermodynamic version of entropy first as it is much easier to understand. This corresponds to atyy's macroscopic or coarse grained entropy.
CraigH said:On the atomic level, what is entropy? how can I visualise it?
Thanks
Entropy is a measure of the disorder or randomness of a system. It is a fundamental concept in thermodynamics and statistical mechanics, and it is used to describe the tendency of a system to move towards a state of maximum disorder.
The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that such systems tend to become more disordered and less organized, eventually reaching a state of maximum entropy at equilibrium.
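To visualise this at the atomic level, here is a minimal sketch (my own illustration, not from the thread) of Boltzmann's S = k_B ln W for N distinguishable particles split between the two halves of a box. The macrostate "all particles in one half" has only one microstate, so minimum entropy; the even split has by far the most microstates, so maximum entropy, which is why the gas spreads out:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_total, n_left):
    """Boltzmann entropy S = k_B ln W, where W = C(n_total, n_left)
    is the number of microstates with n_left particles in the left half."""
    w = math.comb(n_total, n_left)
    return K_B * math.log(w)

N = 100
print(entropy(N, 0))   # all particles in one half: W = 1, so S = 0
print(entropy(N, 50))  # even split: W is enormous, S is maximal
```

Counting microstates this way is exactly atyy's "coarse grained" picture mentioned above: many microscopic arrangements correspond to the one macrostate we observe.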
Examples of entropy in everyday life include the melting of ice cubes, the rusting of metal, and the mixing of hot and cold fluids. In these cases, energy is dispersed and the system becomes more disordered, increasing its entropy.
The formula for calculating entropy change is ΔS = Q_rev/T, where ΔS is the change in entropy, Q_rev is the heat transferred reversibly, and T is the absolute temperature in kelvin at which the transfer occurs. This formula is based on the relationship between entropy and energy dispersal.
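As a worked example of that formula, applied to the melting-ice case mentioned above (my own numbers: the standard latent heat of fusion of ice, about 334 J/g, at the melting point 273.15 K):

```python
L_FUSION = 334.0  # latent heat of fusion of ice, J per gram
T_MELT = 273.15   # melting point of ice, K

def delta_s_melting(mass_g):
    """Entropy change when mass_g grams of ice melt reversibly at 0 degrees C,
    using delta_S = Q_rev / T with Q_rev = mass * latent heat."""
    q_rev = mass_g * L_FUSION  # heat absorbed, J
    return q_rev / T_MELT      # entropy change, J/K

print(delta_s_melting(10.0))  # a 10 g ice cube gains about 12.2 J/K on melting
```

The division works out simply here because melting happens at a single constant temperature; for a process where T changes, you would integrate dQ_rev/T instead.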
While local decreases in entropy can occur, the overall trend in an isolated system is always towards an increase in entropy, as the second law requires. Energy input can decrease the entropy of an open system, as in the formation of crystals or the organization of living organisms, but only at the cost of a larger entropy increase in the surroundings.