How can we best describe entropy other than with disorder?

1. Jun 14, 2017

Grajmanu

Entropy is often described as a measure of disorder in a system, or as the amount of energy in a system that is unavailable to do work. What other perspectives on entropy are there?

2. Jun 15, 2017

Staff: Mentor

Statistical physics: entropy is the natural logarithm of the multiplicity of the macrostate.
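A minimal sketch of that definition (not from the thread) for a toy system of N two-state particles such as spins, in units where the Boltzmann constant is 1. The macrostate "n of N spins up" has multiplicity C(N, n), and the entropy is its natural logarithm:

```python
from math import comb, log

def entropy(N, n):
    """S = ln(multiplicity) for the macrostate 'n of N spins up' (k_B = 1)."""
    return log(comb(N, n))

# The evenly mixed macrostate has far more consistent microstates,
# and therefore higher entropy, than a nearly ordered one:
print(entropy(100, 50))  # maximal entropy for N = 100
print(entropy(100, 1))   # near-minimal entropy
```

For N = 4, n = 2 there are C(4, 2) = 6 microstates, so the entropy is ln 6.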

3. Jun 15, 2017

LaplacianHarmonic

Simply state which forms of energy are disordered first, before looking up the definition of work energy in an engineering physics textbook.

4. Jun 15, 2017

Mgcini Keith Phuthi

I find the statistical physics interpretation more satisfying. Entropy can be considered as the amount of ignorance we have about a system. For example, we can know the system's macrostate (i.e. its temperature, internal energy etc.) but we do not know what microstate it is in. Microstate here means, a certain distribution of either quantum states like energy levels or classically you can think of microstates being described by the individual positions and momenta of each particle. The idea is that while we can always be given a macrostate, there are often many(very many!) more microstates that are consistent with a macrostate. The entropy is then the logarithm of how many of these microstates are consistent with the macrostate (aka multiplicity) and these microstates change with time (Liouville's theorem governs these changes). So the less we know about the positions and momenta of each particle, the more entropy there is. If we fix one particle in a position, we have less entropy as we have reduced the number of available microstates. Entropy is more a product of the statistical intepretation.