# Entropy. What is it?

by Homesick345
Tags: entropy
 P: 30 Why do we constantly hear about it? Can it be explained simply?
 PF Patron P: 10,397 This is probably better off in the Classical Physics forum.
P: 743


Entropy is not a property of a system in a given quantum state, but rather is a property of a system whose exact quantum state is unknown. It is a measure of how many quantum states look similar to the system at hand (e.g., how many quantum states are consistent with certain observational or theoretical constraints).

From a more experimental point of view: in the real world, you never know everything going on in an experiment--you can only measure some finite number of general properties. As time goes on after you stop measuring, things get mixed up within the system and with the environment, and you lose details about the experiment, so the entropy of the system goes up. As things mix, the system approaches what is called thermal equilibrium, where the system is describable by a single temperature.

Usually, it is the heat energy that is mixed up first (hot things cool off, cold things warm up), so the field of study is known as thermodynamics. But if you wait an infinite amount of time, eventually everything decays, all substances break down and reform out of constituent parts, repeatedly, until every property of the system reaches thermal equilibrium (where thermal refers not only to particle motion but also to particle composition and other properties) and you get a frothy mix of "stuff" in constant remixing. This is the heat death of the universe.
 P: 65 if you had children, you would know...
P: 30
 Quote by sahmgeek if you had children, you would know...
Oh I got 2 little boys.

So ENTROPY means "things getting messy in every little conceivable detail"...???

Or....."No matter where you hide them, your stuff will be found by "entropy" & will be messed with..?"
 PF Patron P: 2,895 The concept behind entropy is simple-- you lump many different states of a system into groups that you don't consider different, like a "messy room." There are many more states of a room that get called "messy" than get called "neat," so we say a messy room has a "higher entropy." If the room state is more or less random, it is vastly more likely to be in a state we'd call "messy", and that's why we have the second law of thermodynamics, it's just probability. What's interesting is that a messy room actually contains more information, because you need explicit instructions about where every single object is. But if you have an alphabetized file cabinet, you get the idea quickly, and can find each thing without needing specific information each time.
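The counting argument above can be put in numbers. A toy model (the numbers are my own illustration, not from the thread): a "room" of 10 objects, each of which can sit in any of 4 locations, where only a single arrangement counts as "neat".

```python
import math

# Toy model (illustrative numbers): 10 objects, each in one of 4 places.
objects, locations = 10, 4
total_states = locations ** objects      # all microstates: 4**10 = 1,048,576

neat_states = 1                          # say only one arrangement is "neat"
messy_states = total_states - neat_states

# Boltzmann-style entropy S = log W, in units where k = 1
S_neat = math.log(neat_states)           # 0.0
S_messy = math.log(messy_states)         # ~13.86

print(f"messy macrostate: {messy_states} states, S = {S_messy:.2f}")
print(f"neat macrostate:  {neat_states} state,  S = {S_neat:.2f}")
```

A random room state is overwhelmingly likely to land in the huge "messy" group, which is exactly the probabilistic reading of the second law described above.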
 PF Patron Sci Advisor P: 8,903 All I know is if I don't clean my car out for a week, it takes a week to clean it back up - and then I have to start all over again.
PF Patron
P: 1,942
 Quote by Ken G The concept behind entropy is simple-- you lump many different states of a system into groups that you don't consider different, like a "messy room." There are many more states of a room that get called "messy" than get called "neat," so we say a messy room has a "higher entropy." If the room state is more or less random, it is vastly more likely to be in a state we'd call "messy", and that's why we have the second law of thermodynamics, it's just probability. What's interesting is that a messy room actually contains more information, because you need explicit instructions about where every single object is. But if you have an alphabetized file cabinet, you get the idea quickly, and can find each thing without needing specific information each time.
There is no simple relation between entropy and order, and the popular informal account that you describe is (as much of popular science) highly inaccurate.

For example, consider English text. The entropy is a characteristic property of the written language, not of a single string. Therefore, each string of N characters of an English text has [if this word can be used here at all meaningfully] the same entropy (namely that of a random string of N characters of an English text), no matter how ordered or unordered it is.

In general, entropy is a characteristic property of a probability distribution, not of any of its realizations.

In information theory, entropy is a measure of lack of information. It is the expected number of decisions that would be needed to pin down a particular realization of something, when the probability distribution of all possible realizations is given.
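That "expected number of decisions" is the Shannon entropy of the distribution. A minimal sketch (the function name `shannon_entropy` is my own):

```python
import math

def shannon_entropy(probs):
    """Expected number of bits (binary yes/no decisions) needed to
    pin down one realization drawn from the distribution `probs`."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin takes exactly one decision per toss:
fair = shannon_entropy([0.5, 0.5])       # 1.0 bit
# A heavily biased coin takes fewer decisions on average:
biased = shannon_entropy([0.9, 0.1])     # ~0.47 bits
print(fair, biased)
```

Note that the entropy is computed from the probabilities alone, never from any particular outcome, which is the point being made above.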

In quantum statistical mechanics (and hence in thermodynamics, which is derived from statistical mechanics), entropy is the expected number of decisions that would be needed to pin down a particular energy eigenstate, given the distribution of energy in the given mixed equilibrium state. This has nothing to do with the problem of whether the given state is or isn't known. See also http://www.physicsforums.com/showthread.php?t=465492
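As a hedged sketch of that statement: for a toy system in a canonical (Gibbs) equilibrium state, the entropy of the distribution over energy eigenstates can be computed directly. The energies and temperature below are made-up values for illustration.

```python
import math

# Hypothetical four-level system at temperature T (units with k_B = 1)
energies = [0.0, 1.0, 2.0, 3.0]
T = 1.0

weights = [math.exp(-E / T) for E in energies]
Z = sum(weights)                          # partition function
probs = [w / Z for w in weights]          # Gibbs distribution p_i = exp(-E_i/T)/Z

# Entropy of the distribution: expected decisions to pin down an eigenstate
S = -sum(p * math.log(p) for p in probs)
print(f"S = {S:.4f} nats")
```

The entropy depends only on the distribution (the mixed state), not on which eigenstate the system "actually" occupies, in line with the point above.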

The entropy of the ensemble ''casting a fair die N times'' is N log 6, whereas orderliness would be assigned to some particular sequences (for example, those sorted by number of points), not to the ensemble.
To quantify orderliness, one needs a concept different from entropy, the Kolmogorov complexity:
http://en.wikipedia.org/wiki/Chaitin_Complexity
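A quick check of the die ensemble's entropy, per cast and for N independent casts (a small sketch of my own):

```python
import math

def dist_entropy(probs):
    """Entropy (in nats) of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

per_cast = dist_entropy([1 / 6] * 6)      # log 6 per fair cast
N = 10
total = N * per_cast                      # independent casts: entropies add
print(per_cast, total)
```

The entropy here belongs to the ensemble of all 6**N equally likely sequences; any particular sequence, however "ordered" it looks, has no entropy of its own.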
 P: 6 the concept of entropy is just the opposite of gravity
PF Patron
P: 10,397
 Quote by urmother the concept of entropy is just the opposite of gravity
Either elaborate or provide a reference for this please.
 P: 73 According to them, apart from the usual thermal entropy as interpreted by Boltzmann, we have the conditional entropy of the cosmological event horizon and the entropy associated with inhomogeneity of gravitational fields (the Weyl curvature hypothesis), bounded by the generalized statement that entropies in the universe cannot decrease. http://www.mdpi.com/1099-4300/14/12/2456

Some say "entropy is an embedded characteristic of energy which creates a dimensional conservation domain for its energy type. Intrinsic motion: light creating space: motion creating time: intrinsic motion gravitation as the conversion force between the two primordial entropy drives and domains, creating spacetime." http://www.johnagowan.org/tetra7.html

...Or simply a breaking of symmetry: http://www.acadeuro.org/fileadmin/us...nt/Mainzer.pdf

In more technical terms, entropy is a specific value that measures how much energy is released in a system when it settles into its lowest potential energy. Entropy assesses the amount of disorder, understood as a change in energy, from an earlier point to a later point in time. This must happen in a "closed" system, where no energy leaks in or out. Theoretically possible but hard to achieve. It seems that anything animated is, in general, bound by entropy.

*Please do correct me if I got the wrong idea.
P: 6
 Quote by Drakkith Either elaborate or provide a reference for this please.

A recurring theme throughout the life of the universe is the continual struggle between the force of gravity and the tendency for physical systems to evolve toward more disorganized conditions. The amount of disorder in a physical system is measured by its entropy content. In the broadest sense, gravity tends to pull things together and thereby organizes physical structures. Entropy production works in the opposite direction and acts to make physical systems more disorganized and spread out. The interplay between these two competing tendencies provides much of the drama in astrophysics.

Life, gravity and the second law of thermodynamics
PF Patron
P: 10,397
 Quote by urmother that was my own opinion ,since you asked for a link i googled and find this A recurring theme throughout the life of the universe is the continual struggle between the force of gravity and the tendency for physical systems to evolve toward more disorganized conditions. The amount of disorder in a physical system is measured by its entropy content. In the broadest sense, gravity tends to pull things together and thereby organizes physical structures. Entropy production works in the opposite direction and acts to make physical systems more disorganized and spread out. The interplay between these two competing tendencies provides much of the drama in astrophysics Life, gravity and the second law of thermodynamics

I get what they are saying, I just don't feel it's very accurate. I think saying gravity is the opposite of entropy is comparing two completely unlike things: one is a force, and the other is a measure of disorder, of energy dispersal in a system, or whatever definition of entropy you are using. I may be splitting hairs here, but "entropy production" is the result of the interaction of light and matter through the fundamental forces, while entropy itself is just a measurement of the disorder. And while gravity may tend to organize things in the short term, it ends up causing entropy to increase in the long run. I think, at least. My knowledge of entropy isn't the best.

I hope that makes sense. If my understanding of entropy is incorrect someone let me know.
P: 166
 Quote by Drakkith I get what they are saying, I just don't feel it's very accurate. I think saying gravity is the opposite of entropy is comparing two completely unlike things. One is a force and the other is a measure of disorder, energy dispersal in a system, or whatever definition you are using for entropy. I may be splitting hairs here, but "entropy production" is the result of the interaction of light and matter through the fundamental forces, while entropy itself is just a measurement of the disorder itself, and while gravity may tend to try to organize things in the short term, it ends up causing entropy to increase in the long run. I think at least. My knowledge of entropy isn't the best. I hope that makes sense. If my understanding of entropy is incorrect someone let me know.
I don't think it's a bad way to look at things. If you look at it by your definition, a planet in the emptiness of space is a lot more ordered than all those atoms scattered across that same space. Gravity is the force giving order to that system. Eventually entropy will win out even on this scale, it just takes a long time.
PF Patron
P: 10,397
 Quote by justsomeguy I don't think it's a bad way to look at things. If you look at it by your definition, a planet in the emptiness of space is a lot more ordered than all those atoms scattered across that same space. Gravity is the force giving order to that system. Eventually entropy will win out even on this scale, it just takes a long time.
Hmmm. I thought that in certain circumstances all those atoms scattered about have LESS entropy than the planet.
 PF Patron Sci Advisor P: 2,917 To me it's just navigation markers through the steam tables. Though it is curious: if, as my old thermo book said, the entropy of the universe is increasing, what then is decreasing in exchange? The amount of stuff remaining unknown? old jim
