Entropy: Definition, Explanation & Meaning

In summary, entropy is a measure of how many quantum states look similar to the system at hand. It is a property of a system whose exact quantum state is unknown, and it is related to heat energy and, less intuitively, to gravitational clumping.
  • #1
Homesick345
Why do we constantly hear about it? Can it be explained simply?
 
  • #3
This is probably better off in the Classical Physics forum.
 
  • #4
Entropy is not a property of a system in a given quantum state, but rather a property of a system whose exact quantum state is unknown. It is a measure of how many quantum states look similar to the system at hand (e.g. how many quantum states are consistent with certain observational or theoretical constraints).

From a more experimental point of view: in the real world, you never know everything going on in an experiment; you can only measure some finite number of general properties. As time goes on after you stop measuring, things get mixed up within the system and with the environment, and you lose details about the experiment, so the entropy of the system goes up. As things mix up, the system approaches what is called thermal equilibrium, where the system is describable by a single temperature.

Usually, it is the heat energy that gets mixed up first (hot things cool off, cold things warm up), so the field of study is known as thermodynamics. But if you wait an infinite amount of time, eventually everything decays, all substances break down and re-form out of constituent parts, repeatedly, until every property of the system reaches thermal equilibrium (where thermal refers not only to particle motion but also to particle composition and other properties) and you get a frothy mix of "stuff" in constant remixing. This is the heat death of the universe.
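
To make the counting concrete, here is a minimal Python sketch (a toy model of my own, not anything from the posts above): treat 100 coins as stand-ins for two-state particles, and take the Boltzmann entropy S = ln(Omega), with k_B set to 1, where Omega is the number of microstates consistent with a given macrostate, here the head count:

Code:
import math

def log_multiplicity(n, k):
    # ln(Omega), where Omega = C(n, k) is the number of distinct microstates
    # (orderings of heads and tails) consistent with the macrostate "k heads".
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

n = 100
for heads in (0, 10, 25, 50):
    s = log_multiplicity(n, heads)  # S = k_B ln(Omega), with k_B = 1
    print(f"{heads:3d} heads of {n}: S = {s:6.1f}")

The 50/50 macrostate dwarfs the others (about 10^29 microstates), so a randomly chosen microstate almost certainly looks like equilibrium; that is the sense in which mixed-up states win.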
 
  • #5
if you had children, you would know... :smile:
 
  • #6
sahmgeek said:
if you had children, you would know... :smile:

Oh I got 2 little boys.

So ENTROPY means "things getting messy in every little conceivable detail"...?

Or... "No matter where you hide them, your stuff will be found by 'entropy' and will be messed with"...?
 
  • #7
The concept behind entropy is simple: you lump many different states of a system into groups that you don't consider different, like a "messy room." There are many more states of a room that get called "messy" than get called "neat," so we say a messy room has a "higher entropy." If the room state is more or less random, it is vastly more likely to be in a state we'd call "messy," and that's why we have the second law of thermodynamics: it's just probability.

What's interesting is that a messy room actually contains more information, because you need explicit instructions about where every single object is. But if you have an alphabetized file cabinet, you get the idea quickly, and can find each thing without needing specific information each time.
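
A toy version of that lumping, in Python (the ten objects and the "neat" rule are an invented example of mine): call a room neat only when every object sits in its own designated spot, then count and sample:

Code:
import math, random

n = 10  # ten objects, ten designated spots
# "Neat" = every object in its own spot: exactly one arrangement.
# "Messy" = any other arrangement: n! - 1 of them.
print("neat arrangements:  1")
print(f"messy arrangements: {math.factorial(n) - 1:,}")

# Shuffle the room at random many times; see how often it comes out neat.
trials = 1_000_000
neat_hits = sum(
    random.sample(range(n), n) == list(range(n)) for _ in range(trials)
)
print(f"shuffles that landed neat: {neat_hits} of {trials:,}")

With ten objects, the odds of a random arrangement being neat are 1 in 3,628,800. Lump the states that way, and the second law is just this counting.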
 
  • #8
Here is a basic concept of entropy explained:

http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html

which sort of goes along with this link

http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop2.html#c1

That concept of what entropy is will work for most things in chemistry and thermodynamics: an ordered bunch of gas molecules all squashed up in the corner of the room will naturally expand to fill the room, thus increasing the overall entropy. There are more configurations involving the whole room than just a small part of it, so the spread-out version has more entropy.
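
For the ideal-gas version there is a standard formula: free expansion from volume V1 to V2 changes the entropy by Delta S = N k_B ln(V2/V1). A quick back-of-envelope in Python (one mole, with the room fractions chosen arbitrarily for illustration):

Code:
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.02214076e23    # one mole of gas molecules

V1, V2 = 0.01, 1.0   # squashed into 1% of the room, then filling all of it
dS = N * k_B * math.log(V2 / V1)  # Delta S = N k_B ln(V2 / V1)
print(f"Delta S = {dS:.1f} J/K")  # about 38 J/K, positive whenever V2 > V1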

But one thing that is not so intuitive is how gravity affects entropy. Gravity causes things (even gas) to naturally clump together. Does gravity cause entropy to naturally, spontaneously decrease? Does gravity violate the second law of thermodynamics? No! And no! Gravity works backwards in this respect: clumping under gravity actually increases entropy. I've always found it awkward trying to explain why, though, without pulling out equations and invoking negative energy. Until tonight, that is. I just stumbled upon this a few minutes ago, which inspired me to reply to this thread (the webpage has equations in it, but even if you ignore them the general concept is presented nicely without really needing the equations):

http://www.mathpages.com/home/kmath573/kmath573.htm

On a final note (after reading the above link), one might argue about dark matter. Dark matter particles pass right through each other and everything else. One might then conclude that entropy is not increased by dark matter collapse and galaxy cluster formation. The saving concept is that dark matter density fluctuations produce gravitational waves. Because of that, the dark matter density will settle down into a relatively clumpy equilibrium (not as clumpy as normal matter, but more like halos) as energy is released in the form of gravitational waves. This process is called virialization. So entropy increases from that too.

I love the quote at the beginning of the above link:

The second law of thermodynamics holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations - then so much the worse for Maxwell's equations. If it is found to be contradicted by observation, well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.
Arthur Eddington​
 
  • #9
All I know is if I don't clean my car out for a week, it takes a week to clean it back up, and then I have to start all over again.
 
  • #10
Ken G said:
The concept behind entropy is simple: you lump many different states of a system into groups that you don't consider different, like a "messy room." There are many more states of a room that get called "messy" than get called "neat," so we say a messy room has a "higher entropy." If the room state is more or less random, it is vastly more likely to be in a state we'd call "messy," and that's why we have the second law of thermodynamics: it's just probability.

What's interesting is that a messy room actually contains more information, because you need explicit instructions about where every single object is. But if you have an alphabetized file cabinet, you get the idea quickly, and can find each thing without needing specific information each time.

There is no simple relation between entropy and order, and the popular informal account that you describe is (like much of popular science) highly inaccurate.

For example, consider English text. The entropy is a characteristic property of the written language, not of a single string. Therefore, each string of N characters of an English text has [if this word can be used here at all meaningfully] the same entropy (namely that of a random string of N characters of an English text), no matter how ordered or unordered it is.

In general, entropy is a characteristic property of a probability distribution, not of any of its realizations.

In information theory, entropy is a measure of lack of information. It is the expected number of decisions that would be needed to pin down a particular realization of something, when the probability distribution of all possible realizations is given.

In quantum statistical mechanics (and hence in thermodynamics, which is derived from statistical mechanics), entropy is the expected number of decisions that would be needed to pin down a particular energy eigenstate, given the distribution of energy in the given mixed equilibrium state. This has nothing to do with the problem of whether the given state is or isn't known. See also https://www.physicsforums.com/showthread.php?t=465492

The entropy of the ensemble "casting a die N times" is N log 6, whereas orderliness would be assigned to some particular sequences (for example, those sorted by number of points), not to the ensemble.
To quantify orderliness, one needs a concept different from entropy: the Kolmogorov complexity.
http://en.wikipedia.org/wiki/Chaitin_Complexity
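
To see the "property of a distribution" point in code, here is a short Python sketch (my own illustration) of the Shannon entropy H = -sum p log p, here in bits; note that it takes a distribution as input, never a particular sequence of outcomes:

Code:
import math

def shannon_entropy(probs):
    # H = -sum p*log2(p), in bits: a property of the distribution itself,
    # not of any particular sequence drawn from it.
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = [1/6] * 6
print(f"one cast of a fair die:   H = log2(6) = {shannon_entropy(fair):.3f} bits")
print(f"ten independent casts:    H = {10 * shannon_entropy(fair):.2f} bits")

# A loaded die is more predictable, so its distribution carries less entropy:
loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
print(f"one cast of a loaded die: H = {shannon_entropy(loaded):.3f} bits")

Any particular run of ten casts, ordered-looking or not, is just one realization; the entropy belongs to the ensemble.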
 
Last edited:
  • #11
the concept of entropy is just the opposite of gravity
 
  • #12
urmother said:
the concept of entropy is just the opposite of gravity

Either elaborate or provide a reference for this please.
 
  • #13
According to the paper below, apart from the usual thermal entropy as interpreted by Boltzmann, we also have the conditional entropy of the cosmological event horizon and the entropy associated with the inhomogeneity of gravitational fields (the Weyl curvature hypothesis), all subject to the generalized statement that the entropies in the universe cannot decrease.

http://www.mdpi.com/1099-4300/14/12/2456

Some say "entropy is an embedded characteristic of energy which creates a dimensional conservation domain for its energy type. Intrinsic motion: light creating space: motion creating time: intrinsic motion gravitation as the conversion force between the two primordial entropy drives and domains, creating spacetime."

http://www.johnagowan.org/tetra7.html

...or simply a breaking of symmetry...

http://www.acadeuro.org/fileadmin/user_upload/publications/ER_Symmetry_supplement/Mainzer.pdf

In more technical terms, thermodynamic entropy tracks the heat exchanged per unit temperature: when heat Q flows into a system at temperature T, the entropy changes by roughly Q/T (the Clausius definition), so the entropy bookkeeping follows the energy as a system settles toward equilibrium. The rule that total entropy cannot decrease applies to an "isolated" system, where no energy leaks in or out; theoretically definable, but hard to achieve in practice.
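
As a worked instance of that Q/T bookkeeping (numbers picked only for illustration):

Code:
# Heat flowing from a hot body to a cold one: second-law bookkeeping via dS = Q/T.
Q = 1000.0                     # joules transferred
T_hot, T_cold = 400.0, 300.0   # kelvin

dS_hot = -Q / T_hot    # the hot body loses entropy...
dS_cold = Q / T_cold   # ...but the cold body gains more, since T_cold < T_hot
dS_total = dS_hot + dS_cold
print(f"dS_hot = {dS_hot:.2f} J/K, dS_cold = {dS_cold:+.2f} J/K")
print(f"total:  {dS_total:+.2f} J/K (positive, as the second law requires)")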

It seems that anything animated is, in general, bound by entropy.

*Please do correct me if I got the wrong idea.
 
Last edited by a moderator:
  • #14
Drakkith said:
Either elaborate or provide a reference for this please.


That was my own opinion. Since you asked for a link, I googled and found this:


A recurring theme throughout the life of the universe is the continual struggle between the force of gravity and the tendency for physical systems to evolve toward more disorganized conditions. The amount of disorder in a physical system is measured by its entropy content. In the broadest sense, gravity tends to pull things together and thereby organizes physical structures. Entropy production works in the opposite direction and acts to make physical systems more disorganized and spread out. The interplay between these two competing tendencies provides much of the drama in astrophysics.


Life, gravity and the second law of thermodynamics
 
  • #15
urmother said:
That was my own opinion. Since you asked for a link, I googled and found this:


A recurring theme throughout the life of the universe is the continual struggle between the force of gravity and the tendency for physical systems to evolve toward more disorganized conditions. The amount of disorder in a physical system is measured by its entropy content. In the broadest sense, gravity tends to pull things together and thereby organizes physical structures. Entropy production works in the opposite direction and acts to make physical systems more disorganized and spread out. The interplay between these two competing tendencies provides much of the drama in astrophysics.


Life, gravity and the second law of thermodynamics
I get what they are saying, I just don't feel it's very accurate. Saying gravity is the opposite of entropy is comparing two completely unlike things: one is a force, and the other is a measure of disorder, of energy dispersal in a system, or of whatever definition you are using for entropy. I may be splitting hairs here, but "entropy production" is the result of light and matter interacting through the fundamental forces, while entropy itself is just the measure of that disorder. And while gravity may tend to organize things in the short term, it ends up causing entropy to increase in the long run. I think, at least; my knowledge of entropy isn't the best.

I hope that makes sense. If my understanding of entropy is incorrect someone let me know.
 
  • #16
Drakkith said:
I get what they are saying, I just don't feel it's very accurate. Saying gravity is the opposite of entropy is comparing two completely unlike things: one is a force, and the other is a measure of disorder, of energy dispersal in a system, or of whatever definition you are using for entropy. I may be splitting hairs here, but "entropy production" is the result of light and matter interacting through the fundamental forces, while entropy itself is just the measure of that disorder. And while gravity may tend to organize things in the short term, it ends up causing entropy to increase in the long run. I think, at least; my knowledge of entropy isn't the best.

I hope that makes sense. If my understanding of entropy is incorrect someone let me know.

I don't think it's a bad way to look at things. If you look at it by your definition, a planet in the emptiness of space is a lot more ordered than all those atoms scattered across that same space. Gravity is the force giving order to that system. Eventually entropy will win out even on this scale; it just takes a long time.
 
  • #17
justsomeguy said:
I don't think it's a bad way to look at things. If you look at it by your definition, a planet in the emptiness of space is a lot more ordered than all those atoms scattered across that same space. Gravity is the force giving order to that system. Eventually entropy will win out even on this scale; it just takes a long time.

Hmmm. I thought that in certain circumstances all those atoms scattered about have LESS entropy than the planet.
 
  • #18
To me it's just navigation markers through the steam tables.

Though it is curious: if, as my old thermo book said, the entropy of the universe is increasing, what then is decreasing in exchange? The amount of stuff remaining unknown?

old jim
 
  • #19
In Penrose's book The Road to Reality, he sort of asserts without proof that when gravity pulls things together, this increases the entropy. It seems that if this is to be the case, gravity must somehow increase the degrees of freedom of the system, but this wasn't explained.
 
  • #20
In Penrose's book The Road to Reality

Page reference please?

In quantum statistical mechanics (and hence in thermodynamics, which is derived from statistical mechanics),

I think this claim will not stand up to detailed examination.

There is not complete overlap between the domains of applicability of thermodynamics and statistical mechanics; each deals with areas not covered by the other.
 
  • #21
My understanding is that as gravity pulls disorganized masses together into objects like stars and planets, the entropy of that particular system decreases, but that this is offset by the rise in entropy over a wider scale / bigger system. As with most of these things, I suppose it depends on how wide you want to cast your net when defining the boundaries of the system.

A house of cards on its own is a low entropy arrangement, but the entropy of the system has increased overall if you include the energy used by the person building it.
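
The same net-casting works on astronomical scales. A standard back-of-envelope (with round numbers of my own choosing): Earth absorbs sunlight whose entropy content is roughly power/T at the solar surface temperature, and re-radiates the same power as infrared at roughly its own temperature, so it exports far more entropy than it imports:

Code:
P = 1.2e17        # watts of solar power absorbed (and re-emitted) by Earth, roughly
T_sun = 5800.0    # K, effective temperature of the incoming sunlight
T_earth = 255.0   # K, effective temperature of the outgoing infrared

S_in = P / T_sun      # entropy flux arriving, W/K
S_out = P / T_earth   # entropy flux leaving, W/K
print(f"entropy in:  {S_in:.1e} W/K")
print(f"entropy out: {S_out:.1e} W/K, about {S_out / S_in:.0f}x more")

That surplus export is what pays for local order (weather, life, houses of cards) without any violation of the second law.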
 
  • #22
Drakkith said:
Either elaborate or provide a reference for this please.

I'm thinking of the famous article by Eric Verlinde, "On the Origin of Gravity and the Laws of Newton".
 

FAQ: Entropy: Definition, Explanation & Meaning

What is entropy?

Entropy is a measure of the level of disorder or randomness in a system. It is a concept commonly used in thermodynamics and information theory.

How is entropy related to energy?

Entropy and energy are closely related. In thermodynamics, entropy is often described as a measure of unavailable energy in a system. As a system becomes more disordered, its entropy increases and its available energy decreases.

What is the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that such systems naturally tend towards a state of maximum disorder or randomness.

Can entropy be reversed?

In isolated systems, entropy can never decrease. However, in open systems (those that can exchange matter or energy with their surroundings), it is possible for local decreases in entropy to occur.

How is entropy related to information?

In information theory, entropy is a measure of the uncertainty or randomness in a system. It is often used to quantify the amount of information that can be transmitted through a communication channel.
