
#1
Mar8-12, 09:52 AM

P: 30

Why do we constantly hear about it? Can it be explained simply?





#4
Mar16-12, 02:12 PM

P: 833

Entropy. What is it?
Entropy is not a property of a system in a given quantum state, but rather a property of a system whose exact quantum state is unknown. It is a measure of how many quantum states look similar to the system at hand (e.g. how many quantum states are consistent with certain observational or theoretical constraints).

From a more experimental point of view: in the real world, you never know everything going on in an experiment; you can only measure some finite number of general properties. As time goes along, after you have stopped the measurement, things get mixed up within the system and with the environment, and you lose details about the experiment, so the entropy of the system goes up. As things mix up, the system approaches what is called thermal equilibrium, where the system is describable by a single temperature. Usually it is the heat energy that gets mixed up first (hot things cool off, cold things warm up), so the field of study is known as thermodynamics. But if you wait an infinite amount of time, eventually everything decays; all substances break down and re-form out of constituent parts, repeatedly, until every property of the system reaches thermal equilibrium (where "thermal" refers not only to particle motion but also to particle composition and other properties) and you get a frothy mix of "stuff" in constant remixing. This is the heat death of the universe.
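The "how many quantum states look similar" picture above is just Boltzmann's S = k_B ln Ω. A minimal Python sketch (the function name and the microstate counts are invented for illustration, not from the post):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k_B * ln(Omega): entropy of a macrostate compatible
    with `num_microstates` underlying quantum states."""
    return K_B * math.log(num_microstates)

# A sharply known system (few compatible states) has low entropy;
# a coarsely known one (many compatible states) has high entropy.
s_sharp = boltzmann_entropy(10)
s_coarse = boltzmann_entropy(10**20)
print(s_sharp, s_coarse)
```

The point of the sketch: entropy grows with the number of microstates you cannot distinguish, which is why losing experimental detail raises it.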



#5
Mar19-12, 08:02 PM

P: 65

If you had children, you would know...




#6
Mar20-12, 07:23 AM

P: 30

So ENTROPY means "things getting messy in every little conceivable detail"? Or "no matter where you hide them, your stuff will be found by entropy and will be messed with"?



#7
Mar21-12, 08:45 AM

PF Gold
P: 3,075

The concept behind entropy is simple: you lump many different states of a system into groups that you don't consider different, like a "messy room." There are many more states of a room that get called "messy" than get called "neat," so we say a messy room has "higher entropy." If the room state is more or less random, it is vastly more likely to be in a state we'd call "messy," and that's why we have the second law of thermodynamics; it's just probability.

What's interesting is that a messy room actually contains more information, because you need explicit instructions about where every single object is. But if you have an alphabetized file cabinet, you get the idea quickly, and can find each thing without needing specific information each time.
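The counting behind the "messy room" can be made concrete with a toy model (the numbers of objects and places are made up for the example): give each object several possible places, call the one arrangement with everything in its spot "neat", and everything else "messy".

```python
import math

n_objects, places_each = 10, 5
total_states = places_each ** n_objects   # every possible room configuration
neat_states = 1                           # the single all-in-place arrangement
messy_states = total_states - neat_states

# Entropy of a macrostate ~ log of how many microstates it lumps together
s_neat = math.log(neat_states)    # log(1) = 0
s_messy = math.log(messy_states)  # large

# A randomly shuffled room is almost certainly "messy":
p_messy = messy_states / total_states
print(f"S_neat={s_neat}, S_messy={s_messy:.1f}, P(messy)={p_messy:.7f}")
```

With just 10 objects and 5 places each, over 99.9999% of random configurations land in the "messy" group, which is the whole content of the second law in this picture.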



#8
Mar22-12, 10:15 PM

HW Helper
PF Gold
P: 1,848

Here is a basic concept of entropy explained:
http://hyperphysics.phy-astr.gsu.edu...rm/entrop.html which sort of goes along with this link: http://hyperphysics.phy-astr.gsu.edu...ntrop2.html#c1

That concept of entropy will work for most things in chemistry and thermodynamics: an ordered bunch of gas molecules all squashed up in the corner of a room will naturally expand to fill the room, thus increasing the overall entropy. There are more configurations involving the whole room than just a small part of it, so the spread-out version has more entropy.

But one thing that is not so intuitive is how gravity affects entropy. Gravity causes things (even gas) to naturally clump together. Does gravity cause entropy to naturally, spontaneously decrease? Does gravity violate the second law of thermodynamics? No, and no! Gravity works backwards in this respect of entropy vs. clumping. I've always found it awkward to explain why without pulling out equations and invoking negative energy. Until tonight, that is. I just stumbled upon this a few minutes ago, which inspired me to reply to this thread (the webpage has equations in it, but even if you ignore them the general concept is presented nicely): http://www.mathpages.com/home/kmath573/kmath573.htm

On a final note (after reading the above link), one might argue about dark matter. Dark matter particles pass right through each other and everything else, so one might conclude that entropy is not increased by dark matter collapse and galaxy cluster formation. The saving concept is that dark matter density fluctuations produce gravitational waves. Because of that, the dark matter density settles down into a relatively clumpy equilibrium (not as clumpy as normal matter, but more like halos) as energy is released in the form of gravitational waves. This process is called virialization, so entropy increases from that too. I love the quote at the beginning of the above link.
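The gas-in-a-corner example is the free expansion of an ideal gas, for which the textbook result is ΔS = N k_B ln(V_final / V_initial). A quick sketch (the one-mole amount and the factor-of-8 volume ratio are just assumed numbers for the example):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def free_expansion_entropy(n_particles: float, v_initial: float, v_final: float) -> float:
    """Entropy change for an ideal gas expanding freely
    from volume v_initial to volume v_final (same units)."""
    return n_particles * K_B * math.log(v_final / v_initial)

# One mole of gas, initially confined to 1/8 of the room, expands to fill it:
dS = free_expansion_entropy(N_A, 1.0, 8.0)
print(dS)  # positive: more configurations are available in the full room
```

The result is R ln 8 ≈ 17.3 J/K per mole; the sign is always positive for expansion, matching the "more configurations, more entropy" argument in the post.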




#9
Mar23-12, 12:40 AM

Sci Advisor
PF Gold
P: 9,185

All I know is if I don't clean my car out for a week, it takes a week to clean it back up, and then I have to start all over again.




#10
Dec9-12, 03:50 AM

Sci Advisor
PF Gold
P: 1,942

For example, consider English text. The entropy is a characteristic property of the written language, not of a single string. Therefore each string of N characters of an English text has [if this word can be used here at all meaningfully] the same entropy (namely that of a random string of N characters of an English text), no matter how ordered or unordered it is. In general, entropy is a characteristic property of a probability distribution, not of any of its realizations.

In information theory, entropy is a measure of lack of information. It is the expected number of decisions that would be needed to pin down a particular realization of something, when the probability distribution of all possible realizations is given. In quantum statistical mechanics (and hence in thermodynamics, which is derived from statistical mechanics), entropy is the expected number of decisions that would be needed to pin down a particular energy eigenstate, given the distribution of energy in the given mixed equilibrium state. This has nothing to do with whether the given state is or isn't known. See also http://www.physicsforums.com/showthread.php?t=465492

The entropy of the ensemble "casting a die N times" is N log 6, whereas orderliness would be assigned to some particular sequences (for example those sorted by number of points), not to the ensemble. To quantify orderliness, one needs a concept different from entropy: the Kolmogorov complexity, http://en.wikipedia.org/wiki/Chaitin_Complexity
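The point that entropy belongs to the probability distribution, not to any one string, can be illustrated with the standard Shannon formula (the sample sentence below is arbitrary, chosen just to give an empirical character distribution):

```python
import math
from collections import Counter

def shannon_entropy(probs):
    """H = -sum p*log2(p), in bits: the expected number of yes/no
    decisions needed to pin down one realization of the distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair die has entropy log2(6) bits per cast; N independent casts carry N*log2(6).
h_die = shannon_entropy([1 / 6] * 6)

# Empirical character distribution of a text sample: the entropy is a
# property of this distribution, the same for every string drawn from it.
text = "entropy is a property of a probability distribution"
counts = Counter(text)
probs = [c / len(text) for c in counts.values()]
h_text = shannon_entropy(probs)
print(h_die, h_text)
```

Note that `shannon_entropy` is fed probabilities, never a particular string; two strings with the same letter frequencies get the same number regardless of how "ordered" they look, which is exactly why orderliness needs Kolmogorov complexity instead.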



#11
Dec12-12, 02:15 PM

P: 6

The concept of entropy is just the opposite of gravity.




#13
Dec13-12, 01:49 AM

P: 123

According to them, apart from the usual thermal entropy as interpreted by Boltzmann, we also have the conditional entropy of the cosmological event horizon and the entropy associated with inhomogeneity of gravitational fields (the Weyl curvature hypothesis), bounded by the generalized statement that entropies in the universe cannot decrease.
http://www.mdpi.com/1099-4300/14/12/2456

Some say "entropy is an embedded characteristic of energy which creates a dimensional conservation domain for its energy type. Intrinsic motion: light creating space: motion creating time: intrinsic motion gravitation as the conversion force between the two primordial entropy drives and domains, creating spacetime." http://www.johnagowan.org/tetra7.html

...Or simply a breaking of symmetry: http://www.acadeuro.org/fileadmin/us...nt/Mainzer.pdf

In more technical terms, entropy is a specific value that measures how much energy is released in a system when it settles into its lowest potential energy. Entropy assesses the amount of disorder, understood as a change in energy, from an earlier point to a later point in time. This must happen in a "closed" system, where no energy leaks in or out; theoretically possible but hard to achieve. It seems that anything animated is, in general, bound by entropy.

Please do correct me if I got the wrong idea.



#14
Dec13-12, 06:32 AM

P: 6

That was my own opinion; since you asked for a link, I googled and found this:

"A recurring theme throughout the life of the universe is the continual struggle between the force of gravity and the tendency for physical systems to evolve toward more disorganized conditions. The amount of disorder in a physical system is measured by its entropy content. In the broadest sense, gravity tends to pull things together and thereby organizes physical structures. Entropy production works in the opposite direction and acts to make physical systems more disorganized and spread out. The interplay between these two competing tendencies provides much of the drama in astrophysics."

Life, gravity and the second law of thermodynamics



#15
Dec13-12, 03:17 PM

PF Gold
P: 11,057

I get what they are saying, I just don't feel it's very accurate. I think saying gravity is the opposite of entropy is comparing two completely unlike things: one is a force and the other is a measure of disorder, energy dispersal in a system, or whatever definition you are using for entropy. I may be splitting hairs here, but "entropy production" is the result of the interaction of light and matter through the fundamental forces, while entropy itself is just a measurement of that disorder, and while gravity may tend to organize things in the short term, it ends up causing entropy to increase in the long run. I think, at least. My knowledge of entropy isn't the best. I hope that makes sense; if my understanding of entropy is incorrect, someone let me know.




