
What is entropy?

  1. Oct 1, 2007 #1
    I tried looking in Wikipedia, but it didn't help. My main question actually is: what keeps the universe dynamic? I mean, why can't it be in a fixed state? And if it ever were, what form would it take? Would it be an infinitely small point?

    Thx.
     
  3. Oct 1, 2007 #2

    malawi_glenn

    Science Advisor
    Homework Helper

    Entropy is a measure of disorder. You can find the definitions and expressions if you Google a bit; it is fundamental classical statistical mechanics.

    We don't yet know what the universe will look like in, say, 10^40 years.
     
  4. Oct 2, 2007 #3
    OK... then let's go back in time. Why is the entropy increasing? Shouldn't the entropy be 0? I mean, since all there is is energy, what keeps it moving?
     
  5. Oct 2, 2007 #4

    malawi_glenn

    Science Advisor
    Homework Helper


    The initial conditions of the Big Bang, perhaps. Have you studied cosmology? I can recommend a good book that is used at universities.
     
  6. Oct 2, 2007 #5

    JesseM

    Science Advisor

    Essentially because the universe seems to have started in a very ordered state and dynamical changes are likely to increase disorder, in the same way that randomly shuffling a deck of cards will tend to increase the cards' disorder if they started in an orderly arrangement. If you want to get a better understanding of entropy and the 2nd law of thermodynamics, I suggest reading these sites (both by the same author):

    http://www.secondlaw.com/
    http://www.2ndlaw.com/
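
    The card-shuffling analogy above can be made concrete with a small sketch (not from the thread; the `disorder` count used here is just one illustrative choice of disorder measure). Random transpositions applied to an ordered deck drive it toward a disordered state, and essentially never back:

```python
import random

def disorder(deck):
    # Crude disorder measure: count adjacent pairs that are out of order.
    return sum(1 for a, b in zip(deck, deck[1:]) if a > b)

random.seed(0)                  # make the run reproducible
deck = list(range(52))          # perfectly ordered deck: disorder == 0

for _ in range(1000):           # "shuffle" via random transpositions
    i, j = random.randrange(52), random.randrange(52)
    deck[i], deck[j] = deck[j], deck[i]

# A well-shuffled deck ends up with roughly half of its 51 adjacent
# pairs out of order -- far from the ordered starting point.
print(disorder(deck))
```

    Ordered arrangements are vastly outnumbered by disordered ones, so random dynamics almost always move toward disorder.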
     
  7. Oct 15, 2007 #6
    Yes please...
     
  8. Oct 15, 2007 #7

    malawi_glenn

    Science Advisor
    Homework Helper

    Introduction to Cosmology by Barbara Ryden
     
  9. Oct 15, 2007 #8
    Yes, Jesse is correct. The entropy of the universe is increasing because the Big Bang was a very ordered state of the universe. You have to remember that the increase in entropy is a fundamental property of the universe, just as the electron being negatively charged is. You can't say why it is so, because if it were not so, then we wouldn't be here to ask these questions, as explained by Sir Martin Rees. We have to ask the creator why the properties of matter are what they are.
     
  10. Oct 16, 2007 #9
    Entropy is the measure of disorder, but what's really happening is that energy is ordering itself. It can be released as heat from our bodies, or every time you fix or rearrange things. We were talking about this in Bio, and one of the kids tried to use it as an excuse not to clean their room. It worked the first time (>_<"), but after that the parents just didn't care.
     
  11. Oct 16, 2007 #10
    Hmm, sounds interesting: energy ordering itself. But can you explain why, or point me to a site where I can read about it? By the way, does biology also include the concept of entropy? I haven't studied bio except for one year at my school.
     
  12. Oct 16, 2007 #11

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    There is a clear movement now not to equate entropy with "disorder". While disorder CAN be the result of entropy, such a one-to-one connection can lead to some severe misunderstandings of what entropy really is.

    One may want to review the materials available on Frank Lambert's website; he has, incidentally, taken on the task of trying to remove the automatic connection between entropy and disorder. The site is also useful for students learning thermodynamics, because various confusing concepts (all those "deltas" he mentions are a prime example) are given a more physical and conceptually easier picture.

    Zz.
     
  13. Oct 16, 2007 #12
    Jesse, wouldn't it be possible that the universe began in a disordered state and will end in a highly ordered state? I got this idea today after reading "A Brief History of Time".
     
  14. Oct 16, 2007 #13
    Consider what would have to occur for this to be the case. You're looking at a chain of statistically unlikely events all happening at once, which magnifies the 'improbability' vastly, to the point that it's all but impossible.

    Looking at it another way, consider the common-language truism that for one entity of matter (particle, molecule, tennis ball, whatever) to change its state it must generally be affected in some way by another entity. Such a change generally requires the transfer of energy from one to another. Suppose you then wanted to reverse the change. At the very least, you'd have to take the energy out of the final state and put it back in the initial objects in the same form that it started in - but this itself is an act much the same as the one we were talking about in the first place, and you'd need to transfer some energy from a third entity in order to restore the first two. In essence, you can't put energy back in its box, and that's why entropy increases and doesn't, overall, decrease. Note that there's nothing to say that locally entropy can't decrease, provided it's balanced with an increase somewhere else so that the net change summed over all space is equal to or greater than zero.
     
  15. Oct 16, 2007 #14
    The fine-grained entropy is indeed equal to zero and will always remain zero. In my opinion, when we consider the universe as a whole, it doesn't make much sense to talk about the usual definition of entropy, which is a coarse-grained entropy. The fact that this entropy does not decrease is basically a triviality.

    The laws of physics are (as far as we know) such that information is exactly conserved as a function of time. Now, in practice, when we want to describe a system containing a huge number of particles, we are only interested in a few variables, like the pressure, the energy content, etc. These few variables cannot, of course, uniquely define the exact physical state of the system. So, given these variables, there will be a huge number of states the system can be in. The entropy is the logarithm of this number.

    Now, where does this "fine-grained" thing come in? A system in a finite volume can only be in certain energy levels. If you could specify the energy of a system to sufficient accuracy, then you could actually define the state of the system exactly. Then, given the specification of the energy, there would be only one state the system could be in (assuming the energy levels are not degenerate), and the entropy would thus be zero.

    The coarse grained definition of the entropy is as follows. You simply fix some small energy range [tex]\delta E[/tex] as your energy uncertainty. This energy range is supposed to be small on a macroscopic scale. You then count how many microstates have an energy in that range. The entropy is the logarithm of that number. For a system containing many particles, the dependence on [tex]\delta E[/tex] can be ignored in a relative sense.

    You can also look at this from an information-theoretic perspective. You can define the entropy of a system as the number of bytes you would need to fully specify the system, given the information you already have about it. The distinction between fine-grained and coarse-grained entropy is then easy to understand. If you were to specify the energy of a system so precisely that it fixes the exact state of the system, then all of the bytes needed to specify the system are contained in the energy specification, and no extra bytes are needed.

    Suppose, on the other hand, that the energy specification is of finite accuracy and contains only a few bytes of information. Since the number of bytes needed to specify the system is huge, we can ignore the few bytes already contained in the energy specification.
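
    This counting picture can be illustrated with a toy model (a minimal sketch, not from the thread; the system of N two-level spins is a hypothetical example, and the counts here are in bits rather than bytes). The macrostate is the number of "up" spins; almost all the information needed to pin down the microstate remains unspecified by it:

```python
import math

N = 100                        # spins, each either up or down
total_bits = N                 # log2(2**N): bits to specify a microstate outright

m = 50                         # macrostate: number of "up" spins
compatible = math.comb(N, m)   # microstates consistent with that macrostate
entropy_bits = math.log2(compatible)  # coarse-grained entropy, in bits
macro_spec_bits = math.log2(N + 1)    # bits spent stating m itself (m in 0..N)

print(round(entropy_bits, 1))     # about 96.3 of the 100 bits remain unspecified
print(round(macro_spec_bits, 1))  # about 6.7 bits: negligible in comparison
```

    The few bits in the macrostate specification are indeed negligible next to the entropy itself, as the post argues.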

    Entropy increases because all the states of a physical system are intrinsically equally likely, just as all the outcomes from 1 to 6 are equally likely when throwing a die. Suppose you have N dice and you measure their macroscopic state by adding up all the numbers the dice are showing. If the initial state is N, that means all the dice are showing a 1. The macrostate defines a single microstate, so the entropy is zero.

    But if we now throw all the dice, the macrostate will change: it will become about 3.5 N, because that macrostate is compatible with the largest number of microstates, all of which are equally likely. The entropy has thus increased; it is now the logarithm of the number of microstates compatible with a value of 3.5 N for the macrostate. There is nothing mysterious about why this increase happened, other than why the initial state had such a low entropy.
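
    The dice argument can be checked directly by brute-force counting (a small sketch, not from the thread; N = 6 keeps the enumeration of all 6^N microstates fast):

```python
from itertools import product
from math import log

N = 6                                   # six dice; sums run from 6 to 36
counts = {}
for roll in product(range(1, 7), repeat=N):   # enumerate all 6**N microstates
    s = sum(roll)
    counts[s] = counts.get(s, 0) + 1

peak = max(counts, key=counts.get)      # most likely macrostate
print(peak)                             # 21, i.e. 3.5 * N
print(counts[N])                        # 1 microstate for "all ones": entropy log(1) = 0
print(round(log(counts[peak]), 2))      # log(4332): the entropy of the peak macrostate
```

    The all-ones macrostate has exactly one microstate (zero entropy), while the sum 3.5 N is realized by thousands, so a random throw overwhelmingly lands near 3.5 N.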

    If the entropy of the universe were maximal, then there could be no life in the universe, so given that we exist, the entropy cannot be maximal. So I don't think that the low initial entropy of the universe is such a strange fact. There are perhaps other issues, like why we experience an arrow of time that points in the direction in which entropy increases. This has to do with the fact that a computer in a universe can only be run in the direction of increasing entropy.
     
  16. Oct 16, 2007 #15
    I'm taking AP Bio right now, and since it's the beginning of the year my understanding is limited.

    Biologically, it is said that if the population continues to increase, entropy increases, because our cells are organized, and maintaining that organization contributes to entropy. I can't find this anywhere on the internet, but it's in my textbook. Biologically, the organization is happening locally. According to my textbook, "Every energy transfer or transformation increases the entropy of the universe." It also explains that entropy is the physical disintegration of structures. When we eat, we break down complex, organized matter: the chemical energy from food is converted to kinetic energy, which our bodies use in the form of heat. Because our bodies, like any other piece of machinery, are not perfect engines, they cannot use heat with complete efficiency, so heat is released, which increases the entropy of the universe. When we fill up our cars with gas, only $10 or $15 of the $50 used to fill the tank actually runs the car; the other $35 or so just keeps the engine warm and is released as heat.

    Carnot came up with the idea of the Carnot engine, which is theoretically the perfect engine: it achieves the maximum possible efficiency for a heat engine because it produces no net entropy. But it's almost impossible to achieve this kind of result in practice.
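
    For reference, the Carnot bound is just eta = 1 - T_cold / T_hot, with temperatures in kelvin. A minimal sketch (the temperatures below are illustrative round numbers, not measured engine values):

```python
def carnot_efficiency(t_hot, t_cold):
    # Maximum fraction of heat convertible to work between two reservoirs,
    # with temperatures in kelvin: eta = 1 - T_cold / T_hot.
    return 1.0 - t_cold / t_hot

# Illustrative numbers: combustion at ~600 K, exhausting to ~300 K air.
print(carnot_efficiency(600.0, 300.0))  # 0.5 -- even an ideal engine wastes half the heat
```

    Real engines fall well below this bound, which is why so much of the fuel's energy leaves as waste heat.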
     
  17. Oct 16, 2007 #16
    Yes, man, I know that perfectly well. But the entropy of gas and radiation is always higher, isn't it? And that is how we believe the universe began, according to the Big Bang model. So wouldn't the entropy of the early universe have been higher than it is today? By the way, can you explain why the thermodynamic arrow and the cosmological arrow of time point in the same direction?
     
  18. Oct 16, 2007 #17
    According to the Big Bang theory, everything was released at once in a huge explosion, and energy was spread out randomly. But as time progressed, matter took on more complex structures and became organized, which increased entropy. Entropy is a measure of disorder, not disorder itself, so you can't say there was more entropy in the beginning just because things were more disorganized.
    I'm not too sure what you mean by thermodynamic and cosmological arrows. What are those?
     
  19. Oct 17, 2007 #18
    I mean the arrow of time. As we all know, the thermodynamic arrow of time always points in the same direction as the cosmological arrow of time. I am asking why this is so. Is it necessary for it to be like this, or is another arrangement possible?
     
  20. Oct 19, 2007 #19
    I'm not sure; I haven't gotten that far. Wait for me for like 7 years, until I finish college, and then I'll answer your question (>_<")
     
  21. Oct 19, 2007 #20
    By the way, at which level are you studying, and what is your major subject? Are you from the USA? I will answer this question later, on the basis of my knowledge, as I am in a hurry today.
     