
Introduced to the concept of entropy in school

  1. Feb 27, 2005 #1
    Last week we were introduced to the concept of entropy in school... I was quite suspicious of it already at first sight...

    It is something like: in a closed system, chaos always increases...
    meaning that the universe is heading towards chaos... (i.e. the universe is heading towards a state where matter will be distributed all over the universe, and not in bunches such as planets and stars)


    but what about the creation of our solar system...?
    don't they believe that it started as a cloud of gases and stuff, which then bunched together into many "bodies of matter" that collided with each other and ended up as 9 big "bodies" rotating around the big Sun...?

    that goes against the laws of entropy, doesn't it?
     
  3. Feb 27, 2005 #2

    Andrew Mason

    Science Advisor
    Homework Helper

    It is not really correct or helpful to think of increasing entropy as increasing chaos or disorder. Entropy is a thermodynamic concept that started out as a simple observation: "heat cannot flow from a cold reservoir to a hot reservoir without the addition of work" or "heat from a reservoir cannot be completely converted to work - some of the heat must always flow to a reservoir at a colder temperature". The change in entropy is a measure of the amount of work required to make heat flow from a hot to a cold reservoir, or a measure of the amount of work that can be extracted from heat flowing from a hot reservoir to a reservoir at lower temperature.
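
    In symbols, the Clausius picture can be sketched roughly like this (Q, T_h and T_c are just my labels for the heat transferred and the hot and cold reservoir temperatures):

    [tex] \Delta S = \int \frac{\delta Q_{rev}}{T} [/tex]

    so when an amount of heat Q flows from the hot reservoir to the cold one,

    [tex] \Delta S = -\frac{Q}{T_h} + \frac{Q}{T_c} > 0 \qquad (T_h > T_c) [/tex]

    and making the same heat flow the other way requires the addition of work.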

    Attempts were made in the 19th century to explain entropy in terms of statistics, but they have created much confusion and difficulty. As a result, entropy is a difficult and frequently misunderstood concept.

    AM
     
  4. Feb 27, 2005 #3
    really?

    my teacher said that entropy could be described as "disorder or chaos", and the 2nd law of entropy is 'the change in entropy is greater than zero'.

    Meaning that entropy (chaos) always increases...
    Straight from my notes:
    "Locally entropy can decrease, as long as entropy increases at least as much somewhere else"

    the heat stuff you mentioned is one thing where entropy can be applied, my teacher mentioned that too... but still, that is also just an example where "chaos" increases...

    to rephrase my questions:

    Do the laws of entropy say that chaos always increases?
    If yes, how do you explain my example of the solar system...?
     
  5. Feb 27, 2005 #4

    Andrew Mason

    Science Advisor
    Homework Helper

    Really.
    No. The law of increasing disorder does not apply to gravitational collapse, for example.
    You cannot use the law of entropy to explain examples of highly ordered and disordered systems. You have to know the history of the 'system', i.e. how the system originated.

    AM
     
  6. Feb 27, 2005 #5

    dextercioby

    Science Advisor
    Homework Helper

    Entropy was introduced in equilibrium thermodynamics, true. However, with the works of Boltzmann & Gibbs, it became clear that the nature of entropy is not thermodynamical at all, but statistical. Actually, the laws of empirical thermodynamics (CTPCN formulation) are recovered elegantly by means of statistical physics.

    There are 2 definitions of entropy:
    [tex] S_{class}=:-k \langle \ln \rho \rangle [/tex]

    [tex] S_{quant}=:-k \langle \ln \hat{\rho} \rangle [/tex]

    Anything else is just a bunch of logical consequences.
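
    For instance (just as an illustration of the first definition, nothing deeper): if [tex]\rho[/tex] is uniform over W equally probable configurations, i.e. [tex]\rho_i = 1/W[/tex], then

    [tex] S_{class} = -k \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = k \ln W [/tex]

    which is the familiar Boltzmann form.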

    Daniel.
     
    Last edited: Feb 27, 2005
  7. Feb 27, 2005 #6

    What is [tex] \rho [/tex] in this case? I am aware of the [tex]S = -k_B \ln T[/tex] relationship from thermodynamics, but have not seen this.

    Thanks.
     
  8. Feb 27, 2005 #7

    dextercioby

    Science Advisor
    Homework Helper

    Nope. It can't be the formula you pictured there. In what case would it be valid (assuming, absurdly, that it were correct)?

    [tex]\rho[/tex] is the probability density on the statistical ensemble. And [tex]\hat{\rho}[/tex] is the density operator.

    Daniel.
     
  9. Feb 27, 2005 #8
    Oh yeah, I had a sudden memory lapse, I meant to write:

    [tex] S = k_B \ln W [/tex]

    where W is the number of states.
    So that makes more sense now.

    I must have been semi-remembering a Free Energy formula or something.
    Cheers for that.
     
  10. Feb 27, 2005 #9

    dextercioby

    Science Advisor
    Homework Helper

    Nope, that W is not the # of states. You may check a book on SM for its name and its significance. Anyway, you have tried to depict Boltzmann's formula. In the axiomatic formulation of equilibrium SM, it's nothing but a consequence.

    Daniel.
     
  11. Feb 28, 2005 #10
    I've never been comfortable with the concept of "entropy". It sounds too much like the energy is going somewhere, but nobody knows where.
     
  12. Feb 28, 2005 #11

    Andrew Mason

    Science Advisor
    Homework Helper

    It may be because you were taught the Boltzmann concept of entropy rather than the simpler Clausius concept first. Entropy is not about disorder. It is about the reversibility of thermodynamic processes. There is no principle that disorder always increases. Analogies to Humpty Dumpty are quite misleading.
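
    If it helps to see the "reversibility" point in symbols, the standard Clausius inequality for a cycle is

    [tex] \oint \frac{\delta Q}{T} \leq 0 [/tex]

    with equality holding only for a reversible cycle; the entropy change between two states is then defined along a reversible path connecting them, which is why entropy is about reversibility rather than 'disorder'.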

    AM
     
  13. Feb 28, 2005 #12

    dextercioby

    Science Advisor
    Homework Helper

    Does that mean that the work done by Gibbs, von Neumann and C. Shannon is wrong...?

    :bugeye:

    Daniel.
     
  14. Feb 28, 2005 #13

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    You have received a bunch of very good answers here, which really tried to convey the fact that the term "entropy" has a deeper meaning than just "chaos" or "disorder". BTW, be VERY careful in using the word "chaos", because in physics and mathematics, chaos is NOT equal to disorder. There's a definite meaning to the term chaos that should not be confused with the pedestrian usage of that word.

    Now, having said that, I will attempt to answer your question using the level of understanding that you have been given. In other words, I will try to show you why the formation of the solar system, etc., does not violate any laws of thermodynamics, even under the assumption that "entropy" is disorder, which is the way you understand it.

    Let's assume that your idea is correct, that the formation of the planets and the solar system is a reduction in entropy of the planets+sun system (pay careful attention to what the whole "system" in question is here). Now, the 2nd Law of Thermodynamics clearly indicates that the net entropy can only either increase or remain constant. However, this only applies (at least in the conventional formulation) to a closed system with no interaction from outside the system. Our solar system has two different "external sources" of energy:

    1. The energy and fields from other nearby systems
    2. The gravitational potential energy that is not included in the statistics of randomly moving particles [i.e. the standard thermodynamics principle ignores particle-particle interactions other than elastic collisions between them]

    These two factors cause the planets+sun system to be NOT an isolated, adiabatic, closed system, but rather an open one. There is no reason to expect that the entropy of that system cannot decrease, in the very same way that the entropy of one part of a Carnot cycle actually decreases.
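
    To put that in rough symbols (a sketch only; Q here is the heat the collapsing material dumps into its surroundings, which I am crudely treating as a reservoir at some temperature T_surr):

    [tex] \Delta S_{total} = \Delta S_{cloud} + \frac{Q}{T_{surr}} \geq 0 [/tex]

    so [tex]\Delta S_{cloud}[/tex] can be negative as long as the heat carried away raises the entropy of the surroundings by at least as much - the same bookkeeping as for the cold leg of a Carnot cycle.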

    So, even if we apply what you understood as "entropy", and use your assumption that the formation of the planets and the sun is a reduction in disorder, there is still nothing here that violates any physics principles.

    Zz.
     
  15. Feb 28, 2005 #14

    Andrew Mason

    Science Advisor
    Homework Helper

    No. Information theory and Statistical theory have distinct concepts of entropy. They have really very little usefulness in thermodynamics. They do nothing to illustrate the concept of thermodynamic entropy and do everything to confuse it.

    The problem is that students struggle to acquire the thermodynamic concept but instead are thrown statistical concepts that even the teachers don't fully understand. Gibbs' statistical explanation is incomprehensible and does nothing to deepen one's understanding. It is far better to keep information theory and thermodynamics separate, for purely pedagogical reasons if not for scientific reasons.

    As far as disorder is concerned, it depends on how one defines order. Humpty Dumpty is not more ordered than a particular arrangement of smashed HD pieces. It is just that there are many more ways to smash an egg and far fewer ways to make the pieces resemble an egg. But inside the chicken, Humpties are being made all the time, and it has nothing really to do with entropy.

    AM
     
  16. Feb 28, 2005 #15

    Janitor

    Science Advisor

    Just to add a bit to what ZapperZ has already said, remember that as particles fall down a gravitational well to smash into and stick onto the surface of a growing planet or moon, there will be heat generated in the collision. This heat will be radiated away as electromagnetic energy (much of it in the infrared, I reckon), which for the most part escapes from our solar system. So to do your energy and entropy accounting with a closed system, you would have to have your system expanding outward at the speed of light, starting four billion or so years ago when the solar disk first appeared on the scene. By today, that has become a mighty large system! In fact, it covers a substantial fraction of the visible universe.
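
    To make that accounting a little more concrete (a rough sketch; T_emit is just my label for the temperature of whatever surface does the radiating): blackbody radiation that carries energy Q away from a surface at temperature T_emit carries entropy of about

    [tex] S_{rad} \approx \frac{4}{3} \frac{Q}{T_{emit}} [/tex]

    so once that escaping radiation is counted as part of the (ever-expanding) system, the entropy it exports covers any decrease due to the matter clumping together.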
     
  17. Mar 1, 2005 #16

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    I'd like to point out that a similar argument has been put forth by creationists in their attempt to discredit evolution. They argued that since "life" is essentially order out of disorder in the evolutionary picture, then the evolution of life on earth violates the 2nd Law of Thermodynamics. If you don't believe me, do a google on "evolution thermodynamics".

    I have explained in one of my journal entries why this is nothing more than ignorance of basic physics. The explanation is similar to the one that I've given here in this thread. Such foolishness is simply ONE of the many examples where people learn only bits and pieces of something but nevertheless assume that they have understood all there is to know. As I've said many times, imagination without knowledge is ignorance waiting to happen.

    Zz.
     
  18. Mar 1, 2005 #17
    As my chemistry teacher once put it:
    Entropy is disorder.
    High entropy means a high probability of things happening :approve:

    Like, for instance, yours truly tripping over something in his incredibly messy study and ending up in the hospital with a broken ankle... :rolleyes:
     
  19. Mar 1, 2005 #18

    Andrew Mason

    Science Advisor
    Homework Helper

    This would require defining 'disorder' in a statistical sense - a measure related to the variation (e.g. the standard deviation) of the speeds or energies of the molecules in the warm reservoirs compared to that of the hot and cold reservoirs. That is a very specific concept, and it is related to thermodynamic entropy. But you cannot then replace that very specific statistical concept of 'disorder' (i.e. populations of molecules with greater variation in speeds having more disorder) with a general non-statistical concept of 'disorder' and say that the general concept is also related to thermodynamic entropy. It isn't. Thermodynamic entropy is related to disorder only if you define disorder in the specific statistical sense.
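
    Just to pin down the statistical sense being referred to, here is a toy illustration (my own example, with p_i the probability of microstate i):

    [tex] S = -k \sum_i p_i \ln p_i [/tex]

    For two microstates with probabilities (1, 0) this gives S = 0, while for (1/2, 1/2) it gives S = k ln 2: the more spread-out distribution has the larger entropy, and that is the specific kind of 'disorder' the thermodynamic quantity tracks.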

    The principles behind information theory (also statistically based), which deal with information loss, are again different. Similar statistical analyses are used, but to suggest that 'bandwidth' in the information sense explains thermodynamic entropy is a big stretch and, in any event, is not very instructive or helpful for explaining the thermodynamic second 'law'. The information theory equivalent of the 'second law' is not really a law at all. It can be, and often is, broken. If it weren't, the internet would not function.

    So entropy is not related to disorder in the general sense such as your broken ankle (which sounds like a true story, so you have my sympathies if it is). To start a discussion of entropy and the second law of thermodynamics with a general discussion of order and disorder simply confuses the student and is most often quite incorrect.

    AM
     
  20. Mar 3, 2005 #19

    Janitor

    Science Advisor

  21. Mar 3, 2005 #20

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    I think that this has always been a problem faced by educators, especially at the elementary level - when teaching something new, what exactly do you tell the students it is, so that they have some idea? Most educators, understandably, would simply describe entropy as disorder. Those of us who are lucky enough to further our knowledge in this field of study later on realize that such a definition is, at best, inadequate and incomplete. However, there are many people walking around who did not have the chance to study this further and are therefore stuck with the understanding that "entropy=disorder".

    For people who do fall in that category, if there's anything you can get out of this thread, it is that you need to at least be aware that in physics, the definition "entropy=disorder" should not be taken as a clear statement of what entropy is. Several links have been given above on why this isn't true. I will give one more, which I strongly recommend people read (it has material from the elementary to the advanced level):

    http://www.entropysite.com/

    Hopefully, this will start to eradicate a little bit of such misunderstanding.

    Zz.
     