Isn't Entropy Relative?

  1. Jun 12, 2003 #1
    Hello everyone. This is my first post here.

    I have a question, after reading some of the posts here.

    Isn't entropy relative? What enables us, as observers, to state with certainty that something is tending toward "order" or "chaos"? What allows us to make that distinction?

    Couldn't, to another observer, one physical act seem to tend toward order, while it, to us, appears to tend toward chaos?

    If so, then what are the ramifications? :)
     
  3. Jun 12, 2003 #2

    russ_watters

    Staff: Mentor

    You mean eye of the beholder? That looks ordered to me but chaotic to you? Entropy doesn't work that way. Entropy is a QUANTITATIVE measure of the level of disorder in a system. No room for opinion.

    We know because we measure it.
     
  4. Jun 12, 2003 #3
    So what makes our definitions of "order" and "disorder" objective? What enables us to say that "disorder" is something objectively measurable?
     
  5. Jun 12, 2003 #4
    Cook an egg and have your friend try to uncook it. No matter how long he observes with 'order' on his mind, the probability that the egg will somehow uncook itself is ridiculously low.
     
  6. Jun 12, 2003 #5
    Interesting concept and worthy of investigation, JSK. However, entropy doesn't deal with 'one physical act' but is a statistical measure.

    ORIGINALLY POSTED BY JSK333:
    "So, what makes our definition of what "order" and "disorder" are objective? What enables us to say that "disorder" is something that is objetively measureable?"

    By definition, for a system of particles (at equilibrium!), dS/dU = 1/T; so I suppose IF you could find a system where this general relation doesn't hold in different frames of reference, you could say entropy is subjective.
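
    (A quick numerical sketch of what that relation says, with made-up numbers: take a two-level system, get U and S from the Boltzmann distribution, and check that the slope dS/dU comes out as 1/T. Python, with k set to 1 so temperature is in energy units.)

        import numpy as np

        # Two-level system with energy gap eps; k_B = 1. Illustrative numbers only.
        N, eps = 1.0, 1.0

        def U(T):
            # mean energy from the canonical (Boltzmann) distribution
            return N * eps / (np.exp(eps / T) + 1.0)

        def S(T):
            # S = (U - F)/T, with free energy F = -N*T*ln(1 + exp(-eps/T))
            F = -N * T * np.log(1.0 + np.exp(-eps / T))
            return (U(T) - F) / T

        T, dT = 0.7, 1e-6
        dS_dU = (S(T + dT) - S(T - dT)) / (U(T + dT) - U(T - dT))
        print(dS_dU, 1.0 / T)   # the two numbers agree to several digits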

    Creator
     
  7. Jun 13, 2003 #6

    russ_watters

    Staff: Mentor

    Entropy is a measurable quantity. It is NOT an arbitrary definition.

    Think of it this way: the light we get from the sun is a color we call "yellow". That's an arbitrary name for a specific physical thing corresponding to the wavelength of the light. If you prefer, you can call it "canary", but what you can't say about it is that it is at a shorter wavelength than blue.

    Similarly, you can call entropy "Bob" if you want to, but what you can't say about "Bob" is that it decreases with time. The value of "Bob" in a closed system increases with time. Period. It's like describing the wavelength, not the popular name for it.

    Now, there is general agreement that a shattered glass is in a disordered state when compared to the glass before it was shattered. If you subtract the value of "Bob" for the shattered glass from the value of "Bob" for the whole glass (or any other similar scenario), you find that the answer is ALWAYS a positive number. So it makes sense that instead of calling this number "Bob" we call it something that describes what we see occurring: disorder. We could just as easily have defined disorder to be a negative quantity, but the end result is the same - the number itself increases with time. So it's just more convenient to use a positive number.

    In this context, all entropy really means is that it will take more energy to turn that broken glass back into a whole glass than it took to break it in the first place.

    Maybe you are just wondering if scientists observe something, arbitrarily decide it looks like disorder increasing, then assign it an entropy? No, they MEASURE the entropy, find it to be increasing, and then describe it as such.

    If what you really are asking is "How is entropy measured?", I'll explain. Take a chemical reaction - about the simplest one is splitting water into hydrogen and oxygen. It takes a specific quantity of energy (electricity, usually) to do this. Now let's assume conditions are perfect - no loss in the electrical wiring or in the power source, no heat lost from the container, etc. So you know EXACTLY how much energy it really took to split that water into hydrogen and oxygen.

    Now mix the two together and ignite. Again, assume perfect conditions - the container is perfectly insulated to capture all of the heat generated, the ignition source's energy is subtracted out, etc. Using a thermometer, you measure the temperature of the water in your container and calculate the energy you got back by re-combining the hydrogen and oxygen. What you find is that it took MORE energy to split the water into hydrogen and oxygen than you get back by burning it. This difference reflects the entropy of the reaction.

    HERE is a site with a problem like I describe. Actually, there is an even simpler one - simply boiling water.
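
    (The boiling-water case really is a one-liner. A rough sketch, assuming 1 kg of water and textbook values: at constant temperature, the entropy change is just the heat added divided by that temperature.)

        # Entropy change for boiling water at constant temperature: dS = Q/T.
        m = 1.0          # kg of water (assumed)
        L = 2.26e6       # latent heat of vaporization, J/kg (approximate)
        T_boil = 373.15  # K
        print(m * L / T_boil)   # ~6060 J/K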
     
    Last edited: Jun 13, 2003
  8. Jun 13, 2003 #7

    marcus

    Science Advisor

    can't get out of my head the idea that the TEMP at which the energy is delivered or extracted matters.

    Like maybe you get the SAME amount back from burning the H2 and O2 as you had to invest in splitting the water

    but the energy used to split was somehow more high-grade, because at some point it was delivered at a higher temp

    maybe more definition is needed
     
  9. Jun 13, 2003 #8

    marcus

    Science Advisor

    lowness of entropy is good so 1/T measures quality

    Russ, here is how I would modify your description:

    lowness of entropy is good so the index of quality really should be 1/T

    the reciprocal of temp
    so very highgrade energy coming in at a very high temp gets
    tagged with a low 1/T number (which means it is good)

    and lousy waste heat that no one wants and you just have to get rid of is lowgrade energy that gets tagged with a high 1/T number which means bad.

    Then you go around and multiply each batch of energy E by its 1/T quality label----and get E/T

    And you add those up (those E/T quantities are entropies, or anyway contribute to the total entropy)

    You take the total E/T before the experiment and the total after.
    (The whole thing is in some kind of box.)

    And the total energy might be the same after as before! But its quality is poorer.

    So more of the batches of energy have a high (bad) quality index.
    More of the E in the system (after) has a high 1/T index.

    So ∑ E/T has increased.

    well this is very casual and so on but maybe in the right direction
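
    (To make that concrete with some assumed numbers: move heat from a hot body to a cold one, tag each batch with its 1/T label, and the E/T total goes up even though the energy total doesn't.)

        # Move Q joules of heat from a hot body to a cold one (numbers assumed).
        Q = 1000.0                     # J
        T_hot, T_cold = 500.0, 300.0   # K
        dS_total = Q / T_cold - Q / T_hot
        print(dS_total)   # +1.33 J/K: same total energy, but the E/T total rose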
     
  10. Jun 13, 2003 #9

    marcus

    Science Advisor

    Ludwig Boltzmann's k

    Ludwig Boltzmann was a flamboyant Viennese, very different in character from the straitlaced Prussian Planck, who was a bit Victorian in his style. Whereas Planck gave us hbar, Boltzmann gave us k.

    And k is a universal fundamental constant which is a ratio of energy to temperature

    k is IMHO profoundly basic----in almost every formula in physics where the temperature T appears it appears in the form of
    an energy quantity kT

    k is an E/T type quantity. I am not saying it is an entropy, or that it is a heat capacity (heat capacity is also an E/T type quantity).
    I am leaving it completely open how to think of k.

    But any formula for the speed of molecules at some temp, or the brightness of things glowing at some temp, or the you-name-it conductivity of a doped silicon solid state thing at some temp,
    or the convection currents, or the speed of sound in air at some temp-----every formula has a k in it.

    Boltzmann k has got to be very basic

    And it is joules per kelvin

    And likewise so is the E/T you use when you do the integral that defines entropy.

    I think if you do the accounting right, and mention the waste heat that appears when you generate the electricity used to electrolyze the water, then everything you said is RIGHT!
    It may depend on the accounting and details.

    But if you simply compare all the energy before and after I think it is the same, whereas the entropy increases.
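
    (Two quick illustrations of k doing its job, with textbook numbers: the thermal energy scale kT at room temperature, and the rms speed of a nitrogen molecule from (1/2)mv^2 = (3/2)kT.)

        k = 1.380649e-23   # Boltzmann's constant, J/K
        T = 300.0          # room temperature, K

        print(k * T / 1.602e-19)          # kT ~ 0.026 eV, the thermal energy scale
        m_N2 = 4.65e-26                   # mass of one N2 molecule, kg (approx)
        print((3 * k * T / m_N2) ** 0.5)  # rms speed, ~520 m/s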
     
  11. Jun 13, 2003 #10

    drag

    Science Advisor

    Re: Ludwig Boltzmann's k

    Greetings JSK333 !

    If you treat entropy as the increase in energy density in space-time, then the reference frame just doesn't matter. Even if it may visually seem at some point (when dealing with relativistic velocities, for example) that entropy is actually reversed, it's just an illusion, and the actual mathematical description will clearly show that entropy keeps increasing as usual in space-time.

    Live long and prosper.
     
  12. Jun 13, 2003 #11
    Here is one way to look at Entropy

    You can't extract energy from a system when everything is in equilibrium. Say you had some blackbody radiation in a cavity: no device, not even solar cells or other exotic methods, would allow you to extract energy from the box, because the cells would emit as much as they absorbed. Now imagine that you removed all the radiation below a certain photon energy, for example one electron volt. You have removed the system from equilibrium; now you can degrade the higher-energy photons to lower frequency, returning the system to equilibrium and extracting energy in the process.
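
    (A sketch of how much energy sits above such a cutoff, assuming sunlight-temperature radiation and writing the Planck spectrum in the dimensionless variable x = (photon energy)/kT:)

        import numpy as np
        from scipy.integrate import quad

        # Fraction of blackbody ENERGY carried by photons above a 1 eV cutoff.
        def planck(x):
            # Planck spectral shape in x = (photon energy)/kT
            return x**3 / np.expm1(x)

        T = 5800.0                 # K, roughly the Sun's surface (assumed)
        x0 = 1.0 / (8.617e-5 * T)  # cutoff x for a 1 eV photon (k in eV/K)

        total, _ = quad(planck, 0, np.inf)
        above, _ = quad(planck, x0, np.inf)
        print(above / total)       # ~0.8: most of the energy is above 1 eV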
     
  13. Jun 13, 2003 #12

    marcus

    Science Advisor

    Re: Here is one way to look at Entropy

    I like that way of looking at it.
    It is a good picture----trying to get energy by putting a PV cell inside a box of thermal radiation.

    It shows that the *frequency* at which the energy is delivered matters;
    that is sort of a measure of quality, like temperature (in fact frequency and temperature are proportional, via kT/hbar, in some situations)

    so having a lot of energy does not always matter as much as having some *available* out-of-equilibrium high-grade energy

    Like having the photons in the box be all > 1 eV

    1 eV is roughly the energy of the average photon in sunlight

    the light in the box would be like sunlight with half the photons or so magicked away---the lower quality half. It would be a special high quality light, good for getting energy from. Interesting image.
     
  14. Jun 14, 2003 #13

    jeff

    Science Advisor

    This is a statistical mechanical issue. Consider a particulate system whose macrostates, defined in terms of bulk thermodynamic properties - pressure, volume, etc. - are, as you'd expect, determined by the detailed behaviour, or states, of the particles. The joint state of all the particles at a given moment is said to represent the current microstate of the system.

    The key thing to note here is that there will in general be many microstates that can produce a given macrostate, and according to statistical mechanics, for a given macrostate, the more microstates that can produce it, the greater its disorder.

    The entropy S of a macrostate is defined in terms of the number Ω of microstates that can produce it by the formula S = k ln Ω, in which k is Boltzmann's constant. Because Ω - being a dimensionless, purely statistical quantity - is observer-independent, so is entropy, i.e. disorder.

    For general physical systems, the more featureless a macrostate, the greater is Ω, and thus its entropy: nondescript piles of identical glass shards are more disordered, and thus have higher entropy, than wine glasses, because there are more ways to arrange the shards to make such piles than to make wine glasses.

    A system is in a state of stable equilibrium when Ω can no longer be increased by purely natural processes.
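
    (A toy version of this counting, using the usual coin-flip stand-in for a particulate system: the macrostate is the number of heads, Ω is a binomial coefficient, and the featureless 50/50 macrostate has the most microstates and hence the highest S = k ln Ω. Take k = 1.)

        from math import comb, log

        N = 100                         # coins, standing in for particles
        for n in (0, 25, 50):           # macrostates: n heads out of N
            print(n, log(comb(N, n)))   # S = ln(Omega); peaks at n = 50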
     
    Last edited: Jun 15, 2003
  15. Jun 15, 2003 #14

    russ_watters

    Staff: Mentor

    Marcus, a clarification here: it requires exactly the same amount of energy to raise 10 cc of water by 1 degree C as it does to raise 1 cc of water by 10 degrees C.
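
    (In numbers, taking the specific heat of water as about 4.186 J per gram per degree C and 1 cc of water as 1 g:)

        c = 4.186           # J/(g*K), specific heat of water (approximate)
        print(10 * c * 1)   # 10 g (~10 cc) raised 1 C: ~41.9 J
        print(1 * c * 10)   # 1 g raised 10 C: the same ~41.9 J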
     
  16. Jun 16, 2003 #15
    OK, again I must ask, how does one define "purely natural processes"?

    Before many of you replied, I was going to say: isn't entropy just a matter of probabilities/statistics?

    One gave the example of sorting balls by color in a jar, then shaking it. The probability that they'll end up sorted by color is very low. Well, the probability that they'll end up in any other particular arrangement - say, alternating one of each color, repeating through the whole set of colors - is also very low.

    Isn't this all just a matter of what we consider "ordered"? What if we considered "random" ordered?

    Or is that colored ball example not a good example? :)

    Is entropy really just a matter of probabilities then?
     
  17. Jun 16, 2003 #16

    chroot

    Staff Emeritus
    Science Advisor

    This is a common misunderstanding about probability.

    The chance of a roulette wheel spinning red is 1/2. The chance of it spinning black is 1/2. (Ignore the green zero for now.)

    The chance of it spinning red, then red again is 1/4. The chance of it spinning black, then black again, is also 1/4. The same goes for red, then black and black, then red.

    The chance of any particular 1000-spin sequence is 1 in 2^1000. The probability of 1000 blacks in a row is no different from the probability of any other combination of 1000 spins.

    People seem to think a run of 10 black spins in a row is somehow spooky, but it's no more unlikely than any other pattern.
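
    (You can check this by brute force on an idealized fair wheel. A sketch: count how often the "all black" pattern shows up in ten spins, and compare with 1/2^10. Any other specific target sequence gives the same answer.)

        import random

        random.seed(1)                   # reproducible (arbitrary seed)
        trials = 200_000
        target = list("BBBBBBBBBB")      # looks "special", but isn't
        hits = sum([random.choice("RB") for _ in range(10)] == target
                   for _ in range(trials))
        print(hits / trials, 1 / 2**10)  # both come out near 0.00098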

    - Warren
     
  18. Jun 16, 2003 #17

    jeff

    Science Advisor

    A purely natural process is one in which there is no intelligent intervention, so that its behaviour is described by physics alone.

    As for whether this is all just a matter of what we consider "ordered": theories are man-made and therefore always a matter of definition. It's just that some definitions work better than others.

    And is entropy really just a matter of probabilities? Yes.
     
  19. Jun 17, 2003 #18

    russ_watters

    Staff: Mentor

    And the second most common misunderstanding about probability is the "gambler's fallacy." If you hit black 9 times in a row, most people will put their money on red for the next spin - but though the probability of hitting black 10 times in a row is 1/1024, the probability for that 10th spin alone is still 1/2. Too bad you don't bet against each other in roulette - you could win a lot of money just by understanding a little statistics and psychology.
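
    (Easy to check by simulation on an idealized fair wheel: condition on 9 blacks in a row, then look at the 10th spin.)

        import random

        random.seed(2)                  # reproducible (arbitrary seed)
        runs = blacks_on_10th = 0
        while runs < 1000:              # collect 1000 of the rare 9-black streaks
            spins = [random.choice("RB") for _ in range(10)]
            if spins[:9] == ["B"] * 9:
                runs += 1
                blacks_on_10th += spins[9] == "B"
        print(blacks_on_10th / runs)    # ~0.5: the wheel has no memory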
     