
Second law of thermodynamics

  1. Feb 9, 2012 #1
    I am interested in finding as many unitless representations of the second law of thermodynamics as possible. Other than gravity, it is the physical law that we experience most in daily life. To a greater or lesser extent it can be used to describe everything that controls our existence. I'm looking for as simple an equation as possible to express that entropy strives toward a maximum. I have a couple of equations in mind, but I would like to see what some fellow nerds can come up with.
     
  3. Feb 10, 2012 #2

    Andrew Mason

    Science Advisor
    Homework Helper

    Your premise is incorrect. Entropy does not strive toward a maximum. There is no maximum. All we can say is that in any theoretical process, the entropy of the universe cannot decrease and that in any real process, the entropy of a system and its surroundings will increase. Therefore, the entropy of the universe is always increasing.

    Also, entropy has units of energy/temperature so I am not sure how you can mathematically represent entropy without using units.

    AM
     
  4. Feb 10, 2012 #3
    "the energy of the world is constant; the entropy of the world strives toward a maximum"
    Clausius
    I'm sure you can find, as I have, multiple textbooks, both old and new, that use the exact phrase "entropy strives toward a maximum," so don't start a war over semantics when you knew what I was talking about.
    "Your premise is incorrect. Entropy does not strive toward a maximum. There is no maximum"
    Your argument does not invalidate mine. Any system has, or can have, a theoretical maximum; it is, however, a matter of the resolution of measurement.
    But I should have been clearer about what I mean by a unitless equation. E=mc[itex]^{2}[/itex] is the equation for mass-energy equivalence, yet c is a dimensional constant; it does nothing to convey to someone else's mind that matter can be converted to energy and vice versa, only the ratio of the units in an actual calculation, namely the speed of light squared.
    In my search for a simple representation of the second law of thermodynamics I didn't want to involve units of temperature, heat flow, etc. I didn't want to use
    [itex] \oint \frac{\delta Q}{T} \leq 0. [/itex]
    or even
    [itex] dS = \frac{\delta Q}{T} \! [/itex]
    I want to use the idea of entropy to define irreversibility without using a dictionary. I want a mathematical image that shows entropy always increases.
    Planck's symbolism for this idea was [itex]S - S' \geq 0[/itex]
    I'm looking for something simple like this, where S represents entropy but is otherwise not very useful for calculations.
     
  5. Feb 10, 2012 #4
    Doesn't that necessarily entail that the universe has surroundings?
     
  6. Feb 10, 2012 #5

    Ken G

    Gold Member

    I think the most useful concept of entropy is the natural log of the number of configurations of the system, counted within some allowed class or set of constraints that control the entropy. Often this gets multiplied by the Boltzmann constant k, but that's just because T has arbitrarily chosen units that bring in k. There's no reason not to measure T in energy units, and then S is unitless and you can just drop the k.
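    To make that concrete, here is a minimal sketch of the unitless entropy S = ln Ω (a toy model of two-state particles of my own choosing, not anything from this thread):

    [code]
    import math

    def unitless_entropy(num_states: int) -> float:
        """Entropy in natural units: S = ln(Omega), no Boltzmann constant."""
        return math.log(num_states)

    # Toy system: N two-state particles ("spins"), n of them in the "up" state.
    # The number of configurations consistent with that macrostate is C(N, n).
    N, n = 100, 50
    omega = math.comb(N, n)
    print(f"Omega = {omega}")
    print(f"S = ln(Omega) = {unitless_entropy(omega):.3f}")  # dimensionless
    # Multiplying by k would convert this back to conventional units of energy/temperature.
    [/code]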
     
  7. Feb 10, 2012 #6

    Andrew Mason

    Science Advisor
    Homework Helper

    I thought you were talking about entropy approaching a maximum. Just because Clausius said it does not mean it is right. Clausius also said the energy of the universe is constant.
    Why can't the entropy of a system increase without limit?

    AM
     
  8. Feb 11, 2012 #7
    That's exactly what I thought I had said, or at least what I hoped would be understood. What makes infinity not a maximum? By the strictest use of the word, maximum does imply a limit. In my mind, with any real system the entropy is going to be closer to infinity than to zero (infinitely closer), so the maximum becomes infinity.
    When you replied to my original post with that statement, I understood (or perhaps better, misunderstood) it to mean that you had placed a limit on entropy.
     
  9. Feb 11, 2012 #8

    Ken G

    Gold Member

    There are many situations where entropy does reach a maximum, subject to whatever constraints are in play. This is how things like the Maxwell-Boltzmann velocity distribution or the Planck spectrum are derived.
     
  10. Feb 11, 2012 #9
    Are there any universal time equations?
     
  11. Feb 11, 2012 #10
    Or any universal time equation?
     
  12. Feb 11, 2012 #11
    I'm sorry, I don't understand why you're doing this.

    Some fellow nerds have already come up with it; it's called S = k ln Ω. It is extremely simple and relates the counting of microstates to temperature and the entire concept of thermal equilibrium. That is pretty appealing. Why would you waste your time searching for an aesthetically pleasing way of saying the same thing, especially when the way it is already said is extremely simple and illuminating? What could be a better way to express entropy than an equation from which, with suitable assumptions, you can derive the precious ideal gas law? An equation that fully describes how heat flows from hot to cold and how systems come to equilibrium?

    Using this equation you can predict the most likely state of a system and how much more likely it is than any other macrostate. Then, given assumptions about how many times you measure a system per unit time, you can determine how many measurements it would take for the system to go through all accessible microstates. What is more telling than that?

    I am not saying there is no theoretical work to be done in the area of irreversibility or the arrow of time, simply that you might as well study the phenomena and not try to rearrange symbols out of some personal infatuation with entropy... what are you trying to make, a tattoo or something? ...
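    As a quick illustration of "predicting the most likely macrostate and how much more likely it is", here is a sketch with a toy model of N two-state particles (my own example, not anything from the thread; assumes Python 3.8+ for math.comb):

    [code]
    import math

    # Toy model: N independent two-state particles. A macrostate is labelled by n,
    # the number of particles in the "up" state; its multiplicity is Omega(n) = C(N, n),
    # and the unitless entropy is S(n) = ln Omega(n).
    N = 1000
    omega = [math.comb(N, n) for n in range(N + 1)]
    entropy = [math.log(w) for w in omega]

    n_star = max(range(N + 1), key=lambda n: omega[n])   # most likely macrostate
    print(f"Most likely macrostate: n = {n_star}, S = {entropy[n_star]:.1f}")

    # How much more likely is it than a macrostate shifted by 5% of N?
    other = n_star + N // 20
    ratio = math.exp(entropy[n_star] - entropy[other])   # Omega ratio from the entropy difference
    print(f"Omega({n_star}) / Omega({other}) is about {ratio:.3g}")
    [/code]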
     
  13. Feb 12, 2012 #12

    Andrew Mason

    Science Advisor
    Homework Helper

    Entropy is a quantity whose measurement requires an equilibrium state. The Maxwell-Boltzmann distribution describes the distribution of molecular speeds only in an equilibrium state. Similarly, the Planck distribution describes the energy distribution of photons in thermal equilibrium. So I don't see how entropy is used to derive those distributions.

    Maximum entropy for a closed system constrained to a certain volume will be achieved when all parts of the system are in complete equilibrium with each other and there is no internal source or sink of energy. At that point, energy will not flow within the system and, since it is closed to the rest of the universe and has fixed volume, it cannot exchange work or heat with anything else, so nothing will happen. That will be a state of maximum entropy for that system.

    AM
     
  14. Feb 12, 2012 #13

    Ken G

    Gold Member

    Just start with the definition of entropy and maximize it, subject to the total energy available, and you get M-B without ever mentioning temperature or thermal equilibrium. That's the only difference: if you derive the M-B distribution from thermal equilibrium, you are specifying the temperature; if you derive it from maximizing entropy, you are specifying the total energy. Put differently, if you have a gas in a box that is completely insulated from its surroundings, and the gas has a given internal energy, you can derive the Maxwell-Boltzmann distribution for that gas simply by maximizing its entropy subject to its internal energy. No T, no thermal equilibrium, just entropy.

    But it will be the equilibrium state, because by the second law it has nowhere else to go once it reaches maximum entropy. Ergo, saying that a gas "seeks maximum entropy" is tantamount to saying it "reaches equilibrium", and hence the two concepts are very close, and both are important to the usefulness of thermodynamics. If you change the conditions, the maximum entropy will change, so we cannot say systems "seek maximum entropy" in some kind of absolute way, but we can say that this does indeed tend to happen given the specific constraints in place.
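    Here is a minimal numerical sketch of that constrained maximization (my own illustration with made-up discrete energy levels, using Python/SciPy, not anything from this thread). Maximizing the unitless entropy -Σ p ln p at fixed mean energy returns the Boltzmann form, with 1/kT appearing only as the Lagrange multiplier of the energy constraint:

    [code]
    import numpy as np
    from scipy.optimize import minimize

    # Discrete single-particle energy levels (arbitrary units) and a prescribed mean energy.
    E = np.arange(10, dtype=float)   # E_i = 0, 1, ..., 9
    U = 2.0                          # fixed average energy per particle

    def neg_entropy(p):
        # Gibbs/Shannon entropy per particle is H = -sum p ln p; we minimize its negative.
        p = np.clip(p, 1e-12, None)
        return np.sum(p * np.log(p))

    constraints = [
        {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},   # normalization
        {"type": "eq", "fun": lambda p: np.dot(p, E) - U},  # fixed mean energy
    ]
    p0 = np.full(E.size, 1.0 / E.size)
    res = minimize(neg_entropy, p0, method="SLSQP",
                   bounds=[(0, 1)] * E.size, constraints=constraints)
    p = res.x

    # The maximizer should be (approximately) the Boltzmann form p_i proportional to exp(-beta*E_i),
    # i.e. ln p_i linear in E_i. Read beta off the two most populated levels.
    beta = np.log(p[0] / p[1]) / (E[1] - E[0])
    print("p =", np.round(p, 4))
    print("fitted beta (1/kT in these energy units):", round(float(beta), 4))
    [/code]

    The Maxwell-Boltzmann speed distribution follows the same logic, with the continuous kinetic-energy spectrum and the density of velocity states in place of the discrete levels.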
     
    Last edited: Feb 12, 2012
  15. Feb 12, 2012 #14

    Rap


    I agree with Ken G. A good way to look at entropy is to divide the thermodynamic entropy S by Boltzmann's constant k, which gives a dimensionless entropy [itex]H=S/k[/itex] equal to Shannon's information entropy of the system. In the second law, the [itex]T\,dS[/itex] term is replaced by [itex](kT)\,dH[/itex], where [itex]kT[/itex] is a new temperature scale with units of energy. By doing this, you get rid of the rather artificial temperature units, and the Boltzmann constant is eliminated in favor of the new, energy-valued temperature scale.
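    A trivial numerical check of that bookkeeping (my own numbers, not from the post), showing that T dS and (kT) dH are the same energy:

    [code]
    # H = S/k is dimensionless; kT carries the units of energy instead.
    k = 1.380649e-23      # Boltzmann constant, J/K (exact by definition)

    S = 2.5e-21           # some entropy change, J/K
    T = 300.0             # temperature, K

    H = S / k             # unitless (Shannon-style) entropy change
    kT = k * T            # energy-valued temperature, J

    print(H)              # dimensionless, about 181
    print(T * S, kT * H)  # identical energies: T*dS == (kT)*dH
    [/code]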

    Now Boltzmann's famous equation [itex]S=k\ln W[/itex] becomes [itex]H=\ln W[/itex], where W is the number of different microstates a system could possibly be in that would exhibit the same macroscopic parameters as the state you are looking at. What this equation says is that, if you use base-2 logarithms, the info-entropy is equal to the average number of yes/no questions you would have to ask in order to determine the microstate of the system, given that you know the macrostate (i.e. temperature, pressure, etc.). Shannon's definition of information entropy is basically just that: the information entropy (H) is the amount of missing information, and the amount of missing information is the average number of yes/no questions you have to ask to recover that missing information.

    Consider a digitized 256x256 image whose pixels are either black or white. If the left half is black and the right half is white, how many ways can this happen? One. Now suppose you have blurry vision and cannot distinguish down to the pixel level, only to about a 4x4 box; then there are more ways, millions maybe, that you couldn't tell apart. But if you are looking at a picture that is flat grey, how many ways now? An enormous number, something like 10^19 different arrangements that could give you flat grey with your blurry vision. If the pixels in the half-black, half-white picture start changing randomly, the picture will start to turn flat grey, and the number of ways climbs toward that maximum. The entropy increases.

    In the thermodynamic system, just as with this example, you cannot see the individual molecules, your macroscopic equipment "blurs" the system you are looking at, only being able to measure temperature, pressure, etc. for a small volume containing many molecules, not the individual molecular energies. The collisions between the molecules cause each molecule to change its energy, just as the pixels started randomly changing their color. Eventually the system you are looking at can be represented by a huge number of possible energies of the individual molecules. Just as the picture goes grey and the number of possibilities becomes huge, so the gas temperature, pressure, density go flat and the number of possible ways becomes huge. The entropy increases.
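    Here is a quick numerical version of that counting argument (a much smaller toy image of my own, not the 256x256 one above), where the "blur" is taken to the extreme that the observer only sees the overall grey level:

    [code]
    import math

    # Toy image: 16x16 black/white pixels; the observer only perceives the
    # fraction of black pixels (the macrostate), not their positions.
    N = 16 * 16   # total pixels

    def missing_info_bits(n_black: int) -> float:
        """Base-2 info-entropy H = log2(W): the average number of yes/no
        questions needed to pin down which pixel arrangement (microstate)
        produced the observed grey level (macrostate)."""
        W = math.comb(N, n_black)   # microstates consistent with the macrostate
        return math.log2(W)

    print(missing_info_bits(0))       # all white: W = 1, zero questions needed
    print(missing_info_bits(N // 2))  # "flat grey": W is huge, ~252 questions
    [/code]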
     
    Last edited: Feb 12, 2012
  16. Feb 12, 2012 #15

    Ken G

    Gold Member

    That's a very nice description of information entropy, thank you. A crucial point you make is that the entropy is not a physical entity: it depends on what we claim to know about the system, and on what we are choosing to treat as unknown (or effectively unknowable).
     
    Last edited: Feb 12, 2012
  17. Feb 12, 2012 #16

    Rap


    I get worried about saying it is not a physical entity. We can envision various different ways of measuring the same system, and we will come up with different values of entropy, each of which is valid, so entropy is not on the same footing as temperature, pressure, etc. However, the differential dS does not change, as long as things don't get too microscopic (your vision does not come too close to perfect in the digital-picture analogy), so it has more "physicality" than the entropy S. I also have trouble intuitively understanding [itex]dU=(kT)dH[/itex] (assuming constant volume and number of particles). I have trouble understanding the meaning of (kT) in this formulation, how the amount of missing information yields the internal energy. I can do the math, the whole derivation of Boltzmann, etc., and it all makes mathematical sense to me. I understand kT is twice the energy per degree of freedom, etc., but I feel like I still don't intuitively get it. I mean, if H is the number of yes/no questions, then kT is the energy per question. I'm having trouble with that.
     
  18. Feb 12, 2012 #17
    The process of increasing entropy is an irreversible adiabatic process.
     
  19. Feb 13, 2012 #18

    Ken G

    Gold Member

    The way I see it, it's not that the missing information yields the internal energy, it is that the former is how we can understand the presence of the latter. The fundamental rule is that missing information can be cast in terms of a number of equally likely states, the counting of which quantifies the missing information, as you so clearly explained. But the number of equally likely states also connects with the likelihood the system will find itself in that class of states, and that in turn connects with the affinity of a system to draw energy from a reservoir.

    It is the reservoir, not the system, that brings in the concept of kT-- kT means the energy that the reservoir "covets." By that I mean, if you add kT of energy to a reservoir, you increase by e the number of equally likely states that the reservoir has access to. This is really the meaning of T. Interestingly, it doesn't matter how big the reservoir is-- a lump of coal or a planet, if both are at 1000 K, will "covet" the same energy kT, and will both have their number of states multiplied by e if they get kT of energy. So given that reservoirs covet energy in this sense, they are also loathe to part with it, but they can be coaxed into parting with kT of energy if some other system can have its number of accessible states increase by more than the factor e by receiving that energy.

    The net result will be an increase in the number of accessible states for the combined system, and so by sheer probability, this is more likely to happen. Heat will continue to cross out of the reservoir and into the system until the next kT of energy only increases the number of states in the system by the factor e (or more correctly, the next dQ increases the number of states by only the factor 1+dQ/kT), at which point we have equilibrium because the number of total states cannot be increased any more (nor can the entropy, as you point out). So what this all means is, there is a connection between the number of questions you need to ask to pinpoint the particular state of the system out of the full class it might be in, and the fact that a big full class has a proportionately high probability of being belonged to. The place where the internal energy comes in is that the more states the system can gain access to, the better it is at drawing energy from the reservoir, to maximize the total number of combined states, and thus also maximizing the number of questions you'd need to answer to cull out the actual state from the class of possibilities.

    I hope you now see that the reason for this is that each question you need to answer represents the presence of states that perfectly offset the loss of those states by the reservoir when it loses its "coveted" energy (the total number of states being the product, not the sum, of the possible states in each component). So it's all about maximizing the combined number of states that the full system+reservoir has access to. The reason I said it depends on what we know, rather than on something physical outside of us, is that the actual state of the combined system is always just one thing; it is only by classifying it and grouping it with indistinguishably similar states that we come upon the concept of entropy and the concept of the probability of belonging to that classification group. But you're right, the energy is there, that much is physical-- it is the explanation for why that energy is there that depends on how we classify things. The "real physical reason" the energy is there must depend on microphysics that we are simply not tracking, not on entropy. But the entropy is a story we can tell, based on what we are tracking, that can be used to determine how much energy will come across from the reservoir, via microphysics that is not in our story but is the real physical reason for that energy being there (if there is such a thing as a real physical reason).
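    Here is a small numerical sketch of that "maximize the combined number of states" picture (my own toy model of two Einstein solids sharing energy quanta, not anything from the post):

    [code]
    import math

    # Two Einstein solids (call one the "reservoir") share q_total quanta of energy.
    # Multiplicity of N oscillators holding q quanta: Omega = C(q + N - 1, q).
    def omega(N: int, q: int) -> int:
        return math.comb(q + N - 1, q)

    N_A, N_B, q_total = 300, 200, 100

    # ln of the combined multiplicity (the product of the two state counts).
    def total_entropy(q_A: int) -> float:
        return math.log(omega(N_A, q_A)) + math.log(omega(N_B, q_total - q_A))

    best = max(range(q_total + 1), key=total_entropy)
    print("most probable split of quanta:", best, "/", q_total - best)

    # Near that split, moving one quantum either way barely changes the total count:
    # each side "covets" the next bit of energy equally (equal temperature), so heat stops flowing.
    print(round(total_entropy(best) - total_entropy(best - 1), 4),
          round(total_entropy(best) - total_entropy(best + 1), 4))
    [/code]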
     
    Last edited: Feb 13, 2012
  20. Feb 13, 2012 #19

    Rap


    Ok, I will have to read that more than once and think about it. Give me a few days :)
     
  21. Feb 16, 2012 #20

    Rap


    In my mind, I say: ok, suppose we have a reservoir at temperature T and two systems, large (L) and small (S), each at a somewhat lower temperature T'. If I transfer kT of energy to the small system, [itex]dU_S=kT=kT'dH_S[/itex]. If I had transferred it to the larger system, [itex]dU_L=kT=kT'dH_L[/itex], so it looks like the dH's are the same, equal to T/T' (which is larger than one question, but the same for both).

    I don't see how the larger system is "better at drawing energy from the reservoir". Am I misinterpreting your statement?
     