
What is entropy

  1. Apr 8, 2013 #1
    In my thermodynamics classes I have to deal with entropy, but what does it really mean? I know it is described as the degree of disorder, and I know the equation ΔS = q/T (for heat q transferred reversibly at temperature T). But what is this disorder? How can a physical system be "aware" of order and disorder, since these are abstract concepts? They say that solids are more ordered than liquids and thus have lower entropy. Does entropy have any relation to the van der Waals forces and the potential energy of the system, which dominate more in solids than in liquids? Why should a system try to increase its entropy? Is it an attempt to reduce potential energy by keeping particles arranged in a distinct manner? Please help. Thanks in advance.
  3. Apr 8, 2013 #2
    Rather than "disorder", think of entropy as the logarithm of the number of microstates available to a macrostate (the Boltzmann formula S = k_B ln Ω). You can forget the logarithm for intuition; it's there to help with calculations. Roughly speaking, entropy measures how many microstates are available to your macrostate, and it is a function of the macrostate you are considering.
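    A quick sketch of that counting idea (my own toy model, not from the post above): treat N labelled coins as the system, with the macrostate given by the number of heads. The microstate count is a binomial coefficient, and entropy in units of k_B is its logarithm.

```python
import math

# Toy model: a "macrostate" of N labelled coins is the number of heads k;
# the microstates are the individual arrangements that realise it.
def entropy_per_kB(N, k):
    omega = math.comb(N, k)   # number of microstates for this macrostate
    return math.log(omega)    # entropy in units of k_B

# The all-heads macrostate has a single microstate, hence zero entropy;
# the half-and-half macrostate has by far the most.
print(entropy_per_kB(10, 10))   # ln(1) = 0.0
print(entropy_per_kB(10, 5))    # ln(252) ≈ 5.53
```

    This also shows why "mixed" macrostates dominate: the microstate count is sharply peaked around k = N/2.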
  4. Apr 8, 2013 #3
    Wow... that's confusing!

    It makes sense that fluids have higher entropy because their molecules have more freedom of motion. Think about how in a solid the atoms are packed close to each other, so there's less motion.

    I don't know if that's the correct approach but it's the one that made sense to me.
  5. Apr 8, 2013 #4
    Well, thinking about it purely from a macroscopic perspective, liquids and gases have higher entropy than solids. Water mixes easily with other liquids, and so does air: whenever you open the window, the air rushes toward an equilibrium temperature, so the cold air comes in and the hot air goes out until they are perfectly mixed.

    Subatomic particles and quantum states have their own entropy, and from those states macroscopic things and phenomena get their corresponding states and properties, although the two need not look similar.

    Just as hot and cold air mix, with the hot air rushing toward the cold, all matter tends to move toward higher entropy, and energy is converted through different mechanisms so that not only particles but whole systems tend toward lower energy levels over time.
  6. Apr 8, 2013 #5
    one of the least confusing and most flexible definitions of entropy I've come across is the following:

    Entropy represents the diversity of internal movement of a system.

    It's from this site


    Personally, I would love to see the terms "order" and "disorder" struck from any and all definitions of entropy.

    Unfortunately, the above is not easy to carry over to information entropy, and I'm sure there are other usages of entropy that it does not describe well. I don't think there is a single good definition of entropy that covers all its uses, at least not one that is easily understood by any student of any academic field, or by a layman.
    Last edited: Apr 8, 2013
  7. Apr 8, 2013 #6
    Why should a system be aware of anything? No need to bring consciousness into it. Entropy as a statistical concept was defined in post #2, and that logic should lead you to why entropy tends to increase: if you put a system in a low-entropy state, then there are only a few microstates corresponding to whatever macrostate you have. But atoms in a gas or solid are always interacting, colliding, and transferring energy to each other. This happens so much that, over time, the entropy will increase to the point where there are many microstates corresponding to whatever macrostate you have.

    It's a purely statistical result. If you start with 100 bottles of beer on a wall and leave them in a storm, after a while you're going to have far fewer than 100. The macrostate "all bottles still on the wall" corresponds to a single microstate ("bottle 1 on the wall, bottle 2 on the wall, ..., bottle 100 on the wall"), so your system entropy is k_B ln(1) = 0.

    After 50 have blown off, your new macrostate "50 bottles left on the wall" corresponds to quite a lot of microstates: bottles 1 through 50 may have fallen off, or bottles 1 through 49 plus bottle 51, and so on. The number of microstates of this new macrostate is "100 choose 50", so your new entropy is k_B ln(100 choose 50) ≈ 66.8 k_B.

    Obviously this example is imperfect, because there is a clear tendency for bottles to fall rather than be picked back up and replaced. But in a system of gas particles, or a set of paramagnets, or any of the standard examples, there is no macroscopic agent (like the wind) forcing the bottles off the wall, just microscopic interactions that are completely random. And yet the system's macrostate will drift toward one with more and more corresponding microstates, so that entropy increases.

    The more I think about it, the worse that example was
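    For what it's worth, the "100 choose 50" guess in that post checks out numerically; a short calculation (mine, just to verify the figure quoted above) in units of k_B:

```python
import math

# Microstate count for the macrostate "50 of 100 labelled bottles
# remain on the wall" is the binomial coefficient 100 choose 50.
omega = math.comb(100, 50)
S_over_kB = math.log(omega)   # entropy in units of k_B

print(omega)        # ≈ 1.01e29 microstates
print(S_over_kB)    # ≈ 66.8
```

    So the entropy jumps from 0 to roughly 66.8 k_B, entirely because the new macrostate can be realised in vastly more ways.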
  8. Apr 8, 2013 #7
    Definition #2 is good but not easily understood. Take my situation of having to explain entropy to new apprentice boiler operators or instrumentation techs; that's where I found the definition I posted most useful. Granted, this is a scientific forum, but that does not mean we cannot simplify a definition so that it is easily understood by the layman, regardless of field of academic study.
  9. Apr 8, 2013 #8
    I'm not sure I agree with that article. Entropy is not heat capacity; loosely, it is the amount of heat per kelvin that is not available to do useful work. If your system has half its molecules at 100 K and half at 200 K, then its entropy is lower than that of a system with all its molecules at 150 K, because there are fewer microstates corresponding to the nonuniform case than to the uniform case. As a result, the nonuniform system can do more useful work than the uniform system, because a system in thermal equilibrium is in a state of maximum entropy / minimum free energy.
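    A small sketch of that point, under an assumption not in the post: model the two halves as bodies with equal, temperature-independent heat capacity C that exchange heat until they meet at the midpoint temperature. Integrating dS = C dT/T for each body shows the total entropy rises on equilibration.

```python
import math

# Assumed model: two bodies of equal constant heat capacity C equilibrate
# at T_f = (T1 + T2) / 2.  Each body's entropy change is C * ln(T_f / T_i),
# so the total change in units of C is:
def delta_S_over_C(T1, T2):
    Tf = (T1 + T2) / 2
    return math.log(Tf / T1) + math.log(Tf / T2)

# Equilibrating 100 K and 200 K increases total entropy, consistent with
# the nonuniform state having fewer microstates (and more work potential).
print(delta_S_over_C(100.0, 200.0))   # ln(1.125) ≈ 0.118 (in units of C)
```

    The sum simplifies to C ln(T_f² / (T1·T2)), which is positive whenever T1 ≠ T2, so equilibration always costs work potential.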
  10. Apr 17, 2013 #9
    Layman's explanation:

    If you do a really bad fart, it's almost certain that everyone in the room will smell it!
    The fart invariably diffuses.
    There are many more permutations of ways the fart can exist in a diffuse state than eventualities where it stays clumped together.

    The fart diffuses because of entropy.
    Entropy increases with time.
    Leave the room quickly after farting.