
What is entropy, really?

  1. Aug 29, 2009 #1
    This must have been posted on here before, but I can't find any reference to it.

    I've had to learn a little biochemistry related to my work. This led me to realise that I knew little chemistry, and set me out to learn some chemistry. Chemical reactions are significantly dependent, according to what I read, on entropy. This got me looking at entropy again, which I remember vaguely from school.

    Some books try to explain entropy in terms of order/disorder. This seems a bit of a poor explanation to me. For example, they show a box with two gases in it separated by a barrier: the barrier is removed, and the gases mix. Thus order -> disorder, and the entropy increases. This still seems to beg the question of what "order" is. Also it helps not one whit when you try to define Gibbs free energy in terms of entropy.

    An attempt I came across to tighten up the gas-dispersion explanation of entropy stated that it was a movement from a low-probability state to a high-probability state. Of course, in this example this is nonsense, because any particular arrangement of gas particles is just as probable as any other: we only get a "higher probability" because we have grouped a large number of particular arrangements into a single category, so a category with a large number of arrangements ("dispersed") has a higher probability than a category with a smaller number of arrangements ("compacted"). We can arbitrarily create any kind of categorization we like to describe arrangements of gas particles. Thus entropy and chemical reactions would depend on what categories we choose to define. This seems very improbable, or at least a very confusing explanation.
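
    (To make that counting concrete, here is a minimal sketch in Python, with a toy particle count I have picked purely for illustration. Every individual arrangement really is equally likely; the "dispersed" category simply contains astronomically more arrangements than the "all on one side" category.)

    Code:
from math import comb

# Toy model: N labelled gas particles, each independently in the left or
# right half of the box, so there are 2**N equally likely arrangements.
N = 100
total = 2 ** N

# Category "all particles in the left half": exactly one arrangement.
all_left = 1

# Category "roughly even split": between 45 and 55 particles on the left.
roughly_even = sum(comb(N, k) for k in range(45, 56))

print(f"P(all on the left)   = {all_left / total:.3e}")
print(f"P(45-55 on the left) = {roughly_even / total:.3f}")
# Each single arrangement has the same probability; the "dispersed"
# category just contains vastly more of them, which is all that
# "higher probability" means here.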

    Then there was the attempt to explain it in terms of information. We have good information on where the gas particles are at the start, the instant the barrier is removed. As the two mingle, we have less information on the locations of the gas particles of one kind relative to the gas particles of the other kind. This is really just the same as the order/probability explanation given before, but in different terminology. Still the same problem arises: does a chemical or physical reaction depend on how much knowledge we have about a system? Surely chemical and physical events occur even if we don't know about them. And even if we had perfect knowledge, the reaction would still happen. We in fact do have pretty good knowledge of the reaction that occurs when a chromosome replicates itself: we know the structure of the chromosome, and the intermediate molecules that read it and construct its copy. Our near-perfect knowledge has no effect on the reaction.

    So we'll drop this order/probability/knowledge analogy, unless someone explains it better to me.
     
  3. Aug 29, 2009 #2
    Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

    If a chemical reaction is happening in a test tube and you describe what is going on by looking at the test tube and taking some measurements, then the relevant quantity is the number of microstates compatible with that imprecise macroscopic data.
     
  4. Aug 29, 2009 #3
    I've been reading Keeler and Wothers' Why Chemical Reactions Happen.

    Then they bounce immediately into explaining entropy mathematically, in relation to heat. Of course, if we describe heat as roughly a measurement of the kinetic energy, or alternatively, the momentum of the movement of particles, there is a connection with the order/disorder explanations, but it is a bit tenuous.

    So nobody seems to want to define entropy S itself in precise terms. However, they are happy to define the change in entropy dS in terms of the heat dQ absorbed and the temperature T:

    dS = dQ / T

    Fine. So entropy is measured in joules per kelvin and is a measure of the heat absorbed. Then they, and people posting on Wikipedia and elsewhere, happily substitute dS for dQ/T and vice versa.
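
    (To make that relation concrete, a minimal sketch, using the rough textbook figure of 334 J/g for the latent heat of fusion of ice, which is absorbed while the temperature stays fixed at 273.15 K:)

    Code:
# Entropy change for a process at constant temperature: dS = dQ / T,
# so for a finite amount of heat Q absorbed at a fixed T, delta_S = Q / T.
latent_heat = 334.0   # J per gram, heat of fusion of ice (approximate)
T_melt = 273.15       # K, melting point of ice; T stays constant during melting
mass = 10.0           # grams of ice, an arbitrary example amount

Q = mass * latent_heat        # total heat absorbed, in joules
delta_S = Q / T_melt          # entropy change of the ice, in joules per kelvin

print(f"Q = {Q:.0f} J, delta_S = {delta_S:.2f} J/K")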

    How is this related to the specific heat capacity, which I remember studying in school? The specific heat capacity of a substance is the heat energy in joules required to raise a certain mass of the substance by a certain number of degrees. (Per kilogram per degree, when I did it, but you could use moles or Fahrenheit or whatever, I suppose.) The specific heat "c" for a body of mass "m" is given by

    Q = mc * dT

    So therefore

    mc = Q / dT

    and the heat capacity mc is of course measured in joules per kelvin as well. If you take the mass to be a single unit (one kilo or one mole) and also take the change in temperature to be a single unit, then the specific heat capacity would seem to equal the entropy. What is the actual difference? Is there one?

    Also, we were given tables at school that implied that specific heat capacity was constant at different temperatures, although it would seem perfectly reasonable to me if it were to change as the temperature changes, even to change for different substances in different ways.

    The equation we had for entropy assumes a constant temperature during heat loss or gain, which is not particularly plausible, except for infinitesimal changes. The specific heat capacity equation assumes that the temperature changes as the substance gains and loses heat. This seems far more reasonable. However, it makes entropy seem an even more confusing and ill-defined concept.
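
    (If I try to combine the two equations myself, applying dS = dQ/T in infinitesimal steps with dQ = m*c*dT, the entropy change comes out as mc*ln(T2/T1) rather than anything that looks like the specific heat itself; a quick numerical check, with illustrative numbers for water that I picked:)

    Code:
from math import log

# Heating a body whose temperature changes: apply dS = dQ/T step by step.
# With dQ = m*c*dT (constant specific heat assumed), the steps add up to
#   delta_S = m*c*ln(T2/T1).
m = 1.0         # kg of water, an arbitrary example
c = 4186.0      # J/(kg K), approximate specific heat of liquid water
T1, T2 = 293.15, 353.15   # heat from 20 C to 80 C

# Closed form from the integral of m*c*dT / T:
exact = m * c * log(T2 / T1)

# The same thing summed numerically in many small steps:
steps = 100_000
dT = (T2 - T1) / steps
numerical = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print(f"delta_S (closed form) = {exact:.2f} J/K")
print(f"delta_S (summed)      = {numerical:.2f} J/K")
# Both the heat capacity m*c and the entropy change come out in J/K, but
# they are different quantities: delta_S depends on the temperatures
# through the logarithm, not just on the size of the temperature rise.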
     
  5. Aug 29, 2009 #4
    That would mean that if I made more precise measurements - say, a kind of Hamiltonian vector of all the particles (I remember Hamiltonian vectors from Penrose's The Emperor's New Mind) - then the chemical or physical status of the reaction would be different than if I just took a general measurement of its temperature, volume and density? I must be misunderstanding you.
     
  6. Aug 29, 2009 #5

    diazona (Homework Helper)

    You raise a good point. What they don't usually tell you is that the categories are determined by a certain set of thermodynamic properties.

    Basically, any system can be in one of a large number of possible microstates. A microstate is just a particular arrangement of the particles in the system. Now, each microstate has a certain pressure P, temperature T, and number of particles N, and you can group the microstates by the values of those three variables, so that all the microstates in each group have the same values of P, T, and N. These groups, or macrostates, are the "categories" you're thinking of. By measuring the pressure, temperature, and number of particles (well in practice usually you'd measure something else, like volume, and then calculate number of particles), you can tell which macrostate a system is in, and the number of microstates in that macrostate determines the state's entropy.

    Why do those particular variables, and no others, determine the groups? Well, actually you can use certain other variables, like volume in place of number of particles as I mentioned. But you still need 3 variables which are related to P, T, and N. I think the reasoning there is that the 3 variables represent the 3 ways in which a system can interact with its environment: it can transfer heat, it can do work, or it can gain or lose particles. And it stands to reason that from the point of view of the environment, the state of a system is fully defined by how it can interact - beyond that, what's really going on inside the system doesn't matter.
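
    (To make the microstate counting concrete, a toy sketch, assuming a "lattice gas" in which N indistinguishable particles sit on V cells, so a macrostate fixed by N and V has Omega = C(V, N) microstates and S = k*ln(Omega). Doubling the available volume then roughly reproduces the familiar delta_S = N*k*ln(2) of the free-expansion example from the first post:)

    Code:
from math import comb, log

k_B = 1.380649e-23   # J/K, Boltzmann constant

def entropy(n_particles, n_cells):
    """Toy lattice gas: Omega = number of ways to place n_particles
    indistinguishable particles on n_cells cells; S = k*ln(Omega)."""
    omega = comb(n_cells, n_particles)
    return k_B * log(omega)

N = 50
S_confined = entropy(N, 1000)   # gas confined to half the box (1000 cells)
S_expanded = entropy(N, 2000)   # barrier removed: twice as many cells

print(f"delta_S   = {S_expanded - S_confined:.3e} J/K")
print(f"N*k*ln(2) = {N * k_B * log(2):.3e} J/K")  # close, since N << cells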
     
    Last edited: Aug 29, 2009
  7. Aug 29, 2009 #6
    Let me put it another way. I have test-tube A and test-tube B, which contain substances X and Y. I measure them in whatever way, and discover that the particles are more randomly distributed in B than in A: in A the X particles tend to be higher up the tube, say. Therefore, the mixture in test-tube B has higher entropy than that in test-tube A. Fine.

    Then, by using some super measurement, I discover that there is a precise mathematical distribution, perhaps some kind of Fibonacci sequence, that determines the exact position and movement of the particles in test-tube B: I wasn't aware of this before. So then, suddenly, test-tube A now has the higher entropy. Again, the problem is that we are making the chemical reaction dependent on my knowledge of the substances involved.

    Which brings me to the same point: unless you have a precise technical definition of "microstate", to define the exact microstate of a system I have to describe all the positions/momenta of the particles in it. This is exactly the same quantity of information, no matter what state the system is in.
     
  8. Aug 29, 2009 #7
    That's false. A microstate does not have a temperature at all.
     
  9. Aug 29, 2009 #8
    Thanks, that, and the rest of your post makes sense to me. I can see how that connects with the 2nd law of thermodynamics, both when stated with and without reference to entropy.

    Of course, the pressure, temperature and number of particles of the system won't necessarily change in the spreading gas example, or in the other favourite example people use, the melting ice cube, but I'll just put that down to misleading examples, and follow yours because it makes sense. So entropy is a relationship between a macrostate and a theoretical (but I presume practically incalculable) number of microstates.

    How does this relate to heat transfer and Gibbs energy then? Or to the mathematical definition of entropy as a quantity of heat absorbed at a certain temperature?
     
  10. Aug 29, 2009 #9
    so what actually is a "microstate" then?
     
  11. Aug 29, 2009 #10

    If you know the exact state of the system, and if it is an isolated system, you could simply predict the state it will be in some time later.


    A microstate of a closed system is the exact quantum state of the system. If you have one particle in a box, then it will have certain energy levels in which it can be, just like an electron of the hydrogen atom. If you specify the energy of the particle in the box with some small uncertainty, then it can be in any energy level that falls within that uncertainty. The amount of information you need to specify exactly which one of these energy levels the particle really is in is proportional to the logarithm of this number (the logarithm will be proportional to the number of digits of this huge number).
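
    (A sketch of that counting for a single particle in a box, using the standard 1D box levels E_n = n^2 h^2 / (8 m L^2); the box size and the size of the energy window are just numbers I picked so that the count comes out finite:)

    Code:
from math import log

h = 6.626e-34     # J s, Planck constant
m = 6.646e-27     # kg, mass of a helium atom
L = 1e-2          # m, a 1 cm box
k_B = 1.381e-23   # J/K, Boltzmann constant

# Suppose we only know the energy to lie in a narrow window around E0.
E0 = 1.5 * k_B * 300      # roughly a thermal energy at 300 K
dE = 0.001 * E0           # our measurement uncertainty

# E_n = n^2 h^2 / (8 m L^2), so the levels inside [E0, E0 + dE] are those
# with n between sqrt(8 m L^2 E0) / h and sqrt(8 m L^2 (E0 + dE)) / h.
n_low = int((8 * m * L**2 * E0) ** 0.5 / h)
n_high = int((8 * m * L**2 * (E0 + dE)) ** 0.5 / h)
omega = n_high - n_low    # number of accessible microstates

print(f"accessible levels: {omega}")
print(f"S = k*ln(omega)  = {k_B * log(omega):.3e} J/K")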

    Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.
     
  13. Aug 29, 2009 #12
    Don't those two sentences contradict each other? Unless you have a definition of "accessible"? Does "accessible" just mean "that we can predict"?

    If I know something about the status of a system at time t, and it is changing randomly, then at time t+t' I know less about it, unless I make new measurements. That is obviously true, but it can't be what you mean. It would imply that if someone had taken measurements before me, then the entropy of the system would be higher for them than it is for me.

    So if we were measuring two different substances, and then the two substances reacted, the entropy of the two systems could be different for each of us, which would imply that we would each observe a different reaction. I am sure this is not what you mean.
     
  15. Aug 29, 2009 #14
    What I'm looking for is a definition of entropy that does not use the term "entropy" in trying to define itself, which was why I was happier with the dS = dQ/T definition, even if it didn't make much sense on reflection (as T could not remain constant under such conditions).
     
    Last edited by a moderator: Apr 24, 2017
  16. Aug 29, 2009 #15

    Well, S = k log(Omega), where Omega is the number of accessible microstates, is the definition of entropy for a system in thermal equilibrium. You can derive the relation dS = dq/T from that definition, not the other way around.
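
    (Sketch of that derivation, in the usual statistical-mechanics way, with Omega(E) the number of accessible microstates at energy E. Temperature is defined through

    1/T = dS/dE (with volume and particle number held fixed)

    so if the system absorbs a small amount of heat dq while doing no work, its energy changes by dE = dq, and therefore

    dS = (dS/dE) dE = dq/T

    which is the thermodynamic relation you started from.)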
     
  17. Aug 29, 2009 #16
    Accessible means that the state is a possible state the system can be in, given what we know about the system. If the energy of a state does not fall within the known limits of the total energy of the isolated system then, due to conservation of energy, that state is not an accessible state.

    The fundamental postulate of statistical physics is that all accessible states are a priori equally likely. If a system is in equilibrium, then it can be found in each of these accessible states with equal chance. If a system is not in equilibrium, then it is more likely to be in some accessible states than in others.

    E.g. suppose you have a gas that was constrained to move in one part of an enclosure and you remove that constraint. Immediately after the constraint has been removed the gas has not had the time to move into the other part. So, not all of the states are equally likely. To say that they are a priori equally likely means that if you are given a system in which the gas can move freely as in this example and you are not told anything about its history, then all the states are equally likely. But the conditional probability distribution over all the accessible states, given that a constraint has just been removed, is not a uniform distribution.
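
    (A toy simulation of exactly that relaxation, just to illustrate: start with every particle on the left, give each one a small chance per time step of crossing the now-open partition, and watch the conditional distribution drift towards the uniform one:)

    Code:
import random

random.seed(0)
N = 1000          # number of gas particles
p_cross = 0.1     # chance per step that a particle crosses the open partition
on_left = [True] * N   # constraint just removed: everything starts on the left

for step in range(26):
    if step % 5 == 0:
        print(f"step {step:2d}: fraction on the left = {sum(on_left) / N:.3f}")
    # Each particle independently either stays put or crosses over.
    on_left = [(not side) if random.random() < p_cross else side
               for side in on_left]
# The fraction settles around 1/2: once the memory of the initial constraint
# has faded, the states with the gas spread over both halves dominate.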
     
  18. Aug 29, 2009 #17
    You are well named Count Iblis, the charm and devilry of your explanations does you credit.

    I think we can avoid going into the explanation of eigenstates and the adiabatic theorem of quantum mechanics. The maths is entertaining. I notice, though, that that definition depends directly on the total energy of the system, which brings us back to the definition that entropy is just a measurement of energy gain.

    That definition is really equivalent to the definition that dia... whatever his/her name was... gave: that it depends on the relationship between the microstates and the macrostate. The equation defines the macrostate in terms of the total energy in the system - presumably this would be the theoretical energy required to raise the system from absolute zero to its current temperature, although there could be other interesting complications - while dia... whatever defined it in terms of... was it pressure, temperature, and number of particles? It would ultimately amount to the same thing.

    Entropy was originally created to describe the mechanics of steam engines, and I think we can more or less keep it at that level.
     
  19. Aug 29, 2009 #18
    I'm still very charmed.

    The a priori equal likelihood of states is the assumption I'm making too.

    If the entropy depends on our knowledge of the history of the system, then two observers who had different knowledge could observe different reactions.

    Saying that it depends on knowledge is not the same as saying that it depends on the energy level of the system, and energy transfer between one system (or one part of a system) and another, which is what your fancy equation said.
     
  20. Aug 29, 2009 #19
    So if I have a chart that says, inter alia, regarding the values of absolute entropy at 298 K:

    He (g): 126 J K^-1 mol^-1

    This means that another system brought into contact with He at 298 K will gain 126 J per K of difference in temperature per mol of He. Is that correct?
     
  21. Aug 29, 2009 #20
    But then, this site

    http://www2.ucdsb.on.ca/tiss/stretton/Database/Specific_Heat_Capacity_Table.html

    says that the specific heat capacity of He gas at 26 C (= 299 K??) is 5.3 J per gram... per degree K or C...

    And according to http://www.ausetute.com.au/moledefs.html, a mole of He is 4.003 g,

    SO

    If a (cooler) substance is brought into contact with He gas at 299 K, it will gain 5.3 x 4.003 = 21.2159 J per degree of difference in temperature per mole.

    I'm doing something wrong: what is the relationship between specific heat and entropy then? If any?
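
    (For what it's worth, here is how I suspect the two numbers are related, sketched with the ideal-gas value Cp = (5/2)R for helium, which roughly matches the 5.3 J/(g K) x 4.003 g/mol above. The heat capacity gives the entropy gained per small temperature step via dS = Cp*dT/T, while the tabulated 126 J K^-1 mol^-1 is, roughly speaking, the total of all those contributions accumulated in bringing a mole of He from near absolute zero up to 298 K; it is not a heat flow per degree of temperature difference.)

    Code:
from math import log

R = 8.314        # J/(mol K), gas constant
Cp = 2.5 * R     # about 20.8 J/(mol K), molar heat capacity of He gas at constant pressure

# Heat needed per degree of temperature rise (per mole): dQ = Cp * dT.
# Entropy gained in that same step:  dS = dQ / T = Cp * dT / T,
# so heating He from T1 to T2 changes its entropy by Cp * ln(T2/T1).
T1, T2 = 298.0, 398.0
delta_S = Cp * log(T2 / T1)

print(f"delta_S for heating 1 mol of He from {T1:.0f} K to {T2:.0f} K: "
      f"{delta_S:.1f} J/(mol K)")
# This comes out to about 6 J/(mol K) - a change in the tabulated absolute
# entropy, not 126 J per degree of temperature difference.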
     
    Last edited by a moderator: May 4, 2017