
I Some confusion re entropy

  1. Jun 14, 2018 #1

    Buzz Bloom

    Gold Member

    I find my self quite confused about some aspects of the concept of entropy. I will try to explain my confusion using a sequence of examples.

    All of the references I cite are from Wikipedia.

    Ex 1. Does the following isolated system have a calculable value for its entropy?

    Assume a spherical volume of radius R1 containing N1 moles of hydrogen H2 molecules at a temperature T1 (Kelvin), at which the hydrogen has a gaseous state (except perhaps for an extremely small fraction of molecules). Also assume that the boundary of the sphere consists of a perfect insulator which maintains the temperature at T1, and that sufficient time has elapsed for the system to be in a state of equilibrium. To the extent that H2 is close to being an ideal gas, the temperature, volume, and pressure are related by
    https://en.wikipedia.org/wiki/Ideal_gas_law
    PV=nRT
    where P, V and T are the pressure, volume and absolute temperature; n is the number of moles of gas; and R is the ideal gas constant.
    The volume is
    V1 = (4/3) π R1^3.​
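As a quick numerical sketch of these two relations (the values of n1, R1, and T1 below are hypothetical, since Example 1 leaves them unspecified):

```python
from math import pi

# Hypothetical values for the Example 1 system (the post leaves them unspecified)
n1 = 1.0         # moles of H2
R1 = 1.0         # sphere radius, meters
T1 = 300.0       # temperature, kelvin
R_gas = 8.314    # ideal gas constant, J/(mol K)

V1 = (4.0 / 3.0) * pi * R1**3   # volume of the sphere, m^3
P1 = n1 * R_gas * T1 / V1       # pressure from PV = nRT, in pascals

print(f"V1 = {V1:.3f} m^3, P1 = {P1:.1f} Pa")
```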
    https://en.wikipedia.org/wiki/Entropy
    In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB.
    My understanding is that the answer to (1) is YES. My confusion at this point is that the concept of "microstate" is unclear. Here is a definition.
    https://en.wikipedia.org/wiki/Microstate_(statistical_mechanics)
    Treatments on statistical mechanics, define a macrostate as follows: a particular set of values of energy, the number of particles, and the volume of an isolated thermodynamic system is said to specify a particular macrostate of it. In this description, microstates appear as different possible ways the system can achieve a particular macrostate.​

    This definition does not help me understand how many microstates exist in the (1) system. One interpretation might be that because the (1) system is assumed to be in equilibrium, there is only one microstate, the equilibrium state, and therefore its entropy is zero. This would imply that for all non-equilibrium macrostates the entropy is always negative, since as the system moves closer to equilibrium its entropy will increase towards zero. (Deeper considerations that I plan to discuss later have convinced me that this interpretation makes plausible sense, although I have no confidence that it is correct.)

    If my YES answer above is incorrect and/or my interpretation that the entropy of an equilibrium system is zero is wrong, I would much appreciate someone helping me understand the correct answers, especially regarding microstates.
     
  3. Jun 14, 2018 #2

    Grinkle

    Gold Member

    @PeterDonis referred me to some statistical mechanics reading a while back, and based on what I have studied so far and recollect -

    The entropy is determined by the count of microstates that would result in a given macrostate (it is proportional to the logarithm of that count). Any real system is in only one such microstate at any given instant, but that does not mean it has an entropy of zero.
     
  4. Jun 14, 2018 #3

    anorlunda

    Staff: Mentor

    Doesn't the coin example from the same Wikipedia article that you linked help?


    https://en.wikipedia.org/wiki/Microstate_(statistical_mechanics)

    [Figure from the article: the possible microstates (H/T sequences) of a set of tossed coins]

    With 32 coins, you have ##2^{32}## microstates, but still only one macrostate with all H and one with all T.

    With gas in a room, the number of microstates grows combinatorially with the number of gas particles. Now consider the thermal-equilibrium macrostate and how many different ways there are to arrange those particles that satisfy that state.
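The coin counting can be checked directly; a minimal sketch using the Python standard library:

```python
from math import comb

n_coins = 32
total_microstates = 2 ** n_coins   # every distinct heads/tails sequence

# Multiplicity of a macrostate "k heads": the number of microstates realizing it
omega = {k: comb(n_coins, k) for k in (0, 16, 32)}

print(total_microstates)    # 4294967296 microstates in all
print(omega[0], omega[32])  # exactly 1 microstate each for all-T and all-H
print(omega[16])            # 601080390 microstates for 16 heads, 16 tails
```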
     
    Last edited by a moderator: Jun 14, 2018
  5. Jun 14, 2018 #4

    Buzz Bloom

    Gold Member

    Hi anorlunda:

    Thank you for your response. I may be dense, but I am unable to make a reasonable guess about how to count the microstates for the Example 1 system. I can count the number of H2 molecules, but all I know now is that each definitely does not have just the two microstates of heads and tails. How many energy levels are relevant? There are an infinite number of possibilities, but that can be made finite by arbitrarily setting a finite range for each defined energy "microstate". Is that the way the counting of energy microstates should be done? If so, what criteria should be used to set these ranges? Also, what about a molecule's position? It seems to be a similar problem as with energy. If a larger volume is assumed, does that mean that there are more position states? (BTW, the examples I expect to introduce in a few days will suggest that this is not so about volume, but I have little confidence this reasoning is correct.)

    Regards,
    Buzz
     
  6. Jun 14, 2018 #5

    anorlunda

    Staff: Mentor

    Let me recommend an excellent way to learn. It is a course on Statistical Mechanics taught by Leonard Susskind.
    He is a good teacher, the math is not very difficult, and the videos are available free on Youtube. In the first 45 minutes of lecture 1, he explains entropy in principle. In later lectures he does it for a thermodynamic gas.

    That is a much better way to learn than a simple question-and-answer of a few sentences on an Internet forum.

     
  7. Jun 14, 2018 #6

    Buzz Bloom

    Gold Member

    Hi Grinkle:

    Thank you for your response. I get the concept of the above quote. My "guess" that a system which is in a state of equilibrium has only one microstate is intended to mean that this single microstate is its state at all times, and it does not change with time. This means there is only one microstate range for energy to exist in (the entire range from zero to infinity), and also only one microstate range for where a molecule is (the entire volume). I hope to present an explanation for this odd interpretation when I prepare my next examples.

    Regards,
    Buzz
     
  8. Jun 14, 2018 #7

    Buzz Bloom

    Gold Member

    Hi anorlunda:

    Thank you for your suggestion. I will take some time to look at this lecture. However, I do not expect it to help me understand the confusion I now have, which leads me to the odd interpretation I presented (one microstate, with some minimum elaboration in my post #6). My deep confusion comes from considering a more complex example involving a hypothetical GR-based finite spatially curved universe in thermodynamic equilibrium with only protons and electrons as matter particles (no neutrons). More details to come later.

    Regards,
    Buzz
     
  9. Jun 14, 2018 #8

    Grinkle

    Gold Member

    No, it means that the thermodynamic state of the system is not changing with time. But since there are many microstates that are all equivalent with respect to the thermodynamic state, the system can and does change from one microstate to another. For instance, a gas that is at equilibrium still has its molecules moving around. This movement does not change any of the thermodynamic state variables of the overall system if it is at thermodynamic equilibrium with its environment.
     
  10. Jun 14, 2018 #9

    Buzz Bloom

    Gold Member

    Hi Grinkle:

    If I abandon my interpretation and accept (for the purpose of this discussion) that the microstate does change with time, can you help me qualitatively understand the consequences of a simple change to Example 1?

    Ex 2. The system for this example is identical to system (1) with the exception of the following changes:
    R2 = 2 R1
    T2 = (1/2) T1
    V2 = 8 V1
    Using the Ideal Gas Law to calculate approximate values for pressure P:
    P2 ~= (1/16) P1
    Does system (2) have more, or less, or the same entropy as compared with system (1)? I understand that this question may not have an answer based on the laws of thermodynamics, but if that is the case, then I would like to understand the reason why it does not.
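For what it's worth, the standard ideal-gas entropy-change formula ΔS = n Cv ln(T2/T1) + n R ln(V2/V1) gives a definite answer for the change described above. A sketch, assuming one hypothetical mole and treating H2 as a diatomic ideal gas with Cv = (5/2)R:

```python
from math import log

R_gas = 8.314        # ideal gas constant, J/(mol K)
n = 1.0              # moles (hypothetical)
Cv = 2.5 * R_gas     # molar heat capacity of a diatomic ideal gas (rigid rotor)

T_ratio = 0.5        # T2/T1
V_ratio = 8.0        # V2/V1 = (2*R1)^3 / R1^3

# Entropy change between two equilibrium states of an ideal gas:
dS = n * Cv * log(T_ratio) + n * R_gas * log(V_ratio)

print(f"dS = {dS:.3f} J/K")   # positive, so system (2) has more entropy
```

Under these assumptions ΔS = 0.5 n R ln 2 > 0: the volume term outweighs the temperature term, so the two entropies are not equal.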

    When I consider the system I will describe at a later time, involving a hypothetical universe (briefly introduced in my post #7) a violation of the second law occurs unless the above answer is "the same".

    The Wikipedia presentation of the 2nd law omits any statement regarding reversible processes:
    https://en.wikipedia.org/wiki/Laws_of_thermodynamics
    Second law of thermodynamics: In a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems increases. Equivalently, perpetual motion machines of the second kind (machines that spontaneously convert thermal energy into mechanical work) are impossible.​
    The universe example I am thinking about involves a single dynamic system undergoing a reversible equilibrium process, so I found another reference.
    https://www.google.com/search?q=the...+wiki&ie=utf-8&oe=utf-8&client=firefox-b-1-ab
    The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time. The total entropy can remain constant in ideal cases where the system is in a steady state (equilibrium), or is undergoing a reversible process.​

    Perhaps this is a sufficient clue about where I am headed with the universe example.

    Regards,
    Buzz
     
  11. Jun 14, 2018 #10

    jartsa
    One of the microstates of your system is: all molecules packed at the middle.

    That microstate looks like a non-equilibrium state, and it looks like a low entropy state too. But the odd looking microstate has the same energy, number of particles and volume as the more normal looking microstates. So the odd looking state is a microstate of the same macrostate as the normal looking microstates.

    Does the system have low entropy when all molecules are packed at one point? No. A large number of microstates is available to the system, so the entropy is large.

    Is the system in equilibrium when all molecules are packed on one point? No.

    So don't say your system is in equilibrium, if you are planning to calculate entropy by counting all available microstates of your system. Or don't assume the system stays in equilibrium.
     
  12. Jun 14, 2018 #11

    Grinkle

    Gold Member

    I don't think so. Any two state variables will fix entropy (and all other thermodynamic state variables) for an ideal gas. You definitely changed volume, and you did something I don't understand the implications of by changing temperature and the ideal gas constant inversely at the same time. I can't say for sure that you changed the thermodynamic state, because it's been too long since my undergrad thermo class, but I suspect you did.

    Edit:

    Also, I should note that your changes are macrostate changes, not microstate changes.
     
  13. Jun 14, 2018 #12

    Grinkle

    Gold Member

    I don't see where you are headed.
     
  14. Jun 14, 2018 #13

    Stephen Tashi

    Science Advisor

    The only way you can have a finite number of microstates is to divide position, velocity, and energy into a finite number of "compartments", each of finite size. So if you are considering position, velocity, and energy to be continuous quantities, you don't have a finite number of microstates. The general pattern of thermodynamic arguments based on microstates is that certain formulae are true when we use a finite number of microstates, and we take the limit of these formulae as the finite size of the compartments approaches zero to recover the thermodynamic laws.
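A toy illustration of why this limiting procedure works: the entropy computed from position "compartments" of side δ diverges as δ → 0, but entropy *differences* between macrostates do not depend on δ. (A sketch that counts only positions, ignores indistinguishability and momentum, and uses made-up numbers.)

```python
from math import log

N = 100   # toy particle count

def position_entropy(V, delta):
    """S/kB when each particle can sit in any of V/delta^3 cells:
    Omega = (V/delta**3)**N, so S/kB = N*log(V/delta**3)."""
    return N * log(V / delta**3)

for delta in (1e-2, 1e-4, 1e-6):
    S_small = position_entropy(1.0, delta)   # volume 1 m^3
    S_big = position_entropy(8.0, delta)     # volume 8 m^3
    # Each entropy blows up as delta shrinks, but the difference stays N*ln(8)
    print(delta, S_small, S_big - S_small)
```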
     
  15. Jun 14, 2018 #14

    Nugatory


    Staff: Mentor

    That's not the right interpretation. Equilibrium is one macrostate: we have a static cloud of gas with ##PV=nRT##.

    A microstate is what you have when you specify the position and velocity of each and every molecule in the gas - and note @Stephen Tashi's post above about discretizing these continuous variables and then taking the limit as the compartment size goes to zero. Each molecule has six degrees of freedom (position and velocity along three axes), and if any of these change (for example, two molecules collide changing their direction of motion) the system moves from one microstate to another. Clearly a huge number of microstates are consistent with any macrostate; and the system will continuously evolve from one microstate to another as the molecules bounce around at random. The number of microstates consistent with the equilibrium macrostate is so enormous that statistically the system will always be in one of these microstates, hence will stay in equilibrium.

    Don't write off the coin example mentioned by @anorlunda too quickly. The coins have one degree of freedom with only two possible values (heads or tails) instead of six degrees of freedom with many possible values, but the logic is the same. A macrostate is something like "There are fifteen heads and seventeen tails" and a microstate is something like "coin one is heads, coin two is tails, coin three is.....". Random shaking means we'll get a random microstate, but the odds heavily favor always being in a macrostate with roughly equal numbers of heads and tails.
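The "random shaking" claim is easy to simulate; a sketch:

```python
import random

random.seed(0)
n_coins, n_shakes = 32, 10_000

# Shake all 32 coins repeatedly and record the macrostate (number of heads)
heads = [sum(random.randint(0, 1) for _ in range(n_coins))
         for _ in range(n_shakes)]

# Fraction of shakes landing in the "roughly half heads" band of macrostates
near_half = sum(1 for h in heads if 10 <= h <= 22) / n_shakes
print(near_half)   # very close to 1: extreme macrostates are essentially never seen
```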
     
  16. Jun 14, 2018 #15

    Buzz Bloom

    Gold Member

    Hi Grinkle:

    I do not understand your statement which I quoted above. You seem to be saying:
    (a) you do not think that system (2) has greater entropy than system (1),
    AND
    (b) you do not think that system (2) has less entropy than system (1),
    AND
    (c) you do not think that system (2) has the same entropy as system (1).

    Is this your intention?

    My intention is that system (2) does not have the same macrostate as system (1). The gas constant R is not changed. I used the equation
    PV = nRT.​
    If we put this into the form
    P = nRT/V,​
    then
    P2/P1 = (T2/T1) / (V2/V1).​
    The assumptions I specified for the system (2) R and T variables in terms of the system (1) R and T variables lead to the given V ratio, and the ideal gas law then leads to the ratio for the P variable.

    Systems (1) and (2) are both in an equilibrium state, but it is not the same macrostate. I have in the back of my mind that it should be possible to define a reversible process by which system (1) can be transformed into system (2), and back again. If this is so, then this would imply (by the 2nd law statement as I quoted it from google.com) that the entropy of system (1) and the entropy of system (2) are the same.

    I hope this clarifies what was unclear to you.

    I apologize for suggesting that the "hint" I indicated might be sufficient to suggest the role of the universe examples I will describe in a few days. What I bolded above is my reason for intending to post examples about a hypothetical universe. The examples will be analogues of system (1) and (2) involving the hypothetical universe. The analogy will attempt to provide a reason to believe that there is a theoretical possibility that there is a reversible process between (1) and (2).

    Regards,
    Buzz
     
  17. Jun 14, 2018 #16

    Buzz Bloom

    Gold Member

    Hi Stephen and Nugatory:

    Thank you for your posts. I do understand the general process of establishing a series of finite ranges for the position and velocity of each molecule (see my post #4), and also that the prediction method of statistical mechanics takes the limit of this series (as the ranges converge to zero) when calculating predictions.

    The problem that arises for me for systems (1) and (2) is the question of comparing their respective entropy values. (See my post #9.) If the values are the same (which I tend to believe, and will explain why in later posts), then how is it possible for the method of using finite ranges for position and velocity to result in equal numbers of microstates for two systems which have different values for temperature and volume?

    Regards,
    Buzz
     
    Last edited: Jun 14, 2018
  18. Jun 14, 2018 #17

    Buzz Bloom

    Gold Member

    Hi jartsa:

    Thank you for your post. I do not mean to nit pick, but this example of a microstate introduces some factors that I think need to be taken into account. As I understand the Statistical Mechanics methods, each microstate has a probability of occurrence associated with it, and if the probability of a state is extremely small compared with others, then for the purposes of making predictions, this microstate can be ignored as insignificant. If my understanding is correct, then your example seems to be of this type.

    Regards,
    Buzz
     
  19. Jun 14, 2018 #18
    jartsa
    Well, every microstate has the same probability, for the same simple reason that every lottery result has the same probability.

    I claimed that all molecules packed together is a state with large entropy (when there is a lot of available volume for the molecules). Well, the fluctuation theorem seems to take the view that that is a situation where the entropy has fluctuated to a low value: https://en.wikipedia.org/wiki/Fluctuation_theorem.
     
    Last edited: Jun 14, 2018
  20. Jun 15, 2018 #19

    Grinkle

    Gold Member

    I see. A process that does not change the entropy of the system is called "isentropic". Reversibility is one requirement and no heat transfer across the system boundary is another requirement. The no-heat-transfer part is "adiabatic". You sent me back to my old thermo textbook. If you Google isentropic process, you will get the background you are looking for I think.

    Edit: And I was meaning to say in my previous post that I don't think the two entropies are the same, sorry for the garbled answer. If you are still interested in that example after looking at isentropic on the internet, I will give you a more thought out response.
     
    Last edited: Jun 15, 2018
  21. Jun 15, 2018 #20

    Buzz Bloom

    Gold Member

    Hi jartsa:

    I should have thought of that concept, but it slipped my mind. However, a conceptual difficulty still remains. Are there restrictions on the way microstates are counted for a particular choice of variable ranges? For example:
    https://en.wikipedia.org/wiki/Molecule#Molecular_size
    The smallest molecule is the diatomic hydrogen (H2), with a bond length of 0.74 Å.​
    https://en.wikipedia.org/wiki/Ångström
    [An] angstrom [Å] is a unit of length equal to 10−10 m.​
    Consider a cube with 1 meter edges. In angstrom units, the volume is 10^30 Å^3. Suppose we choose a position microstate for a particular H2 molecule as being a specific one of the 10^30 small cubes, each with a 1 Å edge. For the sake of simplification, let us also assume that each small cube can hold in its interior exactly one H2 molecule.

    https://en.wikipedia.org/wiki/Avogadro_constant
    The Avogadro constant ... has the value 6.022140857(74)×10^23 mol^−1.​
    Therefore one mole of H2 consists of approximately 6.0×10^23 molecules.

    https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
    The entropy S is defined as
    S = kB ln ⁡ Ω
    where kB is Boltzmann's constant and Ω is the number of microstates consistent with the given macrostate.
    This implies that the number of positional microstates would be (approximately)
    Ω = C(10^30, 6×10^23)
    = 10^30! / ((6×10^23)! × (10^30 − 6×10^23)!)

    Unfortunately I do not have a tool to use to conveniently calculate this hypothetical approximate value of Ω. However, I hope you can tell me if the approach I have presented seems in principle to be correct.
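For what it's worth, ln Ω for numbers of this size can be evaluated with log-gamma functions, which never form the huge factorials explicitly. A sketch (taking one mole ≈ 6.022×10^23 molecules distributed over 10^30 one-angstrom cells):

```python
from math import lgamma

def ln_binomial(M, N):
    """ln C(M, N) = ln M! - ln N! - ln (M-N)!, via the log-gamma function."""
    return lgamma(M + 1) - lgamma(N + 1) - lgamma(M - N + 1)

M = 1e30       # number of 1-angstrom cells in a cubic meter
N = 6.022e23   # H2 molecules in one mole

ln_omega = ln_binomial(M, N)
k_B = 1.380649e-23             # Boltzmann constant, J/K

print(ln_omega)                # about 9.2e24
print(k_B * ln_omega)          # S = kB ln(Omega), roughly 130 J/K
```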

    If this is OK, then what happens when we take a step in the process of making the granularity of the position microstates smaller? If we make the small cubes have a side of 0.1 Å, the number of possible positions increases by a factor of 1000, BUT now each possible location is too small to hold any H2 molecule. So how does the concept of calculating the limit actually work as the size of the positional microstates converges to zero?

    Regards,
    Buzz
     