
B How can entropy increase in a deterministic universe

  1. Jan 12, 2017 #1
    Let's imagine a deterministic universe. A one where quantum mechanics simply doesn't apply. Ok.
    This was the universe of classical physics. Atoms exist, and they behave deterministically. Fine. Now, how can entropy increase in this universe, altough it has the same laws of physics. In a deterministic universe, the probability of all microstates is not equal, one is 100%, and the others are 0% chance. Since the universe is deterministic, that holds true. So the normal entropy explanation doesn't hold true. Then the question again rises, why does entropy increase and not decrease, if all microstates all not equally likelly?
     
  2. Jan 12, 2017 #2

    DrClaude

    Staff: Mentor

    Since the particles can exchange energy, the microstate is not fixed.
     
  3. Jan 12, 2017 #3

    Demystifier

    Science Advisor

    https://www.amazon.com/Entropy-Demystified-Second-Reduced-Common/dp/9812832254
     
    Last edited by a moderator: May 8, 2017
  4. Jan 12, 2017 #4
    How does that cause the entropy to go in the direction of increase?
     
  5. Jan 12, 2017 #5

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    Randomness, even within deterministic laws:

     
  6. Jan 12, 2017 #6
    How can randomness exist under deterministic laws? That is contradictory.
     
  7. Jan 12, 2017 #7

    stevendaryl

    Staff Emeritus
    Science Advisor

    Here's the way I think about it:

    In classical physics, the state of the system is determined by a point in phase space. For each particle [itex]j[/itex], you give an initial position [itex]\vec{r_j}[/itex] and an initial momentum [itex]\vec{p_j}[/itex]. That takes 6 numbers for each particle, so phase space is 6N dimensional, where N is the number of particles. If you know the system's location in phase space at one time, then you know it at all future times, so there is no nondeterminism (in classical physics, at least).

    However, if you're dealing with [itex]10^{23}[/itex] particles, it's impossible to actually know the positions and momenta of each particle. What you know instead is some coarse-grained state. For example, instead of knowing the location in phase space, you might know that the system is in some region [itex]R_1[/itex] in phase space. This region contains many different points in phase space. So suppose we ask the question: If we wait [itex]t[/itex] seconds, will the system be in a second region, [itex]R_2[/itex]? Well, some of the points in [itex]R_1[/itex] will make the transition to [itex]R_2[/itex] in that time, and some will not. So if all we know is that the system is somewhere in region [itex]R_1[/itex], then we can make at best probabilistic predictions about whether the system will be in region [itex]R_2[/itex] at time [itex]t[/itex].

    So the coarse-graining gives the appearance of nondeterminism. Now, that would not be a problem if the spread in phase space remained small for all time (or for long periods of time). But for many systems of interest, the uncertainty of where the system is located in phase space grows rapidly with time, so that quickly you get into the situation where you don't know anything about where the system is in phase space other than that it is at some point with known values of the conserved quantities: total energy, total momentum, total angular momentum, total charge, total number of particles of various types.
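
    To see how quickly the coarse-graining takes over, here is a minimal Python sketch (a toy of my own: the logistic map stands in for the real phase-space flow, and 1000 nearby points stand in for the region [itex]R_1[/itex]):

    [code]
    import random

    # Deterministic chaotic map standing in for the phase-space flow
    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    # 1000 points, all inside a tiny "coarse-grained" cell around x = 0.3
    points = [0.3 + random.uniform(-1e-9, 1e-9) for _ in range(1000)]

    for step in range(1, 51):
        points = [logistic(x) for x in points]
        if step % 10 == 0:
            print(f"step {step:2d}: spread = {max(points) - min(points):.2e}")

    # Every trajectory is exactly deterministic, yet the initially tiny cell
    # soon covers essentially the whole interval [0, 1], so coarse-grained
    # predictions can only be probabilistic.
    [/code]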
     
  8. Jan 12, 2017 #8

    Demystifier

    Science Advisor

    Did you ever flip a coin? Was its behavior deterministic or random?
     
  9. Jan 12, 2017 #9

    Demystifier

    Science Advisor

    Even with a small number of degrees of freedom one can lose apparent determinism when the system is sensitive to small changes in the initial conditions. That's how coin flipping works.
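
    A minimal sketch of the coin-flip point, with toy numbers of my own (a coin spun at angular speed w for a fixed flight time, landing on whichever face the total rotation angle dictates):

    [code]
    import math

    def face(w, t=0.5):
        """Deterministic toy coin: w rad/s of spin for t seconds of flight."""
        half_turns = int(w * t // math.pi)  # completed half-rotations
        return "heads" if half_turns % 2 == 0 else "tails"

    # Near a boundary between "heads" and "tails" bands, a change of
    # 0.02 rad/s in the initial spin flips the outcome.
    for w in (100.50, 100.52, 100.54, 100.56):
        print(f"w = {w:.2f} rad/s -> {face(w)}")
    [/code]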
     
  10. Jan 12, 2017 #10

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    No. Here's another example. Imagine you have a large tray of dice (let's say just 10 dice to make it easier). They all start showing "1". Then you shake the tray. If you had a perfect deterministic theory you would be able to predict at any point what each die reads. You might predict that at time ##t## you have:

    1-6-2-2-5-3-1-4-1-2

    And, a time later you would predict:

    2-4-5-5-1-6-3-3-2-5

    But, even though you have predicted this precisely (and let's say you are correct), as far as the values shown by the dice are concerned you have randomness. In other words, you deterministically predicted randomness.

    Or, perhaps to be slightly more precise, you have deterministically predicted disorder.
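
    For a concrete version of this (my own deterministic toy rule, not a physical model), each die below is updated by a fixed arithmetic step, so every reading is exactly predictable in advance, yet the faces look as disordered as random rolls:

    [code]
    # Each die's hidden state evolves by a fixed linear congruential step,
    # so the whole sequence is exactly predictable.
    def shake(state):
        return (1664525 * state + 1013904223) % 2**32

    states = list(range(1, 11))  # ten dice with distinct starting states
    for t in range(3):
        states = [shake(s) for s in states]
        faces = [s % 6 + 1 for s in states]
        print(f"t={t}: " + "-".join(map(str, faces)))
    [/code]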
     
    Last edited: Jan 12, 2017
  11. Jan 12, 2017 #11

    Stephen Tashi

    Science Advisor

    The answers to the OP, so far, are divided between two different explanations.
    1) Imprecise knowledge of the system's state(s) explains entropy increase
    2) Deterministic evolution of the system's microscopic state results in a situation where the same macroscopic state variables describe more and more "possible" microscopic states.

    It seems to me that explanation 2) either needs to appeal to explanation 1), or it needs to define the macroscopic state variables as averages over time or over a collection of different systems. If a macroscopic state variable is defined as a function of the instantaneous microscopic state of the individual particles in a system, then its value at time t would not describe more than one "possible" microstate, once "at time t" is included as part of the information in the macroscopic state. The "at time t" alone would specify only one possible state of a deterministic system whose initial condition was precisely known.
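
    Explanation 2) can at least be illustrated by counting microstates per macrostate (a sketch with assumed toy numbers; the macrostate is how many of N particles sit in the left half of a box):

    [code]
    from math import comb, log

    # Multiplicity W of the macrostate "n of N particles in the left half"
    # is the binomial coefficient C(N, n); take ln W as the entropy (k_B = 1).
    N = 100
    for n_left in (0, 25, 50):
        W = comb(N, n_left)
        print(f"{n_left:2d} of {N} on the left: ln W = {log(W):.1f}")
    # The even split has overwhelmingly more microstates than the ordered
    # macrostates, which is why generic dynamics drifts toward it.
    [/code]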
     
  12. Jan 12, 2017 #12

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    I'm not sure I understand that, but to borrow Mr Feynman's example: if you have plain and blue-dyed water, kept separate and then allowed to mix, the mixture will always become a roughly even shade of blue everywhere. There are no initial conditions that would allow the water to remain in defined blue and non-blue segments.

    Chaotic behaviour makes a deterministic solution impossible in practice. But even if we assume we can determine the outcome, it is always a roughly even mixture.

    Or, to put it another way, even in a deterministic system, once the waters have started to mix, there is no going back.
     
  13. Jan 12, 2017 #13

    Stephen Tashi

    Science Advisor

    The hypothesis in the original post is that we have a deterministic system. I think the OP intends to assume we know the initial conditions of the system. Of course, one way to answer the OP is to say that those assumptions are false. That seems to be what you assert. However, I'm curious if the OP's question can be answered if we assume those assumptions are true.

    Which definition of entropy are we using? If we use a definition of entropy that employs the concept of probability, we must force probability into the situation under consideration. I agree that Bayesian probability can be introduced into a deterministic situation by considering imperfect knowledge of initial conditions. Sensitivity of the trajectory of the system to initial conditions makes it attractive to introduce Bayesian probability into a model of the system.

    If we are going to exclude probability from the situation under consideration then (as far as I can see) we are talking about a measure of entropy defined in terms of the number of "possible" microstates that exist for a given value of some macrostate description of the system.

    It isn't clear to me whether in your post #10, you are asserting that we can't exclude probability from our model or whether your post #10 assumes a completely deterministic (and completely known) situation.
     
    Last edited: Jan 12, 2017
  14. Jan 12, 2017 #14

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    We can exclude both probability and chaos. Take a simpler system along the same lines as the water: say, 6 red and 6 blue cubes on a tray, initially arranged at rest with the blue on one side and the red on the other.

    Now, if we assume the tray is going to be moved according to some known but irregular pattern, then we can deterministically predict disorder, in the sense that we will end up with blue and red cubes mixed up.

    Without any unknowns or any chaotic behaviour we predict deterministically that disorder results.

    In a simple system you may be able to find precise sequences of movements that maintain order, but these movements have to be chosen precisely to match the existing conditions. There are games based on this: trying to steer balls into holes by tilting the game in just the right way. But unless the movements are coordinated with the current conditions, disorder results.

    In the water example, however, there would be no way to shake the container in order to tilt all the blue water to one side. There need be no randomness, per se, as any movement will only speed the disorder.
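
    A sketch of the tray idea (my own arbitrary but fixed stirring pattern, with the 12 cubes in a row rather than on a tray):

    [code]
    # 6 red and 6 blue cubes in a row, stirred by a known, fixed sequence of
    # swaps. Nothing is random or unknown, yet the colours mix.
    cubes = list("RRRRRRBBBBBB")

    def stir(cubes, step):
        out = cubes[:]
        for i in range(12):
            j = (5 * i + step) % 12      # fixed "irregular" swap pattern
            out[i], out[j] = out[j], out[i]
        return out

    print("".join(cubes))
    for step in range(1, 5):
        cubes = stir(cubes, step)
        print("".join(cubes))
    [/code]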
     
  15. Jan 12, 2017 #15

    Stephen Tashi

    Science Advisor

    How are you defining the entropy of that system? If the entropy at time n (step n) is defined as a function of the state of the system at step n then there are certain configurations of the system that have the maximum entropy. Are you asserting that no matter what deterministic laws the system follows, it will always move from a state of maximum entropy to another state of maximum entropy? I see no reason to believe that.

    In order to get your model to work, you have to introduce the notion of disorderliness into how the cubes move. So you are showing that entropy increases with time in a deterministic system whose deterministic laws are "disorderly". Now, can we define "disorderly" in a non-circular way? I.e., can we define "disorderly" deterministic laws to be distinct from "those deterministic laws that cause entropy to increase"?
     
  16. Jan 12, 2017 #16

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    We may be talking at cross purposes. To expand:

    By randomness I mean that an object may do one thing or another and what it will do under certain conditions is not known.

    By chaos I mean that the initial conditions cannot be determined accurately enough to predict what would happen to an object (especially after a passage of time where the state of the system has many interacting objects).

    The OP's point, I believe, is that if the laws of nature are deterministic there is no reason to suppose an evolution to a disordered state.

    In the water example, you could still have a completely deterministic model, with no randomness or chaos. The movement of every water molecule is known and there is a completely deterministic outcome.

    It would be possible, in theory, to specify initial conditions where the water did not mix, but these would have to be very precisely coordinated. Left to itself, and totally deterministically, (almost) any initial conditions for the water results in a mixing of the blue. There is no need to assume randomness or chaos for this. It happens naturally and deterministically for (almost) any initial conditions.

    Another example is mixing hot and cold water. Even with everything known and deterministic, it is impossible to specify initial conditions where the temperature does not equalise. You can't specify initial conditions for a litre of hot water and a litre of cold water, and a precise, coordinated means of mixing them, even molecule by molecule, that do not result in temperature equalisation.

    This aligns with the definition of entropy as a measure of thermodynamic equilibrium. Even without chaos or randomness, thermodynamic equilibrium naturally and deterministically results.
     
  17. Jan 12, 2017 #17

    Stephen Tashi

    Science Advisor

    And it would be possible to have deterministic laws where the water mixed and then unmixed - analogous to how repeated "perfect shuffles" bring a deck of cards back to its original state. (I think it would also be possible to have probabilistic laws where this has some probability of happening.)
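
    (The card claim is easy to check; a quick sketch for a standard 52-card deck and "out" shuffles, where the top card stays on top:)

    [code]
    # A perfect "out" riffle shuffle: cut the deck exactly in half and
    # interleave, keeping the original top card on top.
    def out_shuffle(deck):
        half = len(deck) // 2
        mixed = []
        for a, b in zip(deck[:half], deck[half:]):
            mixed += [a, b]
        return mixed

    deck = list(range(52))
    shuffled, count = out_shuffle(deck), 1
    while shuffled != deck:
        shuffled, count = out_shuffle(shuffled), count + 1
    print(f"deck restored after {count} perfect shuffles")  # prints 8
    [/code]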

    That is empirically true and the OP doesn't appear to be a second law doubter.

    (Your reference to "almost any" seems to introduce probability into the picture - as if we are no longer considering a deterministic physical system with given initial conditions, but instead are "randomly selecting" some initial conditions from a set of possible initial conditions.)

    Can the OP's question be answered in any non-empirical way? Can we give an answer of the form: "Yes, some mathematically possible deterministic laws would result in an up-and-down variation of entropy with respect to time, but the deterministic laws followed by nature are such that ..[?]... and, as a consequence, the entropy of real physical systems does not decrease with respect to time"?

    I agree - since the deterministic laws of heat transfer tell the system to equalize the temperature. This is what I mean by a circular explanation of how determinism can lead to an increase in entropy. If we assert that the deterministic laws of nature are defined by rules that tell them to increase entropy, then they obviously increase entropy.
     
  18. Jan 12, 2017 #18

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    The deterministic laws don't know they are increasing entropy: when two molecules collide they follow some deterministic law of molecular collision. What is not possible is to use these laws of molecular collision to avoid heat transfer. You would need different laws than conservation of energy and momentum. You would need a law that meant when two particles collided the one with the least energy would lose energy and the one with more energy would gain energy (or, at least, that no energy exchange was possible).

    But, the simple laws governing particle collisions are not in themselves defined by increasing entropy. Most obviously, if one particle is at rest and another is moving, there is no law of collision that can maintain this. Any collision must result in a gain of energy for the at-rest particle and a loss for the moving one.
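
    In 1-D, for example, the standard elastic-collision formulas make the energy exchange explicit (a sketch with made-up masses and speeds):

    [code]
    # 1-D elastic collision: outgoing velocities fixed by conservation of
    # momentum and kinetic energy.
    def elastic_1d(m1, v1, m2, v2):
        u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
        u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
        return u1, u2

    # Equal masses, one moving, one at rest: they simply swap velocities,
    # so the at-rest particle necessarily gains energy.
    u1, u2 = elastic_1d(1.0, 10.0, 1.0, 0.0)
    print(u1, u2)                # 0.0 10.0
    print(0.5 * 1.0 * u2 ** 2)   # 50.0: all the kinetic energy has moved over
    [/code]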

    My suggestion is to watch the Feynman lecture.
     
  19. Jan 12, 2017 #19

    Stephen Tashi

    Science Advisor

    Then, returning to my question:

    Can we give an answer of the form: "Yes, some mathematically possible deterministic laws would result in an up-and-down variation of entropy with respect to time, but the deterministic laws followed by nature are such that ..[?]... and, as a consequence, the entropy of real physical systems does not decrease with respect to time"?


    I agree that your suggested law is sufficient to decrease entropy, but couldn't other types of deterministic laws do it? For example, there could be laws with no general policy about which particle gained or lost energy. There could be a deterministic result, but a result that depended on each possible collision.

    A possible answer to the OP is:

    "Yes, some mathematically possible deterministic laws would result in an up-and-down variation of entropy with respect to time, but the deterministic laws followed by nature are such that the distribution of kinetic energy in the component parts of a system equalizes as time passes and, as a consequence, the entropy of real physical systems does not decrease with respect to time."

    To refine that explanation we would have to be clear about the relation between "entropy" of a system and "the distribution of kinetic energy" over that system.
     
  20. Jan 12, 2017 #20

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    I was only really making the point that although we have uncertainty at the quantum level and chaos at the macro level, neither of these is necessary for the increase in entropy.

    Quite what laws of nature would be required to prevent the increase in entropy I'm not sure.

    The other aspect, which is perhaps more important than the laws of particle collisions, is that thermally different substances would need to be physically kept apart. In the water example it's not enough that individual molecules retain their energy; something must also prevent high-energy and low-energy particles from getting mixed up.
     