Are Boltzmann's statistics compatible with a deterministic universe?

In summary, Boltzmann's statistics are compatible with a deterministic universe: the underlying kinetic theory is deterministic, and entropy increase is a matter of overwhelming probability rather than absolute necessity. One participant argues that modern computer simulations of an ideal gas render Boltzmann's probabilistic model obsolete and even show the laws of motion to be irreversible, a claim the other participants dispute. Boltzmann held that nature tends to go from less probable to more probable states, though strictly this applies only to macrostates far from the most probable one. Whether probability is a property of nature or only a measure of our ignorance, and whether it is intrinsic to quantum mechanics, remains a point of dispute in the thread.
  • #1
Physicist248
Are Boltzmann's statistics compatible with a deterministic universe? Suppose that the gas molecules in a given container are perfectly elastic objects obeying Newton's laws. Suppose further that we select the initial conditions (momentum and position of each molecule) at random. Is it true that, if the initial conditions are randomly selected, then in most cases the entropy will be increasing (but in some cases it will not)? Or do Boltzmann's statistics imply inherent indeterminism/uncertainty at some level, i.e. the quantum level?
 
  • Like
Likes Bill Dreiss
  • #2
Physicist248 said:
Is it true that, if the initial conditions are randomly selected, then in most cases the entropy will be increasing (but in some cases it will not)?
Yes. In fact, the process is time symmetric, which means that in principle you could reverse any change. But, the probabilities in favour of increasing entropy are overwhelming.
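
As a rough illustration of just how overwhelming (a back-of-the-envelope estimate, assuming ##N## non-interacting molecules, each equally likely to be found in either half of the box): the probability of finding all of them in the left half is
$$P = \left(\tfrac{1}{2}\right)^N,$$
which is already about ##10^{-30}## for a mere ##N = 100## molecules, and unimaginably smaller for a macroscopic ##N \sim 10^{23}##.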
 
  • #3
Boltzmann statistics is a model of a model, the underlying model being kinetic theory, which is a deterministic model based on Newton’s laws of motion. In Boltzmann’s day, solving the equations of motion for even a few molecules was intractable, so the statistical model was devised as a substitute. However, since the advent of computers, kinetic theory has been instantiated in ideal gas simulations, a number of which can be found online. These simulations render Boltzmann’s model obsolete, since they allow us to experimentally explore the properties of an ideal gas directly and compile statistics instead of relying on abstract probabilities. They also mimic the second law behavior that we experience every day, demonstrating that the laws of motion are irreversible, contrary to common belief.

Boltzmann believed that “In nature, the tendency of transformations is always to go from less probable to more probable states”. However, according to his model this is only true for macrostates that are relatively far from the most probable macrostate. For macrostates that are nearer the most probable macrostate, it is more probable that the next macrostate will be less probable than the current one. For instance, if the system is in the most probable macrostate, the probability that the next macrostate will be less probable approaches one for large systems. This is because the probability of being in the most probable macrostate simultaneously approaches zero.
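
As a toy illustration of this (my own back-of-the-envelope version, taking the macrostate to be the number ##k## of molecules in the left half of the box, with each molecule equally likely to be on either side): the most probable macrostate ##k = N/2## itself occurs with probability
$$P\!\left(k = \tfrac{N}{2}\right) = \binom{N}{N/2}\,2^{-N} \approx \sqrt{\frac{2}{\pi N}} \;\to\; 0 \quad (N \to \infty),$$
so a large system is almost never exactly at the peak, and when it is, any change of macrostate is by definition a move to a less probable one.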

Boltzmann statistics is based on probability theory, which quantifies our subjective degree of belief or ignorance. From this perspective, probability is not a property of nature, as Boltzmann claimed, and probability theory has no bearing on whether nature is deterministic or not. On the question of whether probability is intrinsic to quantum mechanics, the jury is still out.

For details see [Link to blog with Personal Speculation redacted by the Moderators]
 
Last edited by a moderator:
  • Like
Likes dextercioby
  • #4
PeroK said:
Yes. In fact, the process is time symmetric, which means that in principle you could reverse any change. But, the probabilities in favour of increasing entropy are overwhelming.
OK, has somebody actually proved it? Is this a theorem of physics?
 
  • Like
Likes jbergman
  • #5
Bill Dreiss said:
Boltzmann statistics is a model of a model, the underlying model being kinetic theory, which is a deterministic model based on Newton’s laws of motion. In Boltzmann’s day, solving the equations of motion for even a few molecules was intractable, so the statistical model was devised as a substitute. However, since the advent of computers, kinetic theory has been instantiated in ideal gas simulations, a number of which can be found online. These simulations render Boltzmann’s model obsolete, since they allow us to experimentally explore the properties of an ideal gas directly and compile statistics instead of relying on abstract probabilities. They also mimic the second law behavior that we experience every day, demonstrating that the laws of motion are irreversible, contrary to common belief.

Boltzmann believed that “In nature, the tendency of transformations is always to go from less probable to more probable states”. However, according to his model this is only true for macrostates that are relatively far from the most probable macrostate. For macrostates that are nearer the most probable macrostate, it is more probable that the next macrostate will be less probable than the current one. For instance, if the system is in the most probable macrostate, the probability that the next macrostate will be less probable approaches one for large systems. This is because the probability of being in the most probable macrostate simultaneously approaches zero.

Boltzmann statistics is based on probability theory, which quantifies our subjective degree of belief or ignorance. From this perspective, probability is not a property of nature, as Boltzmann claimed, and probability theory has no bearing on whether nature is deterministic or not. On the question of whether probability is intrinsic to quantum mechanics, the jury is still out.

For details see [Link to blog with Personal Speculation redacted by the Moderators]
"They also mimic the second law behavior that we experience every day, demonstrating that the laws of motion are irreversible"
Why would they be irreversible if the system is deterministic?
 
Last edited by a moderator:
  • Like
Likes jbergman
  • #6
Physicist248 said:
OK, has somebody actually proved it? Is this a theorem of physics?
It's inherent in classical particle collisions and scattering. Every collision is equally valid in reverse. There is simply nothing inherent in the classical laws of motion that indicates time irreversibility.

It's only when you consider the probabilistic aspect that you understand why processes are statistically irreversible (not absolutely irreversible).
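
To put that slightly more formally (a standard textbook fact, not specific to gases): if ##x(t)## solves Newton's second law with a velocity-independent force, then so does the time-reversed motion ##x(-t)##,
$$m\,\ddot{x}(t) = F\big(x(t)\big) \;\;\Longrightarrow\;\; m\,\frac{d^2}{dt^2}\,x(-t) = F\big(x(-t)\big),$$
because the two sign changes from ##t \to -t## cancel in the second derivative. Reversing every velocity at any instant therefore produces another perfectly valid motion.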
 
  • Like
Likes Lord Jestocost and vanhees71
  • #7
Bill Dreiss said:
On the question of whether probability is intrinsic to quantum mechanics, the jury is still out.
It's true that there are people who will never accept QM. But, mainstream QM is inherently probabilistic. You may as well say the jury is still out on whether the Earth is flat!
 
  • Like
Likes Lord Jestocost and vanhees71
  • #8
Physicist248 said:
"They also mimic the second law behavior that we experience every day, demonstrating that the laws of motion are irreversible"
Why would they be irreversible if the system is deterministic?
In section 52-2 of The Feynman Lectures on Physics, Vol. 1, Feynman says, “Next we mention a very interesting symmetry which is obviously false, i.e., reversibility in time. The physical laws apparently cannot be reversible in time, because, as we know, all obvious phenomena are irreversible on a large scale…” [my italics]. He continues with “So far as we can tell, this irreversibility is due to the very large number of particles involved, and if we could see the individual molecules, we would not be able to discern whether the machinery was working forward or backwards.” However, by attributing irreversibility to “the very large number of particles involved”, he raises the question: At what number of particles does a system switch from reversible to irreversible? But close observation of deterministic ideal gas simulations reveals that the second-law behavior holds even for a handful of molecules, in fact, for any number of molecules ≥ 2.

In section 52-4, he points out the mirror symmetry of the linear momenta in the direction of the three Cartesian coordinates and the angular momentum in two of the directions of spherical coordinates. He calls the third component of spherical motion the polar vector and illustrates mirror symmetry for this vector in Fig. 52-2, showing that a vector pointed in the northeast direction converts to a vector pointed in the northwest direction when rotated 180 degrees around the y axis. However, this is a clear misapplication of mirror symmetry, since the other mirror reversals are rotated around an axis perpendicular to the vector of motion. Therefore, mirror symmetry applied to the polar vector should rotate around an axis perpendicular to the direction of the polar vector, resulting in a vector pointed toward the southwest, not northwest.

If the direction of motion is reversed from the positive direction (away from the origin) the direction will be negative (toward the origin) only until the particle reaches the origin, at which time it will become positive again. Since the final direction of motion is the same as the initial direction of motion, the process is irreversible. This is the source of the asymmetry in the laws of motion and the physical basis of the second law. Furthermore, it is apparent that the second law is not fundamental in and of itself, but an epiphenomenon of the more basic law of inertia.

For a more detailed discussion, see [Link to blog with Personal Speculation redacted by the Moderators]
 
Last edited by a moderator:
  • #9
Physicist248 said:
OK, has somebody actually proved it? Is this a theorem of physics?
No, it's a theorem of mathematics, an analogy with no causal connection to physics.
 
  • #10
PeroK said:
It's true that there are people who will never accept QM. But, mainstream QM is inherently probabilistic. You may as well say the jury is still out on whether the Earth is flat!
For evidence that this is a live issue, see Whitaker, Andrew. Einstein, Bohr and the Quantum Dilemma, Cambridge University Press.
 
  • #11
Bill Dreiss said:
For evidence that this is a live issue, see Whitaker, Andrew. Einstein, Bohr and the Quantum Dilemma, Cambridge University Press.
Well, Andrew Whitaker may be alive, but Einstein and Bohr are definitely long dead!
 
  • Haha
Likes berkeman
  • #12
Thread closed temporarily for Moderation...
 
  • #13
After some cleanup, thread is reopened.
 
  • #14
PeroK said:
Well, Andrew Whitaker may be alive, but Einstein and Bohr are definitely long dead!
I'm not sure how this is relevant, since their ideas are still discussed by modern physicists. While the general belief currently seems to be that Bohr was arguing in favor of an ontic interpretation of quantum probability and Einstein was arguing in favor of an epistemic interpretation, this is not true. Both recognized the epistemic nature of quantum probability. It's just that Bohr thought that the uncertainty principle rendered attempts to look behind the curtain futile, while Einstein felt that if there was something there, aka hidden variables, you wouldn't find it if you didn't look.

This debate required both men to distinguish between the ontic and epistemic, a distinction that is generally lost on modern physicists. For examples of the modern conflation of the two concepts, see Lost in Math and the recent Existential Physics, both by Sabine Hossenfelder. For an article that gets to the heart of the issue of quantum probability, see page 7 of the attached file.
 

Attachments

  • Jaynes.cmystery.pdf
    266.7 KB · Views: 91
  • #15
Bill Dreiss said:
This debate required both men to distinguish between the ontic and epistemic, a distinction that is generally lost on modern physicists.
Well, the alternative view is that the modern physicist has 60-70 years of experimental evidence and theoretical development over Einstein and Bohr.

I don't buy into the concept that the understanding of QM has been lost in recent decades. Quite the reverse, in fact.
 
  • Like
Likes DrClaude and vanhees71
  • #16
PeroK said:
It's inherent in classical particle collisions and scattering. Every collision is equally valid in reverse. There is simply nothing inherent in the classical laws of motion that indicates time irreversibility.

It's only when you consider the probabilistic aspect that you understand why processes are statistically irreversible (not absolutely irreversible).
I have realized that if the initial conditions were chosen at random we would be in a state of maximum entropy to begin with. I suppose that it is still possible that the universe was created with an uneven distribution of thermal energy, and within this framework the microstates were chosen randomly. It seems far-fetched. Maybe cosmology explains it somehow.

To me this argument that the processes are reversible in principle but not statistically does not seem convincing. God could have just as easily put everything in reverse motion. Then Boltzmann's statistics and gas simulations would indicate increasing entropy while in the real world we would observe decreasing entropy.

By this thought experiment I am trying to say that it is unlikely that the processes are reversible in principle. I don't know whether inherent randomness applies to gas molecules, but the uncertainty principle does.
 
  • Skeptical
Likes PeroK
  • #17
PeroK said:
Well, the alternative view is that the modern physicist has 60-70 years of experimental evidence and theoretical development over Einstein and Bohr.

I don't buy into the concept that the understanding of QM has been lost in recent decades. Quite the reverse, in fact.
I didn't say that "the understanding of QM has been lost", but rather that the general ability of physicists to "distinguish between the ontic and epistemic" has. However, while the experimental evidence of QM continues to accumulate, it doesn't appear to me that theory has come that far. If you're aware of a source that goes significantly beyond Bohm, de Broglie, Feynman and Bell, please advise me. Wikipedia doesn't seem to know.
 
  • #18
Physicist248 said:
I have realized that if the initial conditions were chosen at random we would be in a state of maximum entropy to begin with. I suppose that it is still possible that the universe was created with an uneven distribution of thermal energy, and within this framework the microstates were chosen randomly. It seems far-fetched. Maybe cosmology explains it somehow.

To me this argument that the processes are reversible in principle but not statistically does not seem convincing. God could have just as easily put everything in reverse motion. Then Boltzmann's statistics and gas simulations would indicate increasing entropy while in the real world we would observe decreasing entropy.

By this thought experiment I am trying to say that it is unlikely that the processes are reversible in principle. I don't know whether inherent randomness applies to gas molecules, but the uncertainty principle does.
This is a science forum, so you are expected to try to understand the science and mathematics, rather than dismiss them in favour of personal preference or religious arguments.

The principle of time reversibility of the basic laws, combined with statistical irreversibility, is simple to simulate on a computer. Your scepticism, therefore, is unfounded.
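
For example, here is a minimal sketch (my own toy example in Python, not anyone's published code) of the Ehrenfest urn model: every elementary move is its own reverse, yet the occupation numbers relax toward a 50/50 split and stay there.

Code:
import random

N = 1000        # total number of "molecules"
left = N        # start with all of them in the left urn (a very improbable macrostate)
history = []

for _ in range(20000):
    # pick one molecule uniformly at random and move it to the other urn;
    # this elementary move is its own inverse, so every step is reversible
    if random.randrange(N) < left:
        left -= 1
    else:
        left += 1
    history.append(left)

# the count drifts to about N/2 and then just fluctuates around it;
# it essentially never returns to N
print(history[0], history[10000], history[-1])

Nothing in the update rule prefers one direction; the drift toward 500 is purely a matter of counting.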
 
  • #19
PeroK said:
This is a science forum, so you are expected to try to understand the science and mathematics, rather than dismiss them in favour of personal preference or religious arguments.

The principle of time reversibility of the basic laws, combined with statistical irreversibility, is simple to simulate on a computer. Your scepticism, therefore, is unfounded.
Just play your simulation backwards.
 
  • #20
Physicist248 said:
Just play your simulation backwards.
That's the point. Each step is reversible, but the process overall shows irreversible behaviour.

The essence of science when you encounter something hard to understand is to think. Not to stop thinking.

Anyone can stop thinking. There's nothing clever in that.
 
  • Like
Likes berkeman
  • #21
PeroK said:
That's the point. Each step is reversible, but the process overall shows irreversible behaviour.

The essence of science when you encounter something hard to understand is to think. Not to stop thinking.

Anyone can stop thinking. There's nothing clever in that.
Just flip the sign of all the velocities. Why are "backward" velocities less probable than "forward" velocities?
 
  • #22
Bill Dreiss said:
Boltzmann statistics is a model of a model, the underlying model being kinetic theory, which is a deterministic model based on Newton’s laws of motion. In Boltzmann’s day, solving the equations of motion for even a few molecules was intractable, so the statistical model was devised as a substitute. However, since the advent of computers, kinetic theory has been instantiated in ideal gas simulations, a number of which can be found online. These simulations render Boltzmann’s model obsolete, since they allow us to experimentally explore the properties of an ideal gas directly and compile statistics instead of relying on abstract probabilities. They also mimic the second law behavior that we experience every day, demonstrating that the laws of motion are irreversible, contrary to common belief.

Boltzmann believed that “In nature, the tendency of transformations is always to go from less probable to more probable states”. However, according to his model this is only true for macrostates that are relatively far from the most probable macrostate. For macrostates that are nearer the most probable macrostate, it is more probable that the next macrostate will be less probable than the current one. For instance, if the system is in the most probable macrostate, the probability that the next macrostate will be less probable approaches one for large systems. This is because the probability of being in the most probable macrostate simultaneously approaches zero.

Boltzmann statistics is based on probability theory, which quantifies our subjective degree of belief or ignorance. From this perspective, probability is not a property of nature, as Boltzmann claimed, and probability theory has no bearing on whether nature is deterministic or not. On the question of whether probability is intrinsic to quantum mechanics, the jury is still out.

For details see [Link to blog with Personal Speculation redacted by the Moderators]
How do these simulations work? The molecules are points or spheres? They all have the same mass, and you randomly select position and velocity for each? And they will converge to higher entropy? Just flip the signs of the velocities, and entropy will decrease. "Backward" velocities are just as probable as "forward" velocities. Something does not add up. Or am I missing something?
 
  • #23
Physicist248 said:
Or am I missing something?
A classic example is mixing a pack of cards. If you start with an unmixed deck, then almost any sequence of mixing will jumble the cards. It is possible, of course, to mix cards and end up back where you started. But, this is statistically extremely unlikely - unless you are deliberately choosing your movements to this end.

Also, once you have a mixed deck and keep mixing, the deck stays mixed. It doesn't return to an unmixed deck. This, however, is purely statistical. With a small number of cards, you would get back to an unmixed deck with some significant probability.

Let's reduce mixing to a sequence of swapping any two cards. If you show a video of someone mixing a deck, you cannot tell by any particular swap whether the process is running forwards or backwards. Every swap is reversible. And, if you have a mixed deck (equilibrium), then again you cannot tell whether the video showing the sequence of swaps is running forwards or backwards.

But, if you have a video of a mixed deck being shuffled and returning to an unmixed deck, then you know (statistically) that the film is running backwards.

It's even more apparent if, instead of 52 cards, you have ##52 \times 10^{23}## cards. For example, warm water spontaneously separating into hot and cold water is so statistically unlikely as to be effectively impossible. Even though every individual exchange of kinetic energy between molecules is reversible.

And, of course, when you mix hot and cold water it's statistically inevitable that you end up with warm water.
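
A minimal sketch of the same point in code (my own illustration, reducing "mixing" to the random swaps described above): count how often the deck finds its way back to the sorted order.

Code:
import random

def returns_to_sorted(n_cards, n_swaps):
    """Apply random transpositions; count how often the deck is back in sorted order."""
    deck = list(range(n_cards))
    sorted_deck = list(range(n_cards))
    returns = 0
    for _ in range(n_swaps):
        i, j = random.randrange(n_cards), random.randrange(n_cards)
        deck[i], deck[j] = deck[j], deck[i]   # each swap is its own reverse
        if deck == sorted_deck:
            returns += 1
    return returns

print(returns_to_sorted(3, 100000))    # a 3-card deck returns thousands of times
print(returns_to_sorted(52, 100000))   # 52 cards: essentially never

Every swap is just as legal backwards as forwards; only the counting of arrangements makes the unmixed state special.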
 
Last edited:
  • #24
Bill Dreiss said:
I didn't say that "the understanding of QM has been lost", but rather that the general ability of physicists to "distinguish between the ontic and epistemic" has. However, while the experimental evidence of QM continues to accumulate, it doesn't appear to me that theory has come that far. If you're aware of a source that goes significantly beyond Bohm, de Broglie, Feynman and Bell, please advise me. Wikipedia doesn't seem to know.
It's irrelevant for physics to distinguish between some subtle philosophical notions.

All that counts is the ability to apply a mathematical theory or model to quantitative observations of nature, usually made in experiments where one investigates some aspect of nature in a controlled environment so that the theory or model can be tested empirically. I'd say these "issues" are by now simply resolved, given the many investigations of this kind with various systems (mostly photons, but also neutrons, atoms, molecules, condensed-matter systems such as quantum dots, and so on).
 
  • Like
Likes weirdoguy
  • #25
Physicist248 said:
How do these simulations work? The molecules are points or spheres? They all have the same mass, and you randomly select position and velocity for each? And they will converge to higher entropy? Just flip the signs of the velocities, and entropy will decrease. "Backward" velocities are just as probable as "forward" velocities. Something does not add up. Or am I missing something?
For an example, see https://phet.colorado.edu/sims/html/gas-properties/latest/gas-properties_en.html. The molecules are identical circles in 2-dimensional space. The initial conditions usually have the molecules bunched in a small region of the box. They then spontaneously disperse, increasing entropy. The simulation is strictly deterministic and probability is not involved. Since the algorithm instantiates only Newton's three deterministic laws of motion, the second-law dispersion can only be a consequence of these laws.

Which particular law is behind the dispersion? The third law, which relates to collisions, is clearly reversible. The second law, which relates to force, is not relevant since the ideal gas model assumes that no external forces or forces between the molecules exist. This leaves the first law, the law of inertia. The action of inertia is clearly visible as the source of the dispersion behind the second law.

If the simulation is reversed, the molecules will retrace their paths back to the initial conditions, temporarily decreasing entropy. However, if the simulation continues to run, the molecules will once again disperse, increasing entropy. The direction of increasing spontaneous dispersion is the direction of time.
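
Here is a minimal sketch of that kind of experiment (my own toy version in Python, not the PhET code: non-interacting point molecules in a 2-D box with reflecting walls, intermolecular collisions left out for brevity, and a coarse-grained entropy computed from a grid of cells). It shows the dispersal, the near-perfect retracing when every velocity is flipped, and the renewed dispersal afterwards.

Code:
import numpy as np

rng = np.random.default_rng(0)
N, L, dt, grid = 200, 1.0, 0.002, 4

# start bunched in one corner of the box, with random velocities
pos = rng.uniform(0.0, 0.1, size=(N, 2))
vel = rng.normal(0.0, 1.0, size=(N, 2))

def coarse_entropy(pos):
    """Shannon entropy of the occupation numbers on a grid x grid partition of the box."""
    cells = np.clip((pos / L * grid).astype(int), 0, grid - 1)
    counts = np.bincount(cells[:, 0] * grid + cells[:, 1], minlength=grid * grid)
    p = counts[counts > 0] / N
    return -np.sum(p * np.log(p))

def step(pos, vel):
    pos = pos + vel * dt
    for d in range(2):                                 # specular reflection off the walls
        low, high = pos[:, d] < 0, pos[:, d] > L
        pos[low, d] *= -1;                   vel[low, d] *= -1
        pos[high, d] = 2 * L - pos[high, d]; vel[high, d] *= -1
    return pos, vel

for _ in range(500):
    pos, vel = step(pos, vel)
print("after dispersal:", coarse_entropy(pos))

vel = -vel                                             # flip every velocity
for _ in range(500):
    pos, vel = step(pos, vel)
print("after retracing:", coarse_entropy(pos))         # back near the low-entropy start

for _ in range(500):
    pos, vel = step(pos, vel)
print("dispersing again:", coarse_entropy(pos))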
 
  • Skeptical
Likes weirdoguy and PeroK
  • #26
Bill Dreiss said:
The action of inertia is clearly visible as the source of the dispersion behind the second law.
Which is clearly nonsense. It's the initial conditions and the laws of statistics that produce the dispersion.
 
  • Like
Likes PeterDonis and vanhees71
  • #27
Behind the second law is detailed balance or, on a fundamental level, the unitarity of the S-matrix together with some "coarse graining procedure". In the case of the standard derivation of the Boltzmann equation the latter comes in via the molecular-chaos assumption ("Stoßzahlansatz").
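
For reference, the standard schematic form (a textbook statement, with primes denoting the distribution evaluated at the post-collision velocities): under the molecular-chaos assumption the one-particle distribution ##f(\mathbf{x},\mathbf{v},t)## obeys the Boltzmann equation
$$\frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f + \frac{\mathbf{F}}{m}\cdot\nabla_{\mathbf{v}} f = \int d^3v_1\, d\Omega\; \sigma(\Omega)\,\lvert \mathbf{v}-\mathbf{v}_1 \rvert \left(f'\,f_1' - f\,f_1\right),$$
and the H-theorem then states
$$H(t) = \int f \ln f \; d^3x\, d^3v, \qquad \frac{dH}{dt} \le 0,$$
with equality only for the equilibrium (Maxwell–Boltzmann) distribution. The time asymmetry enters through the Stoßzahlansatz, not through the reversible microscopic dynamics.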
 
  • #28
PeroK said:
Which is clearly nonsense. It's the initial conditions and the laws of statistics that produce the dispersion.
From Boltzmann, Ludwig. Lectures on Gas Theory (Dover Books on Physics) (p. 59):

“It is only when one reverses the velocities at time t1 that he obtains a motion for which H must increase during the time interval t1 – t0, and even then H would probably decrease again after that…” [my italics]

I’m using the same logic as Boltzmann. Do you see a flaw in our reasoning?
 
  • #29
Bill Dreiss said:
I’m using the same logic as Boltzmann. Do you see a flaw in our reasoning?
Boltzmann never concluded that Newton's first law (or the law of inertia) was the cause of entropy. Your reasoning is your own.
 
  • Like
Likes vanhees71
  • #30
PeroK said:
Boltzmann never concluded that Newton's first law (or the law of inertia) was the cause of entropy. Your reasoning is your own.
I've merely followed through with Boltzmann's and Feynman's (Post #8) observations. What is the flaw in our line of reasoning?
 
  • #31
Bill Dreiss said:
I've merely followed through with Boltzmann's and Feynman's (Post #8) observations. What is the flaw in our line of reasoning?
There is no logic whatsoever in what you say. The first law is just as time-reversible as the other two. It's called statistical mechanics, not inertial mechanics.
 
  • Like
Likes PeterDonis
  • #32
PeroK said:
There is no logic whatsoever in what you say. The first law is just as time-reversible as the other two. It's called statistical mechanics, not inertial mechanics.
My observations involve kinetic theory, which preceded statistical mechanics. Feynman's line of reasoning was identical to mine and he would have reached the same conclusion as I did if he had not made a careless mistake. The crucial observation which leads to this conclusion is that if the velocities are reversed, the dispersion will resume after the initial positions are recovered. If you disagree, please explain why. Just telling me that I'm wrong gets us nowhere.
 
  • #33
Bill Dreiss said:
My observations involve kinetic theory, which preceded statistical mechanics. Feynman's line of reasoning was identical to mine and he would have reached the same conclusion as I did if he had not made a careless mistake.
It takes a special type of mind, I guess, to believe something like that.
 
  • #34
Bill Dreiss said:
If you disagree, please explain why. Just telling me that I'm wrong gets us nowhere.
There is nowhere to go with personal theories on PF. I've already indulged you more than I should.
 
  • Like
Likes vanhees71
  • #35
Bill Dreiss said:
For an example, see https://phet.colorado.edu/sims/html/gas-properties/latest/gas-properties_en.html. The molecules are identical circles in 2-dimensional space. The initial conditions usually have the molecules bunched in a small region of the box. They then spontaneously disperse, increasing entropy. The simulation is strictly deterministic and probability is not involved. Since the algorithm instantiates only Newton's three deterministic laws of motion, the second-law dispersion can only be a consequence of these laws.

Which particular law is behind the dispersion? The third law, which relates to collisions, is clearly reversible. The second law, which relates to force, is not relevant since the ideal gas model assumes that no external forces or forces between the molecules exist. This leaves the first law, the law of inertia. The action of inertia is clearly visible as the source of the dispersion behind the second law.

If the simulation is reversed, the molecules will retrace their paths back to the initial conditions, temporarily decreasing entropy. However, if the simulation continues to run, the molecules will once again disperse, increasing entropy. The direction of increasing spontaneous dispersion is the direction of time.
"If the simulation is reversed, the molecules will retrace their paths back to the initial conditions, temporarily decreasing entropy. However, if the simulation continues to run, the molecules will once again disperse, increasing entropy." Hmm ...
 
