What does equiprobable mean in the context of thermal motion?

  • Thread starter: Mike_bb

Summary:
The discussion centers on the isotropy of molecular motion in gases, asserting that the probability of a molecule moving in any direction (x, y, or z) is equal. This implies that the velocity components along each axis average to zero, but individual velocities do not necessarily cancel each other out pairwise. The concept of equiprobability is highlighted: while the average may be zero, not every velocity has a corresponding opposite. Misinterpretations of diagrams depicting molecular motion are also addressed, clarifying that they should not be read as implying equal and opposite velocities for every molecule. Overall, the conversation emphasizes the statistical nature of molecular velocity distributions in thermodynamics.
  • #91
Mike_bb said:
... As mentioned above: for example, on the X-axis we can have -6 in the negative direction and 1, 2, 3 in the positive direction (the average is 0). But if we use an integral from ##-\infty## to ##+\infty##, then we obtain that every positive component on the X-axis has an opposite (negative) component, e.g. (1, 2, 3) and (-1, -2, -3). How is this possible? Thanks.

@Mike_bb, can I throw this in...

Suppose you have ten 6-sided dice, each marked with the values:
-3, -2, -1, +1, +2, +3.

You throw them.

A = the number of -3s
B = the number of +3s.

You will probably find ##\frac AB## is not 1.

But if you have ##10^{23}## dice you will find that ##\frac AB## is almost exactly equal to 1. It’s a result of the statistical behaviour when the number of dice is very large.

The same idea applies to x-components of velocity for particles in a gas.
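
If you want to see this numerically, here is a minimal sketch in Python (using ##10^7## dice as a stand-in for ##10^{23}##, which is far too many to simulate directly):

```python
import random

FACES = [-3, -2, -1, 1, 2, 3]

def ratio_of_extremes(n_dice):
    """Throw n_dice dice and return A/B = (count of -3s) / (count of +3s)."""
    rolls = [random.choice(FACES) for _ in range(n_dice)]
    a = rolls.count(-3)           # A = number of -3s
    b = rolls.count(3)            # B = number of +3s
    return a / b if b else float('inf')

print(ratio_of_extremes(10))          # typically far from 1 (e.g. 0.5, 2.0, ...)
print(ratio_of_extremes(10_000_000))  # very close to 1
```

With 10 dice the ratio jumps around from throw to throw; with ten million it is already within a fraction of a percent of 1, and with ##10^{23}## it would be indistinguishable from 1 in practice.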

If you haven’t yet learned about and understood continuous probability distributions, you should probably (IMO) avoid the calculus-based analysis.

 
  • #92
jbriggs444 said:
But what are you integrating over?
I'm integrating over a continuous distribution (the components of velocities on the X-axis).
The result is ##\frac{kT}{m}##. This result coincides with ##<V_x^2>## for a large finite sample. How is this possible?

jbriggs444,

Am I right that for every particle with a positive component on the X-axis we can find a particle with the opposite (negative) component? If so, then I really understand how it works.

Google AI:

[Attachment: screenshot of a Google AI answer]
 
  • #93
Mike_bb said:
I'm integrating over a continuous distribution (the components of velocities on the X-axis).
The result is ##\frac{kT}{m}##. This result coincides with ##<V_x^2>## for a large finite sample. How is this possible?
How is this possible? Why should it not be possible? Why the surprise?

It is very difficult to respond to an open ended question like this. You need to be specific about your concerns. Or, as has been aptly suggested, use a textbook and take a course.
Mike_bb said:
jbriggs444,

Am I right that for every particle with a positive component on the X-axis we can find a particle with the opposite (negative) component? If so, then I really understand how it works.
No.

If we choose a frame of reference where the total momentum is zero (i.e. the wind is not blowing) and assume a gas where all the particles are identical (so that momentum matches velocity), all we get is that the sum of the velocities in any direction is zero. Not that the velocities are paired up evenly.

You've given an example yourself: { -6, 1, 2, 3 }.
Mike_bb said:
Google AI:
Is not a valid reference here.

For a very large sample from a normal distribution with mean zero, we do get that the number of particles in a small velocity range: ##(-v \pm \Delta v)## is approximately equal to the number of particles in the symmetric velocity range: ##(+v \pm \Delta v)##.

We do not get a guarantee of exact equality. Nor do we need one for practical purposes.

Of course, the normal distribution is not special in this respect. The same property will hold for any symmetric distribution.
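
Here is a minimal numerical sketch of this, assuming the x-components are drawn from a zero-mean normal distribution (the choice ##kT/m = 1##, the sample size and the bin edges are arbitrary, with ##10^7## samples standing in for ##10^{23}##):

```python
import numpy as np

rng = np.random.default_rng(0)
kT_over_m = 1.0                    # arbitrary units for this sketch
n = 10_000_000                     # large, but far smaller than 1e23

vx = rng.normal(0.0, np.sqrt(kT_over_m), size=n)   # x-components of velocity

# The sample mean of Vx^2 is close to, but not exactly, the distribution value kT/m.
print(np.mean(vx**2))              # e.g. 0.9997...

# Counts in symmetric ranges (-v +/- dv) and (+v +/- dv) are nearly, but not exactly, equal.
v, dv = 1.0, 0.05
n_plus  = np.count_nonzero((vx >  v - dv) & (vx <  v + dv))
n_minus = np.count_nonzero((vx > -v - dv) & (vx < -v + dv))
print(n_plus, n_minus, n_plus / n_minus)   # ratio close to 1, not exactly 1
```

The larger the sample, the closer both numbers get to their distribution values, but exact equality is never guaranteed.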

I would suggest a course in statistics. You need help nailing down concepts like "distribution", "sample", "mean" and "variance".
 
  • #94
I'd like to add this to what @jbriggs444 has said.

Mike_bb said:
Am I right that for every particle with a positive component on the X-axis we can find a particle with the opposite (negative) component?
No. You are wrong! Try working through the following carefully.

Step-1

Question: what is the probability of a gas particle having a speed of exactly 50.1 m/s?

You might say I haven’t given you enough information – but I have!

The answer is zero!

That’s because ‘exactly 50.1 m/s’ means
50.100000000000000000000000000… (to infinitely many decimal places)

The probability of a speed being exactly equal to this value (to the infinite-th decimal place) is zero.

That’s maths, not physics. It’s an issue for any continuous variable, not just speed.
______________________
Step-2

How do we deal with this? Answer: we break down our continuous variable into convenient intervals. We can then ask: what is the probability of a gas particle having a speed in some interval, e.g. ##v## to ##v+\delta v##?

E.g. using m/s for speed, our intervals could be:
...
##49 \le v \lt 50##
##50 \le v \lt 51##
##51 \le v \lt 52##
...
Here ##\delta v = 1##.

We can now ask: what is the probability of a particle's speed being in the range 50 to 51 m/s? The answer will be some sensible value.
_________________

Step-3 (which may answer your question)

Say:
P gas particles have x-components of velocity in the range ##50 \le v_x \lt 51##
Q gas particles have x-components of velocity in the range ##-51 \le v_x \lt -50##

Does P = Q? In fact it’s mathematically better to ask: does ##\frac PQ = 1##?

For a small number of particles, it is unlikely that ##\frac PQ = 1##.

But for a large number of particles (e.g. ##10^{23}##) ##\frac PQ## is almost exactly equal to 1.

This is the result of the statistical behaviour when the number of particles is very large.

This is a bit of an oversimplification but it illustrates the principle.
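
Here is a minimal sketch of Step-3 in Python (the spread of 300 m/s and the sample sizes are arbitrary choices, and ##10^7## particles stand in for ##10^{23}##):

```python
import numpy as np

def p_over_q(n_particles, seed=0):
    """Count particles with 50 <= vx < 51 (P) and -51 <= vx < -50 (Q)
    for x-components drawn from a zero-mean normal distribution."""
    rng = np.random.default_rng(seed)
    vx = rng.normal(0.0, 300.0, size=n_particles)   # spread of 300 m/s (arbitrary)
    p = np.count_nonzero((vx >= 50) & (vx < 51))
    q = np.count_nonzero((vx >= -51) & (vx < -50))
    return p, q, (p / q if q else float('inf'))

print(p_over_q(10_000))        # P and Q are small; P/Q fluctuates noticeably
print(p_over_q(10_000_000))    # P/Q is very close to 1
```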
 
  • #95
jbriggs444,

Ok. Is it right that if we have a particle with X-component 4 then we can find a particle with Y-component 4? (Google AI says it's true)

You wrote:
"If we choose a frame of reference where the total momentum is zero (i.e. the wind is not blowing) and assume a gas where all the particles are identical (so that momentum matches velocity), all we get is that the sum of the velocities in any direction is zero. Not that the velocities are paired up evenly."

I didn't say that the sum of the velocities in any direction is zero. For example, take two vectors: ##V_1(1,5)## and ##V_2(-1,7)##. Why not?
 
  • #96
Mike_bb said:
jbriggs444,

Ok. Is it right that if we have a particle with X-component 4 then we can find a particle with Y-component 4? (Google AI says it's true)
1. Google AI is not an acceptable reference. [See the rules here. Search for "ChatGPT and AI-generated text"]
2. If Google AI says this then Google AI is wrong. AI is frequently wrong but plausible. Hence the rule above.
3. I do not believe that Google AI says this. However since Google AI is not a valid reference here, that matter is not available for discussion.
4. [For the pedants out there] An implication that starts with a false premise will always be logically true. If false then anything. As has been pointed out, the likelihood of a particle with an X component of 4 being selected from a normal distribution is zero. We can find a mathematical formula for this:$$\int_4^4 \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}} dx = 0$$
If you can find a valid reference that agrees with what you are saying, please link to it.

Mike_bb said:
You wrote:
"If we choose a frame of reference where the total momentum is zero (i.e. the wind is not blowing) and assume a gas where all the particles are identical (so that momentum matches velocity), all we get is that the sum of the velocities in any direction is zero. Not that the velocities are paired up evenly."

I didn't say that the sum of the velocities in any direction is zero. For example, take two vectors: ##V_1(1,5)## and ##V_2(-1,7)##. Why not?
I am not sure what point you are trying to make here.

Edit. Maybe I get it. You are suggesting that the velocity in the x direction can be independent of a wind blowing in the y direction?

1. Two [two dimensional] velocities that you have made up in your head are not the same thing as four random samples from a normal distribution.
2. For any actual random sample, the mean for the sample will usually be different from the mean of the distribution.
3. If we are playing fair, we are working with a gas at rest rather than a gas in a wind tunnel. So we are arranging [within experimental limits] for the distribution to have a mean of zero in all directions. Or, equivalently, we are arranging for the total momentum to be zero.
4. [For the pedants out there] If we imagine an ideal gas of ##10^{23}## particles with a total momentum of zero and imagine that it is modelled with perfect mathematical accuracy as [per dimension] ##10^{23}## random samples from a normal distribution, we will be off by one. There are only ##10^{23}-1## degrees of freedom available. When we specified a total momentum of zero, that cost us one degree of freedom [per dimension]. We can have at most ##10^{23}-1## independent samples. Good luck measuring this discrepancy with a physical experiment.
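
A toy illustration of point 4 (a minimal sketch, with ##10^6## samples standing in for ##10^{23}## and mean subtraction standing in for "choosing the zero-momentum frame"):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000                      # stand-in for 1e23 particles

vx = rng.normal(0.0, 1.0, size=n)  # independent samples: sum is near zero, not exactly zero
print(vx.sum())

# Enforce total momentum exactly zero (equal-mass particles): subtract the sample mean.
vx_rest = vx - vx.mean()
print(vx_rest.sum())               # zero to floating-point precision

# The price: the adjusted values are no longer independent. Any one of them is
# determined by the other n - 1, so only n - 1 degrees of freedom remain.
```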
 
  • #97
jbriggs444,

I opened another Russian book and found this there (in translation):

"Strictly speaking, the average value of any velocity component is zero, since it can be positive or negative with equal probability."

It seems strange to me because in all the sources I found it is said that there is a particle with a positive component and a particle with a negative component.

Could anyone provide a source where the opposite is written?
 
  • #98
Mike_bb said:
jbriggs444,

I opened another Russian book and found this there (in translation):

"Strictly speaking, the average value of any velocity component is zero, since it can be positive or negative with equal probability."
I read this as "given any distribution which is symmetric about zero, the distribution mean will be zero".

That is a true statement about the distribution. Nonetheless, the following statement is false:

"Any finite sample drawn from a symmetric distribution will have a sample mean of zero"

An example is a fair coin with +1 stamped into one face and -1 stamped into the other. We can immediately see that the distribution mean is zero since the positive and negative results are equiprobable. However, if we flip this coin 100 times, the total over those throws is unlikely to be zero. [Google says roughly 8% probability of 50 heads and 50 tails]
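
That figure is easy to check directly (a minimal sketch, assuming a fair coin and exact binomial counting):

```python
from math import comb

# Probability of exactly 50 heads in 100 fair flips, i.e. the +1s and -1s cancel exactly.
p_50_50 = comb(100, 50) / 2**100
print(p_50_50)   # about 0.0796, roughly 8%
```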
Mike_bb said:
It seems strange to me because in all the sources I found it is said that there is a particle with a positive component and a particle with a negative component.
I see no strangeness. Yes, any large sample from any distribution which is symmetric about zero is overwhelmingly likely to have both positive and negative elements. The probability that all elements share the same sign is one in ##2^{N-1}##. [Pedant point - provided that p(0) = 0].

For a one element sample, it is 100% certain that the single element will share the same sign with every other element.

For a two element sample, it is 50% certain.

For a three element sample, it is 25% certain.

For a ##10^{23}## element sample, it is practically impossible.
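
The same numbers fall straight out of the formula (a minimal sketch, assuming p(0) = 0 as above):

```python
# All N samples share one sign in 2 of the 2**N equally likely sign patterns,
# i.e. with probability 1 / 2**(N - 1), provided p(0) = 0.
for n in (1, 2, 3, 10, 100):
    print(n, 1 / 2**(n - 1))
# For N = 10**23 the probability is far too small to represent as a float:
# "practically impossible".
```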

So what?
 
  • #99
jbriggs444,

I agree with this statement:

"Yes, any large sample from any distribution which is symmetric about zero is overwhelmingly likely to have both positive and negative elements.
 
  • #100
jbriggs444,

So what?

Your statement "Yes, any large sample from any distribution which is symmetric about zero is overwhelmingly likely to have both positive and negative elements." contradicts with your following statements:

"The probability that all elements share the same sign is one in ##2^{N−1}##"
 
  • #101
Mike_bb said:
(Google AI says it's true)
There is a reason why Google AI and other LLMs are not acceptable sources here. Stop quoting them, stop wasting your time trying to understand physics from what they say.
 
  • #102
Mike_bb said:
Your statement "Yes, any large sample from any distribution which is symmetric about zero is overwhelmingly likely to have both positive and negative elements." contradicts with your following statements:

"The probability that all elements share the same sign is one in ##2^{N−1}##"
I see no contradiction. I see consistency.

The probability that ##10^{23}## independent samples drawn from a normal distribution with mean zero will all share the same sign is ##\frac{1}{2^{10^{23}-1}}##. This is effectively impossible. And agrees with what I said.

The probability that ##10^{23}## independent samples drawn from a normal distribution with mean zero will not all share the same sign is ##1 - \frac{1}{2^{10^{23}-1}}##. This is virtually certain. And agrees with what I said.

It is possible that we have a language issue here. So I am giving you the benefit of the doubt in this case.
 
  • #103
jbriggs444 said:
It is possible that we have a language issue here. So I am giving you the benefit of the doubt in this case.
Yes. Along with a background that doesn't include completion of the university-level introductory physics course.

Over 100 posts in this thread attempting to explain the relationship between probability and sample size.

It's going in circles. The OP just keeps finding other sources to help him understand, but all they do is cause tangential confusions.
 
  • #104
Herman Trivilino said:
Yes. Along with a background that doesn't include completion of the university-level introductory physics course.

Over 100 posts in this thread attempting to explain the relationship between probability and sample size.

It's going in circles. The OP just keeps finding other sources to help him understand, but all they do is cause tangential confusions.
I have taken a university-level introductory physics course, but after a long break I don't remember some things.
 
  • #105
Mike_bb said:
I have taken a university-level introductory physics course, but after a long break I don't remember some things.
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so. “ – Mark Twain
 
  • #106
Mike_bb said:
I have taken a university-level introductory physics course, but after a long break I don't remember some things.

Well then, I don't know what you meant by what you wrote in Post #39.

Mike_bb said:
(In reality, I have an incomplete course of physics from university)

You either completed the course or you didn't.
 
  • #107
Herman Trivilino said:
Well then, I don't know what you meant by what you wrote in Post #39.



You either completed the course or you didn't.
Yes. In Russia, at most universities the incomplete course is the introductory physics course (the base course).
 
  • #108
Mike_bb said:
I have taken a university-level introductory physics course, but after a long break I don't remember some things.
That's why I keep telling you to start with the textbook for that course.
 
  • #109
Mike_bb said:
Yes. In Russia, at most universities the incomplete course is the introductory physics course (the base course).
No it's not. Nowhere is an introductory course an incomplete course. Either you complete the introductory course or you don't.
 
  • #110
Herman Trivilino said:
No it's not. Nowhere is an introductory course an incomplete course. Either you complete the introductory course or you don't.
Complete course = base course + advanced course

"Complete" is equivalent to "full course" in Russian.
 
  • #111
I didn't read every post, but Reif's book Fundamentals of Statistical and Thermal Physics would answer just about anything you would want to know about how the Maxwell-Boltzmann distribution works and how the states are counted.
 
  • #112
Mike_bb said:
Complete course = base course + advanced course

"Complete" is equivalent to "full course" in Russian.

Interesting. How much time would this take?
 
  • #113
Herman Trivilino said:
Interesting. How much time would this take?
Incomplete (base course) - 4 semesters.
Complete - 6-8 semesters
 
  • #114
Mike_bb said:
Incomplete (base course) - 4 semesters.

In the US that would be called the introductory sequence of courses. It varies, but I had 3 semesters of introductory physics (introduction to Newtonian physics) plus one semester of modern physics. One could complete or not complete any one of the courses, with or without completing the entire sequence. That is what confused me when you said you had not completed the course.

Mike_bb said:
Complete - 6-8 semesters

8 semesters would be equivalent to a bachelors degree here. After the 4-semester sequence one would complete the degree by taking 4 more semesters that would include courses in thermodynamics and statistical physics, mechanics, electromagnetism, and quantum mechanics.

This is also referred to as the undergraduate course of study. There one might encounter the book you're reading, or it might come later as part of an advanced degree such as a master's degree or a Ph.D.

Along with these physics courses one would also be taking math courses in calculus, differential equations, linear algebra, statistics, etc.

So again I ask you: what is it you're trying to get out of this book? Because from the very beginning of this thread you have been asking questions about the distributions of velocity components of gas molecules. We answer, and then you ask questions that address those same issues. Repeatedly. We don't seem to be able to get through to you.

I would urge you to first read and review the material in those introductory courses on vectors and the ideal gas law. It would make for a much more productive discussion.
 
  • #115
Herman Trivilino said:
In the US that would be called the introductory sequence of courses. It varies, but I had 3 semesters of introductory physics (introduction to Newtonian physics) plus one semester of modern physics. One could complete or not complete any one of the courses, with or without completing the entire sequence. That is what confused me when you said you had not completed the course.
I had 4 semesters: mechanics and molecular physics, electromagnetism and waves, optics and atomic physics, and quantum mechanics.
Herman Trivilino said:
8 semesters would be equivalent to a bachelors degree here. After the 4-semester sequence one would complete the degree by taking 4 more semesters that would include courses in thermodynamics and statistical physics, mechanics, electromagnetism, and quantum mechanics.
Yes, 8 semesters at Russian universities would be equivalent to a bachelor's degree in the USA.
Herman Trivilino said:
Along with these physics courses one would also be taking math courses in calculus, differential equations, linear algebra, statistics, etc.
I had 4 semesters: linear algebra, analytic geometry, vector calculus/vector algebra, differential calculus, integral calculus and series, multivariable calculus, differential equations, probability theory and statistics, and the theory of functions of a complex variable.
Herman Trivilino said:
So again I ask you: what is it you're trying to get out of this book? Because from the very beginning of this thread you have been asking questions about the distributions of velocity components of gas molecules. We answer, and then you ask questions that address those same issues. Repeatedly. We don't seem to be able to get through to you.

I would urge you to first read and review the material in those introductory courses on vectors and the ideal gas law. It would make for a much more productive discussion.
I expected that you would ask me about it. Physics was very difficult for me at university and, as I mentioned above, after a long break I don't remember many things. Not long ago I took up my old textbook to review it, and it was written there that ##<V_x^2> = <V_y^2> = <V_z^2>## because the X-axis, Y-axis and Z-axis are equiprobable.
But I wanted to know about this fact in more detail. That's all.

P.S. Now I fully (as it seems to me) understand what is written in that book. I would like to thank you all for your help, patience and feedback!:smile:
 
  • #116
Mike_bb said:
Not long ago I took up my old textbook to review it, and it was written there that ##<V_x^2> = <V_y^2> = <V_z^2>## because the X-axis, Y-axis and Z-axis are equiprobable.
It may be a language issue, but "equiprobable" is not the correct term to use.

If ##V_x## is a random variable then we could speak of the probability that ##V_x## falls within a particular range. We could speak of the probability that ##V_y## falls within the same range. Or ##V_z##.

If the coordinate velocities are independent and identically distributed then those three events (that the random variables fall into a particular range) would be equiprobable. The converse does not hold.

The premise that the book should be using is that ##V_x##, ##V_y## and ##V_z## are "independent and identically distributed". Not that they are "equiprobable". From that starting point, it is trivially true that any statistical measure (such as mean square) of one distribution will be identical to the same statistical measure of any other identical distribution.

This is all before we get into a discussion about the distinction between the expected value for a distribution and the mean of a particular sample.
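
As a minimal numerical sketch of that premise (iid components, with an arbitrary spread of 1 standing in for the physical value):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
sigma = 1.0   # arbitrary spread; physically sigma**2 would play the role of kT/m

# Vx, Vy, Vz: independent and identically distributed normal components.
vx, vy, vz = (rng.normal(0.0, sigma, size=n) for _ in range(3))

# The same statistical measure of identical distributions gives (nearly) identical values.
print(np.mean(vx**2), np.mean(vy**2), np.mean(vz**2))
# All three are close to sigma**2, consistent with <V_x^2> = <V_y^2> = <V_z^2>,
# up to the sampling fluctuations of a finite sample.
```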
 
  • #117
jbriggs444 said:
The premise that the book should be using is that ##V_x##, ##V_y## and ##V_z## are "independent and identically distributed". Not that they are "equiprobable".
But the book is written in Russian, so it seems it hasn't been translated correctly?

Likely the translation algorithm is for everyday terminology, not technical jargon. Who knows?
 
