Evolution of a Boltzmann distribution

AI Thread Summary
In a classical system at a fixed temperature, energy measurements can be described by a Boltzmann distribution, especially as the number of measurements increases. The discussion raises the question of whether repeated measurements on the same system over time would still yield a Boltzmann distribution, or if the measurements are correlated, affecting the probability of subsequent measurements based on prior results. It highlights the distinction between ensemble averages and time averages in statistical physics, emphasizing that for systems in equilibrium, these averages are generally equivalent, as stated in the ergodic theorem. The nuances of measurement correlations and their implications for statistical behavior are acknowledged but not deeply explored. Understanding these concepts is essential for interpreting results in classical statistical mechanics.
kelly0303
Hello! Assume I have a classical system at a fixed temperature, such that its energy is described by a Boltzmann distribution at that temperature. If I have a huge number of such systems in that state and I measure the energy of each one independently, the distribution of measured energies would approach the Boltzmann distribution (in the limit of a large number of measurements). However, if I measure the energy of a single system to be ##E_1##, then measure the same system again a time ##t## later, and repeat this many times, would I still get a Boltzmann distribution? My question concerns the classical case, so I am not talking about wavefunction collapse (and the way the energy is measured shouldn't matter either). Mainly: are the measurements correlated, such that for a given time interval between measurements, the probability of the second result depends on the value of the first one?
 
Just a point of nomenclature: you are asking about the difference between an ensemble average and a time average for a random process. For a discrete random process (say a coin toss) the two are equivalent, I think. There are clearly many nuances here, which is why I offer you the nomenclature for further study... and bow out.
 
One of the major assumptions of statistical physics is that for a system at equilibrium, ensemble average = time average. This is often referred to as the ergodic theorem.
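As a concrete (if idealized) illustration of both points, here is a minimal sketch using Metropolis dynamics for a 1D harmonic oscillator with ##E(x) = x^2/2## at ##\beta = 1##. All names and parameters here are my own illustrative choices, not anything from the thread: the point is that successive "measurements" along one trajectory are strongly correlated at short time lags, yet the long-time average of the energy still converges to the ensemble (equipartition) value ##k_B T/2##.

```python
import math
import random

def sample_energies(beta=1.0, step=0.5, n_steps=200_000, seed=1):
    """Metropolis dynamics for a 1D harmonic oscillator, E(x) = x^2 / 2.
    Returns the sequence of energies 'measured' at successive time steps."""
    rng = random.Random(seed)
    x = 0.0
    energies = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)          # small local move
        dE = 0.5 * x_new**2 - 0.5 * x**2
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            x = x_new                                  # accept the move
        energies.append(0.5 * x**2)
    return energies

def autocorr(xs, lag):
    """Normalized autocorrelation of the sequence xs at the given lag."""
    m = sum(xs) / len(xs)
    var = sum((v - m) ** 2 for v in xs) / len(xs)
    n = len(xs) - lag
    cov = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n)) / n
    return cov / var

energies = sample_energies()
mean_E = sum(energies) / len(energies)   # time average, approaches k_B T / 2 = 0.5
c1 = autocorr(energies, 1)               # strongly positive: consecutive measurements are correlated
```

With a small proposal step the system moves slowly through configuration space, so the lag-1 autocorrelation of the energy is close to 1 and decays only over many steps; waiting longer between measurements (larger lag ##t##) decorrelates them, and in that regime the time histogram of measured energies reproduces the Boltzmann distribution, just as the ergodic assumption says.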
 