I Evolution of a Boltzmann distribution

AI Thread Summary
In a classical system at a fixed temperature, energy measurements across an ensemble of identically prepared systems follow a Boltzmann distribution in the limit of many measurements. The discussion asks whether repeated measurements on the same system over time would still yield a Boltzmann distribution, or whether the measurements are correlated, so that the outcome of one measurement affects the probability of the next. It highlights the distinction between ensemble averages and time averages in statistical physics, noting that for systems in equilibrium these averages are generally equivalent, as stated in the ergodic theorem. The nuances of measurement correlations and their implications for statistical behavior are acknowledged but not deeply explored.
kelly0303
Hello! Assume I have a classical system at a fixed temperature, such that its energy can be described by a Boltzmann distribution at that temperature. If I have a huge number of such systems in that state and I measure the energy of each one independently, the probability of measuring a given energy would approach the Boltzmann distribution (in the limit of a large number of measurements). However, if I measure the energy of a system to be ##E_1##, measure the same system again a time ##t## later, and repeat that many times, would I still get a Boltzmann distribution? My question is about the classical case; I am not talking about wavefunction collapse (and the way you measure the energy shouldn't be important either). My question mainly is: are the measurements correlated, such that for a given time interval between measurements, the probability of the second measurement depends on the value of the first one?
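A minimal numerical sketch of this question (a toy model of my own choosing, not anything specified in the thread): a single classical system with five discrete energy levels, evolving under Metropolis dynamics at temperature ##T##. Sampling the same trajectory repeatedly reproduces the Boltzmann distribution, while the energy autocorrelation shows that closely spaced measurements are correlated and widely spaced ones are effectively independent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative, not from the thread): a single classical system
# with discrete energy levels E_n = n for n = 0..4, evolving under
# Metropolis dynamics at temperature T (units with k_B = 1).
levels = np.arange(5)
T = 1.0

def metropolis_step(n):
    """One stochastic update of the system's state."""
    m = rng.integers(0, len(levels))              # propose a random level
    dE = levels[m] - levels[n]
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        return m                                  # accept the move
    return n                                      # reject: stay put

# One long trajectory = repeated measurements on the SAME system.
steps = 200_000
traj = np.empty(steps, dtype=int)
state = 0
for i in range(steps):
    state = metropolis_step(state)
    traj[i] = state

# Occupation frequencies of the time series vs. the Boltzmann weights.
p_time = np.bincount(traj, minlength=len(levels)) / steps
p_boltz = np.exp(-levels / T)
p_boltz /= p_boltz.sum()

# Normalized energy autocorrelation at lag t: measurements a short time
# apart are correlated; widely separated ones are effectively independent.
def autocorr(x, t):
    x = x - x.mean()
    return np.dot(x[:-t], x[t:]) / np.dot(x, x)

E = levels[traj].astype(float)
print(p_time, p_boltz)       # the two should be close
print(autocorr(E, 1))        # noticeably positive: correlated
print(autocorr(E, 1000))     # ~ 0: decorrelated
```

So in this sketch the answer is "both": the histogram of repeated measurements on one system still converges to the Boltzmann distribution, but for short time intervals the second measurement does depend on the first.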
 
Just a point of nomenclature: you are worrying about the difference between an ensemble average and a time average for a random process. For a discrete random process (say a coin toss) the two are equivalent, I think. There are clearly many nuances here, which is why I offer you the nomenclature for further study... and bow out.
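A quick sanity check of the coin-toss case (illustrative code, assuming a fair coin): tossing many independent coins once each (ensemble average) and tossing one coin many times in succession (time average) give the same answer, because successive tosses are independent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble average: one toss each from many independent fair coins.
ensemble = rng.integers(0, 2, size=100_000)

# Time average: many successive tosses of a single fair coin.
single_coin = rng.integers(0, 2, size=100_000)

# Successive tosses are independent, so both averages converge to 1/2.
print(ensemble.mean(), single_coin.mean())
```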
 
One of the major assumptions of statistical physics is that for a system at equilibrium, ensemble average = time average. This is often referred to as the ergodic theorem.
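As an illustration of that statement (a sketch with parameters I've chosen myself, not from the thread): for an overdamped classical harmonic oscillator coupled to a heat bath via Langevin dynamics, the time average of the potential energy along a single trajectory matches the ensemble (equipartition) value ##k_B T/2##.

```python
import numpy as np

rng = np.random.default_rng(2)

# Overdamped Langevin dynamics for a classical harmonic oscillator,
# U(x) = x**2 / 2, at temperature T (units with k_B = 1). Parameter
# values are illustrative choices.
T, dt, steps = 1.0, 0.01, 500_000
x = 0.0
energies = np.empty(steps)
for i in range(steps):
    # Euler-Maruyama step: drift -x dt plus thermal noise sqrt(2 T dt).
    x += -x * dt + np.sqrt(2 * T * dt) * rng.normal()
    energies[i] = 0.5 * x * x

# Ergodicity: the time average of U along one trajectory should match
# the ensemble (equipartition) value <U> = T/2 = 0.5.
print(energies.mean())
```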
 