Graduate Proof of Classical Fluctuation-Dissipation Theorem

SUMMARY

The forum discussion centers on the proof of the Classical Fluctuation-Dissipation Theorem (FDT) outlined on Wikipedia. The user successfully derived the first-order perturbation expression for the probability density function ##W(x,0)##, but seeks help deriving the expected value ##\langle x(t) \rangle## using the conditional probability ##P(x,t|x',0)##. The user is unsure how to evaluate the integral ##I = \int dx \text{ } x \int dx' P(x,t|x',0) W_0 (x') x'(0)## and also raises concerns about the accuracy of the FDT as stated in the Wikipedia article.

PREREQUISITES
  • Understanding of probability density functions in statistical mechanics
  • Familiarity with perturbation theory and Taylor series expansions
  • Knowledge of the Fluctuation-Dissipation Theorem (FDT)
  • Experience with conditional probabilities in dynamical systems
NEXT STEPS
  • Research the derivation of the Classical Fluctuation-Dissipation Theorem in detail
  • Learn how to evaluate integrals involving conditional probabilities in statistical mechanics
  • Study the implications of the Langevin equation in the context of FDT
  • Explore alternative proofs of the Fluctuation-Dissipation Theorem for deeper understanding
USEFUL FOR

Physicists, particularly those specializing in statistical mechanics, researchers studying dynamical systems, and graduate students seeking to understand the Fluctuation-Dissipation Theorem and its applications.

Twigg
Science Advisor
Gold Member
TL;DR
Found a proof of the classical version of the fluctuation-dissipation theorem, but couldn't figure out one step. It involves classical transition probabilities for a stochastic, dissipative system. How does one evaluate the integral ##\int dx \text{ } x \int dx' P(x,t|x',0) W_0 (x') x'(0)##?
Sorry if there are LaTeX errors. My internet connection is so bad I can't preview.

Here's the Wikipedia proof I'm referring to. I'm fine with the steps up to $$W(x,0) = W_0 (x) [1 + \beta f_0 (x(0) - \langle x \rangle_0) ]$$ where ##W(x,t)## is the probability density of finding the system in state ##x## at time ##t##, ##f_0## is the magnitude of a weak perturbation (##f_0 x \ll kT##) on the Hamiltonian ##H## of the form ##H = H_0 - x f_0 \Theta (-t)##, ##\Theta (t)## is the Heaviside step function, ##\langle \cdot \rangle_0## is the expectation value in the unperturbed equilibrium distribution (called ##W_0 (x)##), and ##\beta = \frac{1}{kT}##.
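For anyone checking my work: that first-order expression is just the perturbed Boltzmann factor, normalized and expanded to first order in ##\beta f_0##, $$W(x,0) = \frac{e^{-\beta [H_0(x) - f_0 x]}}{\int dx' \, e^{-\beta [H_0(x') - f_0 x']}} \approx W_0 (x) \, \frac{1 + \beta f_0 x}{1 + \beta f_0 \langle x \rangle_0} \approx W_0 (x) \left[ 1 + \beta f_0 (x - \langle x \rangle_0) \right].$$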

I was able to reproduce this approximate, first-order perturbation expression for ##W(x,0)## just fine using a Taylor/binomial series. It's the next step that I'm stuck on: using this distribution to prove $$\langle x(t) \rangle = \langle x \rangle_0 + \beta f_0 A(t)$$ where ##A(t) = \langle [x(t) - \langle x \rangle_0][x(0) - \langle x \rangle_0] \rangle_0## is the autocorrelation function.
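For what it's worth, here's a minimal sketch of how I'd sanity-check that statement numerically, using an overdamped Langevin oscillator (an Ornstein-Uhlenbeck process) where everything is Gaussian. The model, parameters, and Euler-Maruyama discretization are my own choices, not anything from the article:

Python:
# Sanity check of <x(t)> = <x>_0 + beta*f0*A(t) for an overdamped Langevin
# oscillator: gamma*dx/dt = -kappa*x + f0*Theta(-t) + eta(t), with
# <eta(t) eta(t')> = 2*gamma*kT*delta(t - t').  Parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
gamma, kappa, kT, f0 = 1.0, 1.0, 1.0, 0.1   # f0*x ~ 0.1 kT, so linear response applies
dt, nsteps, nsamples = 1e-3, 4000, 100_000

# t <= 0: equilibrium with the force on -> Gaussian, mean f0/kappa, variance kT/kappa
x = rng.normal(loc=f0 / kappa, scale=np.sqrt(kT / kappa), size=nsamples)

t = np.arange(nsteps) * dt
mean_x = np.empty(nsteps)
for i in range(nsteps):
    mean_x[i] = x.mean()
    # Euler-Maruyama step of the *unperturbed* dynamics (force switched off for t > 0)
    noise = rng.normal(scale=np.sqrt(2 * gamma * kT * dt), size=nsamples)
    x += (-kappa * x * dt + noise) / gamma

# Unperturbed equilibrium averages: <x>_0 = 0 and A(t) = (kT/kappa)*exp(-kappa*t/gamma)
prediction = (f0 / kT) * (kT / kappa) * np.exp(-kappa * t / gamma)
print(np.max(np.abs(mean_x - prediction)))  # should be small, up to sampling noise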

I'm confident that the general procedure is to write $$\langle x(t) \rangle = \int dx \text{ } x \int dx' P(x,t|x',0) W(x',0) $$ where ##P(x,t|x',0)## is the conditional probability of finding the system in state ##x## at time ##t## given that it was in state ##x'## at time 0. I can see why ##\int dx' P(x,t|x',0) W_0 (x') = W_0(x)##, since ##W_0## is an equilibrium distribution. I also have a hunch that ##\lim_{t \rightarrow \infty} \int dx' P(x,t|x',0) W(x',0) = W_0(x)## for any initial distribution, since it's a dissipative system and will reach equilibrium eventually.

However, given all this, what I get is $$\langle x(t) \rangle = \int dx \text{ } x \int dx' P(x,t|x',0) W_0 (x') [1 + \beta f_0 (x'(0) - \langle x \rangle_0) ] = \langle x \rangle_0 + \beta f_0 \left[ \int dx \text{ } x \int dx' P(x,t|x',0) W_0 (x') x'(0) - \int dx \text{ } x W_0(x) \langle x \rangle_0 \right]$$ $$= \langle x \rangle_0 + \beta f_0 \left[ \int dx \text{ } x \int dx' P(x,t|x',0) W_0 (x') x'(0) - \langle x \rangle_0^2 \right].$$

It's the integral $$I = \int dx \text{ } x \int dx' P(x,t|x',0) W_0 (x') x'(0)$$ that gets me. I have no idea how to evaluate this, because the thing that ##P(x,t|x',0)## is multiplied by isn't a probability distribution. Can anyone point me in the right direction?
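To at least get a feel for ##I##, here's a sketch of how I'd evaluate it numerically for a toy model where both ##W_0## and ##P(x,t|x',0)## are known Gaussians (an Ornstein-Uhlenbeck process), reading ##x'(0)## as the integration variable ##x'##. The parameters and grid are arbitrary:

Python:
# Numerical look at I = int dx x int dx' P(x,t|x',0) W_0(x') x' for an
# Ornstein-Uhlenbeck process dx = -theta*x dt + sqrt(2D) dW, where the
# transition density is known in closed form.  Everything here is illustrative.
import numpy as np

theta, D, t = 1.0, 1.0, 0.7
var0 = D / theta                                # variance of the equilibrium W_0

xs = np.linspace(-8.0, 8.0, 801)                # shared grid for x and x'
h = xs[1] - xs[0]
X, Xp = np.meshgrid(xs, xs, indexing="ij")      # X = x, Xp = x'

W0 = np.exp(-Xp**2 / (2 * var0)) / np.sqrt(2 * np.pi * var0)

# OU transition density: Gaussian in x with mean x'*exp(-theta*t) and
# variance var0*(1 - exp(-2*theta*t))
m = Xp * np.exp(-theta * t)
v = var0 * (1 - np.exp(-2 * theta * t))
P = np.exp(-(X - m)**2 / (2 * v)) / np.sqrt(2 * np.pi * v)

I = np.sum(X * Xp * P * W0) * h * h             # double integral on the grid
print(I, var0 * np.exp(-theta * t))             # agrees with <x(t)x(0)>_0 for the OU process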

Alternatively, if you have a different concise proof, I'm all ears. I was interested in this proof because it's (relatively) short and might be easier to remember.
 
Update: I'm having serious doubts about the form of the FDT stated in the Wikipedia article. They claim $$S_x (\omega) = \frac{2kT}{\omega} \mathrm{Im}[\chi(\omega)].$$ However, for the vanilla Langevin equation $$m\ddot{x} = -\gamma \dot{x} + \eta,$$ I am fairly confident that $$\lim_{t \rightarrow \infty} \langle \dot{x}(t+\tau) \dot{x}(t) \rangle \propto e^{-(\gamma/m) \tau}.$$ In other words, $$\lim_{t \rightarrow \infty} \langle \dot{x}(t+\tau) \dot{x}(t) \rangle \propto \chi(\tau),$$ and that's inconsistent with the Wikipedia article's claim, since the Fourier transform of ##e^{-(\gamma/m) \tau}## has a non-zero real part.
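For reference, the standard result I have in mind for that equation, assuming white noise with ##\langle \eta(t) \eta(t') \rangle = 2 \gamma k T \, \delta(t - t')##, is $$\lim_{t \rightarrow \infty} \langle \dot{x}(t+\tau) \dot{x}(t) \rangle = \frac{kT}{m} \, e^{-(\gamma/m) |\tau|},$$ which reduces to equipartition, ##\langle \dot{x}^2 \rangle = kT/m##, at ##\tau = 0##.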
 
