ErezAgh
When using Euler's method to integrate a stochastic differential equation, for example the Langevin equation
dv/dt = −γv + sqrt(ϵ)·Γ(t)
we loop over
v[n+1] = v[n] − γ·v[n]·Δt + sqrt(ϵ·Δt)·Γn.
(Here −γ·v[n] is the force term, which could be any force, and Γn is a Gaussian-distributed random variable.)
Then, if Δt is not chosen small enough, over long runs (many repeated iterations) the solution becomes unstable: oscillations appear around the analytic solution, with an amplitude that grows larger and larger with time. (This is paraphrased from several sources I found online that mention the problem but don't discuss it in depth.)
Why are these actual OSCILLATIONS, and not simply random fluctuations? What is the period of these oscillations?
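For concreteness, here is a minimal sketch of the loop above in plain NumPy (all parameter values are arbitrary, chosen only to show the two regimes). The deterministic part of the update multiplies v by (1 − γ·Δt) each step, so when γ·Δt > 2 that factor is below −1, and v flips sign and grows every iteration, which is the behavior I'm asking about:

```python
import numpy as np

def simulate(gamma, dt, eps=0.1, n_steps=200, seed=0):
    """Euler-Maruyama loop: v[n+1] = v[n] - gamma*v[n]*dt + sqrt(eps*dt)*Gamma_n."""
    rng = np.random.default_rng(seed)
    v = np.empty(n_steps + 1)
    v[0] = 1.0
    for n in range(n_steps):
        noise = np.sqrt(eps * dt) * rng.standard_normal()
        v[n + 1] = v[n] - gamma * v[n] * dt + noise
    return v

# Stable regime: 0 < gamma*dt < 2, so |1 - gamma*dt| < 1 and v relaxes.
stable = simulate(gamma=1.0, dt=0.1)

# Unstable regime: gamma*dt > 2, so (1 - gamma*dt) < -1 and the
# deterministic part of v changes sign and grows in magnitude each step.
unstable = simulate(gamma=1.0, dt=2.5)

print(abs(stable[-1]))    # remains small
print(abs(unstable[-1]))  # grows without bound
```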