Stochastic differential equations question. An overview

Summary
Stochastic differential equations (SDEs) require specialized techniques for solutions, particularly through Ito and Stratonovich calculus, as described in Oksendal's work. These methods convert SDEs into recurrence relations, allowing for the calculation of probabilities by convolving distributions. While Ito's lemma aids in solving certain integrals, the rigorous definition of these equations often relies on integral forms due to the non-differentiable nature of random processes. The discussion highlights the applicability of these techniques across various fields, including biology, physics, finance, and electrical engineering. Understanding the distinctions between Markovian and non-Markovian processes is crucial, as memory effects complicate the solution of SDEs.
rigetFrog
I've been reading Oksendal, and it's quite tedious. I want to see if my understanding of the motivation and process is correct.

1) Differential equations that contain random terms need special techniques to be solved.

2) Ito and Stratonovich extended calculus to apply to random variables.

3) Oksendal uses Ito/Stratonovich calculus to solve differential equations.

4) This method works by converting the differential equation into a recurrence relation (e.g. of the form x(t+1) = x(t) + dt*(a*x(t) + 'noise')).

5) This sort of problem can be solved: the probability P(x(t+1)) can be obtained by convolving P(x(t)) with the distribution of everything in the dt term.
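The recurrence in (4) is essentially the Euler-Maruyama scheme. A minimal Python sketch of it (my own illustration, not from Oksendal; the coefficient names `a` and `sigma` are assumptions) for an SDE of the form dX = a*X dt + sigma dW:

```python
import numpy as np

def euler_maruyama(a, sigma, x0, dt, n_steps, rng):
    """Simulate dX = a*X dt + sigma dW via the recurrence
    x(t+dt) = x(t) + a*x(t)*dt + sigma*sqrt(dt)*N(0,1)."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        # The Gaussian increment has variance dt, hence the sqrt(dt) scaling.
        x[i + 1] = x[i] + a * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

rng = np.random.default_rng(0)
path = euler_maruyama(a=-1.0, sigma=0.3, x0=1.0, dt=0.01, n_steps=1000, rng=rng)
```

Because each step adds an independent Gaussian increment to a function of the current state, the density at the next step is exactly the convolution described in (5).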

What other nuggets of info should I be taking away from this book?

Are there other techniques for solving stochastic differential equations that don't require converting into recurrence relations?

EE people are typically happy once they have the filter's frequency response.

It would be cool to see an overview of how each field (bio, physics, finance, EE, etc...) deals with randomness.
 
Typically, you solve the SDE by converting it to the associated (Ito or Stratonovich) integral equation. Only the integral equation has a rigorous definition (constructed as a mean square limit), since random processes are very often nowhere differentiable.
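The mean square limit definition can be illustrated numerically. A sketch (my own, with an assumed uniform discretization) showing that the left-endpoint Riemann sums defining the Ito integral of W dW over [0, T] converge to (W(T)^2 - T)/2, rather than the classical-calculus answer W(T)^2/2:

```python
import numpy as np

# Build one Brownian path on [0, T] from n independent Gaussian increments.
rng = np.random.default_rng(3)
n, T = 200_000, 1.0
dW = rng.standard_normal(n) * np.sqrt(T / n)
W = np.concatenate(([0.0], np.cumsum(dW)))

# Left-endpoint sums: this is the Ito convention (non-anticipating).
ito_sum = np.sum(W[:-1] * dW)

# Ito's lemma gives the closed form for this integral.
exact = (W[-1] ** 2 - T) / 2
```

Using midpoints instead of left endpoints would give the Stratonovich value W(T)^2/2, which is why the two calculi differ.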

Solving an Ito integral can sometimes be done using Ito's lemma, which is basically the chain rule for Ito calculus. In this way, you can find, for example, that the Langevin equation's solution is the Ornstein-Uhlenbeck process (sometimes the OU process itself is defined from the Langevin equation).
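A quick Monte Carlo sketch (mine; the parameter names and values are assumptions) checking that simulating the Langevin equation dX = -theta*X dt + sigma dW reproduces the known Ornstein-Uhlenbeck mean and variance:

```python
import numpy as np

# OU statistics at time t: mean x0*exp(-theta*t),
# variance (sigma^2 / (2*theta)) * (1 - exp(-2*theta*t)).
theta, sigma, x0 = 1.0, 0.5, 1.0
dt, n_steps, n_paths = 0.01, 500, 20_000
rng = np.random.default_rng(1)

# Vectorized Euler-Maruyama over many independent paths.
x = np.full(n_paths, x0)
for _ in range(n_steps):
    x = x - theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

t = n_steps * dt
exact_mean = x0 * np.exp(-theta * t)
exact_var = sigma**2 / (2 * theta) * (1 - np.exp(-2 * theta * t))
```

The empirical mean and variance of `x` should match `exact_mean` and `exact_var` up to Monte Carlo and discretization error.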

The recurrence relation you mention is valid for a Markov process, and it looks to me like a regular Langevin equation (if I misspeak, please correct me). But for non-Markovian processes (processes which have "memory"), that equation may not hold: you will get terms with correlations with previous times. Still, the Ito calculus is well defined for any adapted (non-anticipating) process over a semi-martingale.
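The "correlations with previous times" can be made concrete. A hypothetical sketch (the exponential kernel, parameter names, and values are all my assumptions) of a recurrence whose drift depends on the entire past trajectory, so the next state cannot be computed from the current state alone:

```python
import numpy as np

def memory_step(history, dt, sigma, rng, tau=0.5):
    """One step of x' = -integral_0^t K(t-s) x(s) ds + noise,
    with a fading exponential memory kernel K(u) = exp(-u/tau)."""
    hist = np.asarray(history)
    t_n = (len(hist) - 1) * dt
    times = np.arange(len(hist)) * dt
    kernel = np.exp(-(t_n - times) / tau)
    # Riemann-sum approximation of the memory integral over the whole past.
    memory_integral = np.sum(kernel * hist) * dt
    noise = sigma * np.sqrt(dt) * rng.standard_normal()
    return hist[-1] - memory_integral * dt + noise

rng = np.random.default_rng(2)
path = [1.0]
for _ in range(200):
    path.append(memory_step(path, dt=0.05, sigma=0.2, rng=rng))
```

Note that each step rereads the full history, unlike the Markovian recurrence x(t+1) = f(x(t), noise) from the original post.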
 
