Expectation Value of a Stochastic Quantity

SUMMARY

The forum discussion centers on calculating the expectation value of a stochastic quantity modeled by a process similar to geometric Brownian motion, specifically the expression \(\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle\). The variable \(\xi(t)\) is delta-correlated with zero mean and obeys Gaussian statistics. Participants discuss the challenges of expanding the exponential in a power series and the need to understand higher-order correlations of the noise terms. The discussion highlights the importance of recognizing the relationship between the mean and ensemble average in stochastic processes.

PREREQUISITES
  • Understanding of stochastic processes, particularly Gaussian processes.
  • Familiarity with the concepts of expectation values and ensemble averages.
  • Knowledge of delta-correlated noise and its implications in stochastic modeling.
  • Basic understanding of lognormal distributions and their properties.
NEXT STEPS
  • Research the properties of the Wiener process and its applications in stochastic calculus.
  • Study the derivation of the covariance function for delta-correlated Gaussian processes.
  • Explore the lognormal distribution and its mean and variance in relation to underlying normal variables.
  • Investigate methods for calculating higher-order correlations in stochastic processes.
USEFUL FOR

This discussion is beneficial for students and researchers in statistical mechanics, applied mathematics, and financial modeling, particularly those working with stochastic differential equations and Gaussian processes.

JohnFrum

Homework Statement


I'm working on a process similar to geometric Brownian motion (a process with multiplicative noise), and I need to calculate the following expectation/mean:
$$\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle$$
where ##\xi(t)## is delta-correlated, so that ##\langle\xi(t)\rangle=0## and ##\langle\xi(t_1)\xi(t_2)\rangle=\delta(t_1-t_2)##. I also know that ##\xi(t)## obeys Gaussian statistics at all times.

Homework Equations


Those given above.

The Attempt at a Solution


All I could think of doing was to expand the exponential in a power series, which gives me
$$1+\frac{1}{2}x+\frac{1}{3!}\int_{0}^{x}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\xi(t_{3})\rangle \, dt_{1}dt_{2}dt_{3}+\dots$$
But I have no idea how to deal with correlations of more than two of the noise terms. I am new to stochastic processes, so sorry if this is a silly question, but any help and/or guidance would be much appreciated.
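
(For clarity, the ##\tfrac{1}{2}x## term above already uses the delta-correlation: the first-order term vanishes because ##\langle\xi(t)\rangle=0##, and the second-order term is
$$\frac{1}{2!}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\rangle \, dt_{1}dt_{2}=\frac{1}{2}\int_{0}^{x}\int_{0}^{x}\delta(t_{1}-t_{2})\, dt_{1}dt_{2}=\frac{x}{2}.$$
It's the terms with more than two factors of ##\xi## that I'm stuck on.)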

By the way, feel free to move this to the maths section if it would suit better there, I wasn't sure which it suited better.
 
JohnFrum said:

Homework Statement


I'm working on a process similar to geometric Brownian motion (a process with multiplicative noise), and I need to calculate the following expectation/mean:
$$\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle$$
where ##\xi(t)## is delta-correlated, so that ##\langle\xi(t)\rangle=0## and ##\langle\xi(t_1)\xi(t_2)\rangle=\delta(t_1-t_2)##. I also know that ##\xi(t)## obeys Gaussian statistics at all times.

Homework Equations


Those given above.

The Attempt at a Solution


All I could think of doing was to expand the exponential in a power series, which gives me
$$1+\frac{1}{2}x+\frac{1}{3!}\int_{0}^{x}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\xi(t_{3})\rangle \, dt_{1}dt_{2}dt_{3}+\dots$$
But I have no idea how to deal with correlations of more than two of the noise terms. I am new to stochastic processes, so sorry if this is a silly question, but any help and/or guidance would be much appreciated.

By the way, feel free to move this to the maths section if it would suit better there, I wasn't sure which it suited better.
I interpret ##\langle y \rangle## as
##\int_{-\infty}^{+\infty}\psi^*(x) y(x)\psi(x)dx##
where ##f(x) = \psi^*(x)\psi(x)## is a probability density function.

You seem to have left out the probability density function for x in your solution.

I am not sure that this is the problem you are really trying to solve. It would help to see the whole problem and your solution so far.
 
Hi, thanks for your reply. I'll try to give the problem a bit more fully.
I'm considering the stochastic differential equation
$$\frac{dy}{dx}=y(x)\xi(x)$$
where, as I mentioned above, ##\xi(t)## is a stochastic variable obeying stationary Gaussian statistics.
Trivially this can be integrated to yield
$$y(t)=y(0)e^{\int_{0}^{x}\xi(t)dt}$$
I then want to find the ensemble average, ##\langle y(t)\rangle##, which leads to my previous question. (Incidentally, if there is another way to find this without the power series expansion, that would be great!)
I've loosely been following this document, specifically page 2, and they use the same approach I have. They claim that given ##\langle \xi(t) \rangle =0## and ##\langle \xi(t_1)\xi(t_2) \rangle = \delta(t_1-t_2)##, we have that ##\langle \xi(t_1)\xi(t_2)\cdots\xi(t_{2n+1}) \rangle=0## and $$\langle \xi(t_1)\xi(t_2)\cdots\xi(t_{2n})\rangle =\sum_{\text{Permutations}}\delta(t_{i_{1}}-t_{i_{2}})\delta(t_{i_{3}}-t_{i_{4}})\cdots\delta(t_{i_{2n-1}}-t_{i_{2n}}).$$
The document I linked gives the result without proof, and I've been unable to find a proof elsewhere online. Wikipedia did state that for a stationary Gaussian process the behaviour is determined by its second-order behaviour (which is, I guess, what we're seeing here), but it didn't show exactly how (at least not in physicist's language I can understand!).

Given this result, however, the ensemble average of the power series terms can be found and the series summed exactly to find the result I'm after, but I have no idea how to find the ensemble average of more than two ##\xi(t)## terms. In fact, now I come to think of it, I don't really know exactly what ##\langle \xi(t)\rangle## or ##\langle \xi(t_1)\xi(t_2) \rangle## means in a strict mathematical sense.
I hope this clarifies my problem a little, and thanks again for your response.
 
JohnFrum said:
Hi, thanks for your reply. I'll try to give the problem a bit more fully.
I'm considering the stochastic differential equation
$$\frac{dy}{dx}=y(x)\xi(x)$$
where, as I mentioned above, ##\xi(t)## is a stochastic variable obeying stationary Gaussian statistics.
Trivially this can be integrated to yield
$$y(t)=y(0)e^{\int_{0}^{x}\xi(t)dt}$$
I then want to find the ensemble average, ##\langle y(t)\rangle##, which leads to my previous question. (Incidentally, if there is another way to find this without the power series expansion, that would be great!)
I've loosely been following this document, specifically page 2, and they use the same approach I have. They claim that given ##\langle \xi(t) \rangle =0## and ##\langle \xi(t_1)\xi(t_2) \rangle = \delta(t_1-t_2)##, we have that ##\langle \xi(t_1)\xi(t_2)\cdots\xi(t_{2n+1}) \rangle=0## and $$\langle \xi(t_1)\xi(t_2)\cdots\xi(t_{2n})\rangle =\sum_{\text{Permutations}}\delta(t_{i_{1}}-t_{i_{2}})\delta(t_{i_{3}}-t_{i_{4}})\cdots\delta(t_{i_{2n-1}}-t_{i_{2n}}).$$
The document I linked gives the result without proof, and I've been unable to find a proof elsewhere online. Wikipedia did state that for a stationary Gaussian process the behaviour is determined by its second-order behaviour (which is, I guess, what we're seeing here), but it didn't show exactly how (at least not in physicist's language I can understand!).

Given this result, however, the ensemble average of the power series terms can be found and the series summed exactly to find the result I'm after, but I have no idea how to find the ensemble average of more than two ##\xi(t)## terms. In fact, now I come to think of it, I don't really know exactly what ##\langle \xi(t)\rangle## or ##\langle \xi(t_1)\xi(t_2) \rangle## means in a strict mathematical sense.
I hope this clarifies my problem a little, and thanks again for your response.
I think you mean ##y(x)=y(0)e^{\int_{0}^{x}\xi(t)dt}##

##\langle \xi(t)\rangle = \int_{-\infty}^{\infty} \xi f(\xi)\, d\xi## is the mean of ##\xi## at time ##t##. Since you are looking at a stationary process, the mean does not depend on ##t##. You seem to be thinking of this as an ensemble average, but I am not sure that is correct.

Your document calls ##\langle \xi(t_1)\xi(t_2) \rangle## the correlation function, but it looks to me like it should be the covariance ##C(t_1,t_2) = E\{[\xi(t_1)-\langle \xi(t_1)\rangle] [\xi(t_2)-\langle \xi(t_2)\rangle]\}##. This varies depending on the stochastic process you are modeling, and you will have to decide on one to get any further in your calculations. Look up the Wiener process or the Ornstein-Uhlenbeck process for examples.
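
For concreteness, the standard covariance functions for those two examples (textbook results, quoted here only for reference) are
$$C_{\text{Wiener}}(t_1,t_2)=\min(t_1,t_2), \qquad C_{\text{OU}}(t_1,t_2)=\frac{\sigma^{2}}{2\theta}\,e^{-\theta|t_1-t_2|},$$
where ##\theta## is the relaxation rate and ##\sigma## the noise strength of the Ornstein-Uhlenbeck process. The delta-correlated ##\xi(t)## in the original problem is the white-noise idealisation, which can be viewed as the ##\theta\to\infty## limit of the Ornstein-Uhlenbeck covariance with ##\sigma^{2}/\theta^{2}## held fixed.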

Keep reading to page 3. It gives a simpler solution than the intermediate one you are looking at.
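
For completeness, here is a sketch of how the pairing formula you quoted feeds into the power series, taking that formula as given. For ##2n=4## it reads
$$\langle\xi(t_1)\xi(t_2)\xi(t_3)\xi(t_4)\rangle=\delta(t_1-t_2)\delta(t_3-t_4)+\delta(t_1-t_3)\delta(t_2-t_4)+\delta(t_1-t_4)\delta(t_2-t_3),$$
so the fourth-order term of the expansion is ##\frac{1}{4!}\cdot 3x^{2}=\frac{x^{2}}{2^{2}\,2!}##. In general there are ##(2n-1)!!=\frac{(2n)!}{2^{n}n!}## pairings, each of which integrates to ##x^{n}##, so the ##2n##-th term is ##\frac{x^{n}}{2^{n}n!}## and the series sums to ##e^{x/2}##.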
 
JohnFrum said:

Homework Statement


I'm working on a process similar to geometric Brownian motion (a process with multiplicative noise), and I need to calculate the following expectation/mean:
$$\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle$$
where ##\xi(t)## is delta-correlated, so that ##\langle\xi(t)\rangle=0## and ##\langle\xi(t_1)\xi(t_2)\rangle=\delta(t_1-t_2)##. I also know that ##\xi(t)## obeys Gaussian statistics at all times.

Homework Equations


Those given above.

The Attempt at a Solution


All I could think of doing was to expand the exponential in a power series, which gives me
$$1+\frac{1}{2}x+\frac{1}{3!}\int_{0}^{x}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\xi(t_{3})\rangle \, dt_{1}dt_{2}dt_{3}+\dots$$
But I have no idea how to deal with correlations of more than two of the noise terms. I am new to stochastic processes, so sorry if this is a silly question, but any help and/or guidance would be much appreciated.

By the way, feel free to move this to the maths section if it would suit better there, I wasn't sure which it suited better.

The random variable ##Y(t) = \exp(\int_0^t \xi(\tau) \, d\tau)## has a lognormal distribution. It is essentially elementary to obtain the mean and variance of a lognormal R.V. in terms of the mean and variance of the underlying normal. Google "lognormal distribution".
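
For reference, the standard lognormal moment formulas are: if ##Z \sim N(\mu,\sigma^{2})##, then
$$E[e^{Z}]=e^{\mu+\sigma^{2}/2},\qquad \operatorname{Var}(e^{Z})=\left(e^{\sigma^{2}}-1\right)e^{2\mu+\sigma^{2}}.$$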
 
Ray Vickson said:
The random variable ##Y(t) = \exp(\int_0^t \xi(\tau) \, d\tau)## has a lognormal distribution. It is essentially elementary to obtain the mean and variance of a lognormal R.V. in terms of the mean and variance of the underlying normal. Google "lognormal distribution".
##\exp(\int_0^t \xi(\tau) \, d\tau)## would have a lognormal distribution if ##\int_0^t \xi(\tau)\, d\tau## were a Gaussian random variable. That would be true if ##\xi(t)## were a Wiener process. Is it true for stationary processes in general?
 
tnich said:
##\exp(\int_0^t \xi(\tau) \, d\tau)## would have a lognormal distribution if ##\int_0^t \xi(\tau)\, d\tau## were a Gaussian random variable. That would be true if ##\xi(t)## were a Wiener process. Is it true for stationary processes in general?

Think of a sum instead of an integral: we have a sum of normally-distributed random variables. While the summands are dependent (being correlated), they are part of a large multivariate normal vector ##(\xi(t_1), \xi(t_2), \ldots, \xi(t_n))##, and so the sum ##\xi(t_1) + \xi(t_2) + \cdots + \xi(t_n)## is itself normal (as is any constant-coefficient sum of the form ##a_1 \xi(t_1) + a_2 \xi(t_2) + \cdots + a_n \xi(t_n)##).
See, e.g., http://www.maths.manchester.ac.uk/~mkt/MT3732%20(MVA)/Notes/MVA_Section3.pdf
or https://brilliant.org/wiki/multivariate-normal-distribution/
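
Specialising to the delta-correlated case of the original problem, the same reasoning gives the variance of the exponent directly from the stated correlation function:
$$\operatorname{Var}\!\left(\int_{0}^{x}\xi(t)\,dt\right)=\int_{0}^{x}\!\int_{0}^{x}\langle\xi(t_1)\xi(t_2)\rangle\,dt_1\,dt_2=\int_{0}^{x}\!\int_{0}^{x}\delta(t_1-t_2)\,dt_1\,dt_2=x,$$
so ##\int_{0}^{x}\xi(t)\,dt \sim N(0,x)##, and the lognormal moment formulas then give ##\langle y(x)\rangle = y(0)\,e^{x/2}##.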
 
tnich said:
I think you mean ##y(x)=y(0)e^{\int_{0}^{x}\xi(t)dt}##

##\langle \xi(t)\rangle = \int_{-\infty}^{\infty} \xi f(\xi)\, d\xi## is the mean of ##\xi## at time ##t##. Since you are looking at a stationary process, the mean does not depend on ##t##. You seem to be thinking of this as an ensemble average, but I am not sure that is correct.

Your document calls ##\langle \xi(t_1)\xi(t_2) \rangle## the correlation function, but it looks to me like it should be the covariance ##C(t_1,t_2) = E\{[\xi(t_1)-\langle \xi(t_1)\rangle] [\xi(t_2)-\langle \xi(t_2)\rangle]\}##. This varies depending on the stochastic process you are modeling, and you will have to decide on one to get any further in your calculations. Look up the Wiener process or the Ornstein-Uhlenbeck process for examples.

Keep reading to page 3. It gives a simpler solution than the intermediate one you are looking at.

You're correct about it being ##y(x)## rather than ##y(t)##; sorry for the typo.
##\langle y(x) \rangle## is supposed to be the mean. Sorry, I know it sounds silly, but I didn't realize the mean and ensemble average were different until you mentioned it. (I am really very new to this sort of thing.)
After looking into it I think that you're right and ##\langle \xi(t_1)\xi(t_2) \rangle## is supposed to be the covariance function, but then what would, for example, ##\langle \xi(t_1)\xi(t_2)\xi(t_3)\rangle## be?
As stated, ##\xi(t)## is a delta-correlated Gaussian process, which I think is equivalent to a Wiener process, correct? This would give us the PDF ##p(x,t)=\frac{1}{\sqrt{2\pi t}}e^{-x^2/2t}## for ##\xi(t)##.
I did have a look at page 3 and onwards of the document, and the simpler solution they give is what I'm after, but it seems that to obtain it I need the expression they used for the ##\langle \xi(t_1)\xi(t_2)\xi(t_3)\cdots\xi(t_n)\rangle## terms, which I have no idea how to obtain.
Thanks again for your valuable reply and your help, and sorry that lots of my terminology is a bit vague; I'm still very new to statistical mechanics and probability theory in general.

Ray Vickson said:
The random variable ##Y(t) = \exp(\int_0^t \xi(\tau) \, d\tau)## has a lognormal distribution. It is essentially elementary to obtain the mean and variance of a lognormal R.V. in terms of the mean and variance of the underlying normal. Google "lognormal distribution".
This is a really clever idea, thanks.
 
JohnFrum said:
As stated, ##\xi(t)## is a delta-correlated Gaussian process, which I think is equivalent to a Wiener process, correct? This would give us the PDF ##p(x,t)=\frac{1}{\sqrt{2\pi t}}e^{-x^2/2t}## for ##\xi(t)##.
Yes, it is a Wiener process, so you can use properties of the Wiener process, one of which is ##\int_{0}^{x}\xi(t)dt \sim N(0, x)##. That is a little different than what you have stated above. So now you can use @RayVickson's idea directly without calculating any covariances.
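
As a numerical sanity check of that property, here is a rough Monte Carlo sketch that approximates the white-noise integral by a sum of independent Gaussian increments; the values of ##x##, the step count, and the sample size below are arbitrary choices.

```python
import numpy as np

# Rough Monte Carlo check: approximate W = \int_0^x xi(t) dt for
# delta-correlated Gaussian noise by a sum of independent N(0, dt)
# increments, so that W ~ N(0, x), and compare <exp(W)> with exp(x/2).
rng = np.random.default_rng(0)
x = 2.0            # upper limit of integration (arbitrary choice)
n_steps = 1_000    # number of time steps in the discretisation
n_paths = 200_000  # number of sample paths
dt = x / n_steps

# Accumulate the discretised integral one time step at a time.
W = np.zeros(n_paths)
for _ in range(n_steps):
    W += rng.normal(0.0, np.sqrt(dt), size=n_paths)

print("Monte Carlo  <exp(W)> :", np.exp(W).mean())
print("Prediction   exp(x/2) :", np.exp(x / 2))
```

The two printed numbers should agree to within Monte Carlo error, consistent with ##\langle y(x)\rangle = y(0)\,e^{x/2}## for ##y(0)=1##.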
 
