In summary, the conversation discusses the problem of calculating the expectation/mean of a process similar to geometric Brownian motion. The process is driven by a delta-correlated stochastic variable with Gaussian statistics, and the goal is to find the ensemble average of the terms in a power-series expansion. The conversation also mentions a document that takes the same approach, along with the claim that a stationary Gaussian process is determined by its second-order behavior. However, no proof of this claim is provided, and how to calculate the ensemble average of more than two noise terms remains unclear.
  • #1
JohnFrum

Homework Statement


I'm working on a process similar to geometric Brownian motion (a process with multiplicative noise), and I need to calculate the following expectation/mean:
[tex]\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle[/tex]
Where [itex]\xi(t)[/itex] is delta-correlated so that [itex]\langle\xi(t)\rangle=0[/itex], and [itex]\langle\xi(t_1)\xi(t_2)\rangle=\delta(t_1-t_2)[/itex]. I also know that [itex]\xi(t)[/itex] obeys Gaussian statistics at all times.

Homework Equations


Those given above.

The Attempt at a Solution


All I could think of doing was expanding the exponential in a power series, which gives me:
[tex]1+\frac{1}{2}x+\frac{1}{3!}\int_{0}^{x}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\xi(t_{3})\rangle dt_{1}dt_{2}dt_{3}+...[/tex]
But I have no idea how to deal with the correlations of more than two of the noise terms. I am new to stochastic processes, so sorry if this is a silly question, but any help and/or guidance would be much appreciated.

By the way, feel free to move this to the maths section if it would suit better there, I wasn't sure which it suited better.
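The expectation in question can at least be estimated numerically. A quick Monte Carlo sketch, assuming the usual white-noise discretization (over a step ##\Delta t##, the integral of ##\xi## is an independent ##N(0,\Delta t)## variable; the step and path counts below are arbitrary choices):

```python
import numpy as np

# Monte Carlo estimate of <exp(integral_0^x xi(t) dt)> for
# delta-correlated Gaussian noise: discretize [0, x] into steps of
# length dt, over which the integral of xi is N(0, dt).
rng = np.random.default_rng(0)
x, n_steps, n_paths = 1.0, 200, 50_000
dt = x / n_steps

increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
integral = increments.sum(axis=1)    # one sample of the integral per path
estimate = np.exp(integral).mean()   # ensemble average of y over the paths

print(estimate)
```

With these parameters the estimate comes out close to ##e^{1/2}\approx 1.65##, matching the closed form reached later in the thread.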
 
  • #2
JohnFrum said:

Homework Statement


I'm working on a process similar to geometric Brownian motion (a process with multiplicative noise), and I need to calculate the following expectation/mean:
[tex]\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle[/tex]
Where [itex]\xi(t)[/itex] is delta-correlated so that [itex]\langle\xi(t)\rangle=0[/itex], and [itex]\langle\xi(t_1)\xi(t_2)\rangle=\delta(t_1-t_2)[/itex]. I also know that [itex]\xi(t)[/itex] obeys Gaussian statistics at all times.

Homework Equations


Those given above.

The Attempt at a Solution


All I could think of doing was expanding the exponential in a power series, which gives me:
[tex]1+\frac{1}{2}x+\frac{1}{3!}\int_{0}^{x}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\xi(t_{3})\rangle dt_{1}dt_{2}dt_{3}+...[/tex]
But I have no idea how to deal with the correlations of more than two of the noise terms. I am new to stochastic processes, so sorry if this is a silly question, but any help and/or guidance would be much appreciated.

By the way, feel free to move this to the maths section if it would suit better there, I wasn't sure which it suited better.
I interpret ##\langle y \rangle## as
##\int_{-\infty}^{+\infty}\psi^*(x) y(x)\psi(x)dx##
where ##f(x) = \psi^*(x)\psi(x)## is a probability density function.

You seem to have left out the probability density function for x in your solution.

I am not sure that this is the problem you are really trying to solve. It would help to see the whole problem and your solution so far.
 
  • #3
Hi, thanks for your reply. I'll try to give the problem a bit more fully.
I'm considering the stochastic differential equation
[tex]\frac{dy}{dx}=y(x)\xi(x)[/tex]
where, as I mentioned above, [itex]\xi(t)[/itex] is a stochastic variable obeying stationary Gaussian statistics.
Trivially, this can be integrated to yield:
[tex]y(t)=y(0)e^{\int_{0}^{x}\xi(t)dt}[/tex]
I then want to find the ensemble average, [itex]\langle y(t)\rangle[/itex], which leads to my previous question. (Incidentally, if there is another way to find this without the power series expansion, that would be great!)
I've loosely been following this document (specifically page 2), and they use the same approach I have. They claim that given [itex]\langle \xi(t) \rangle =0[/itex] and [itex]\langle \xi(t_1)\xi(t_2) \rangle = \delta(t_1-t_2)[/itex], we have that [itex]\langle \xi(t_1)\xi(t_2)...\xi(t_{2n+1}) \rangle=0[/itex] and [itex]\langle \xi(t_1)\xi(t_2)...\xi(t_{2n})\rangle =\sum_{Permutations}\delta(t_{i_{1}}-t_{i_{2}})\delta(t_{i_{3}}-t_{i_{4}})...\delta(t_{i_{2n-1}}-t_{i_{2n}})[/itex].
The document I linked gives the result without proof, and I've been unable to find a proof elsewhere online. Wikipedia did state that for a stationary Gaussian process the behaviour is determined by the second-order behaviour (which is, I guess, what we're seeing here), but it didn't show exactly how (at least not in physicist's language I can understand!).

Given this result, however, the ensemble average of the power series terms can be found and the series summed exactly to give the result I'm after, but I have no idea how to find the ensemble average of more than two [itex]\xi(t)[/itex] terms. In fact, now I come to think of it, I don't really know exactly what [itex]\langle \xi(t)\rangle[/itex] or [itex]\langle \xi(t_1)\xi(t_2) \rangle[/itex] means in a strict mathematical sense.
I hope this clarifies my problem a little, and thanks again for your response.
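For what it's worth, the quoted pairing formula is enough to close the series. A sketch, taking the stated ##2n##-point result for granted: there are ##(2n-1)!! = (2n)!/(2^n n!)## ways to pair ##2n## factors, and each pairing of deltas collapses the integrals to ##x^n##, since ##\int_0^x\!\int_0^x \delta(t_1-t_2)\,dt_1\,dt_2 = x##. Hence

```latex
\left\langle e^{\int_0^x \xi(t)\,dt} \right\rangle
  = \sum_{n=0}^{\infty} \frac{1}{(2n)!}
    \int_0^x \!\cdots\! \int_0^x
    \langle \xi(t_1)\cdots\xi(t_{2n}) \rangle \, dt_1 \cdots dt_{2n}
  = \sum_{n=0}^{\infty} \frac{1}{(2n)!}\,\frac{(2n)!}{2^n n!}\, x^n
  = \sum_{n=0}^{\infty} \frac{(x/2)^n}{n!}
  = e^{x/2}.
```

The ##n=1## term reproduces the ##\tfrac{1}{2}x## already found in the original post.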
 
Last edited:
  • #4
JohnFrum said:
Hi, thanks for your reply. I'll try to give the problem a bit more fully.
I'm considering the stochastic differential equation
[tex]\frac{dy}{dx}=y(x)\xi(x)[/tex]
where, as I mentioned above, [itex]\xi(t)[/itex] is a stochastic variable obeying stationary Gaussian statistics.
Trivially, this can be integrated to yield:
[tex]y(t)=y(0)e^{\int_{0}^{x}\xi(t)dt}[/tex]
I then want to find the ensemble average, [itex]\langle y(t)\rangle[/itex], which leads to my previous question. (Incidentally, if there is another way to find this without the power series expansion, that would be great!)
I've loosely been following this document (specifically page 2), and they use the same approach I have. They claim that given [itex]\langle \xi(t) \rangle =0[/itex] and [itex]\langle \xi(t_1)\xi(t_2) \rangle = \delta(t_1-t_2)[/itex], we have that [itex]\langle \xi(t_1)\xi(t_2)...\xi(t_{2n+1}) \rangle=0[/itex] and [itex]\langle \xi(t_1)\xi(t_2)...\xi(t_{2n})\rangle =\sum_{Permutations}\delta(t_{i_{1}}-t_{i_{2}})\delta(t_{i_{3}}-t_{i_{4}})...\delta(t_{i_{2n-1}}-t_{i_{2n}})[/itex].
The document I linked gives the result without proof, and I've been unable to find a proof elsewhere online. Wikipedia did state that for a stationary Gaussian process the behaviour is determined by the second-order behaviour (which is, I guess, what we're seeing here), but it didn't show exactly how (at least not in physicist's language I can understand!).

Given this result, however, the ensemble average of the power series terms can be found and the series summed exactly to give the result I'm after, but I have no idea how to find the ensemble average of more than two [itex]\xi(t)[/itex] terms. In fact, now I come to think of it, I don't really know exactly what [itex]\langle \xi(t)\rangle[/itex] or [itex]\langle \xi(t_1)\xi(t_2) \rangle[/itex] means in a strict mathematical sense.
I hope this clarifies my problem a little, and thanks again for your response.
I think you mean ##y(x)=y(0)e^{\int_{0}^{x}\xi(t)dt}##

##\langle \xi(t)\rangle = \int_{-\infty}^{\infty} \xi f(\xi) d\xi## is the mean of ##\xi## at time ##t##. Since you are looking at a stationary process, the mean does not depend on ##t##. You seem to be thinking of this as an ensemble average, but I am not sure that is correct.

Your document calls ##\langle \xi(t_1)\xi(t_2) \rangle## the correlation function, but it looks to me like it should be the covariance ##C(t_1,t_2) = E\{[\xi(t_1)-\langle \xi(t_1)\rangle] [\xi(t_2)-\langle \xi(t_2)\rangle]\}##. This varies depending on the stochastic process you are modeling, and you will have to decide on one to get any further in your calculations. Look up the Wiener process or the Ornstein-Uhlenbeck process for examples.

Keep reading to page 3. It gives a simpler solution than the intermediate one you are looking at.
 
Last edited:
  • #5
JohnFrum said:

Homework Statement


I'm working on a process similar to geometric Brownian motion (a process with multiplicative noise), and I need to calculate the following expectation/mean:
[tex]\langle y \rangle=\langle e^{\int_{0}^{x}\xi(t)dt}\rangle[/tex]
Where [itex]\xi(t)[/itex] is delta-correlated so that [itex]\langle\xi(t)\rangle=0[/itex], and [itex]\langle\xi(t_1)\xi(t_2)\rangle=\delta(t_1-t_2)[/itex]. I also know that [itex]\xi(t)[/itex] obeys Gaussian statistics at all times.

Homework Equations


Those given above.

The Attempt at a Solution


All I could think of doing was expanding the exponential in a power series, which gives me:
[tex]1+\frac{1}{2}x+\frac{1}{3!}\int_{0}^{x}\int_{0}^{x}\int_{0}^{x}\langle\xi(t_{1})\xi(t_{2})\xi(t_{3})\rangle dt_{1}dt_{2}dt_{3}+...[/tex]
But I have no idea how to deal with the correlations of more than two of the noise terms. I am new to stochastic processes, so sorry if this is a silly question, but any help and/or guidance would be much appreciated.

By the way, feel free to move this to the maths section if it would suit better there, I wasn't sure which it suited better.

The random variable ##Y(t) = \exp(\int_0^t \xi(\tau) \, d\tau)## has a lognormal distribution. It is essentially elementary to obtain the mean and variance of a lognormal R.V. in terms of the mean and variance of the underlying normal. Google "lognormal distribution".
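For reference, the lognormal mean follows by completing the square in the Gaussian integral: if ##Z \sim N(\mu,\sigma^2)##, then

```latex
\langle e^{Z} \rangle
  = \int_{-\infty}^{\infty} \frac{e^{z}}{\sqrt{2\pi\sigma^2}}\,
    e^{-(z-\mu)^2 / 2\sigma^2}\, dz
  = e^{\mu + \sigma^2/2}.
```

So a zero-mean Gaussian exponent with variance ##\sigma^2 = x## would give ##\langle y \rangle = e^{x/2}##.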
 
  • #6
Ray Vickson said:
The random variable ##Y(t) = \exp(\int_0^t \xi(\tau) \, d\tau)## has a lognormal distribution. It is essentially elementary to obtain the mean and variance of a lognormal R.V. in terms of the mean and variance of the underlying normal. Google "lognormal distribution".
##\exp(\int_0^t \xi(\tau) \, d\tau)## would have a lognormal distribution if ##\int_0^t \xi(\tau) \, d\tau## were a Gaussian random variable. That would be true if ##\xi(t)## were a Wiener process. Is it true for stationary processes in general?
 
  • #7
tnich said:
##\exp(\int_0^t \xi(\tau) \, d\tau)## would have a lognormal distribution if ##\int_0^t \xi(\tau) \, d\tau## were a Gaussian random variable. That would be true if ##\xi(t)## were a Wiener process. Is it true for stationary processes in general?

Think of a sum instead of an integral: we have a sum of normally-distributed random variables. While the summands are dependent (being correlated), they are part of a large multivariate normal vector ##(\xi(t_1), \xi(t_2), \ldots, \xi(t_n))##, and so the sum ##\xi(t_1) + \xi(t_2) + \cdots + \xi(t_n)## is itself normal (as is any constant-coefficient sum of the form ##a_1 \xi(t_1) + a_2 \xi(t_2) + \cdots + a_n \xi(t_n)##).
See, eg., http://www.maths.manchester.ac.uk/~mkt/MT3732%20(MVA)/Notes/MVA_Section3.pdf
or https://brilliant.org/wiki/multivariate-normal-distribution/
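That closure property is easy to check numerically. A sketch, using an arbitrary illustrative covariance matrix and coefficient vector (neither comes from the thread):

```python
import numpy as np

# Any constant-coefficient sum of jointly Gaussian variables is again
# Gaussian, with mean a.mu and variance a^T Sigma a. Here we compare
# the sample variance of the sum against the predicted a^T Sigma a.
rng = np.random.default_rng(1)
mean = np.zeros(3)
cov = np.array([[1.0, 0.6, 0.2],
                [0.6, 1.0, 0.4],
                [0.2, 0.4, 1.0]])   # arbitrary positive-definite choice
a = np.array([1.0, -2.0, 0.5])      # arbitrary coefficients

samples = rng.multivariate_normal(mean, cov, size=200_000)
s = samples @ a                     # a1*xi1 + a2*xi2 + a3*xi3 per sample
predicted_var = a @ cov @ a         # variance of the linear combination

print(s.var(), predicted_var)
```

The sample variance of the combination matches ##a^{\mathsf{T}}\Sigma a##, as the multivariate-normal references above describe.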
 
Last edited:
  • #8
tnich said:
I think you mean ##y(x)=y(0)e^{\int_{0}^{x}\xi(t)dt}##

##\langle \xi(t)\rangle = \int_{-\infty}^{\infty} \xi f(\xi) d\xi## is the mean of ##\xi## at time ##t##. Since you are looking at a stationary process, the mean does not depend on ##t##. You seem to be thinking of this as an ensemble average, but I am not sure that is correct.

Your document calls ##\langle \xi(t_1)\xi(t_2) \rangle## the correlation function, but it looks to me like it should be the covariance ##C(t_1,t_2) = E\{[\xi(t_1)-\langle \xi(t_1)\rangle] [\xi(t_2)-\langle \xi(t_2)\rangle]\}##. This varies depending on the stochastic process you are modeling, and you will have to decide on one to get any further in your calculations. Look up the Wiener process or the Ornstein-Uhlenbeck process for examples.

Keep reading to page 3. It gives a simpler solution than the intermediate one you are looking at.

You're correct about it being [itex]y(x)[/itex] rather than [itex]y(t)[/itex]; sorry for the typo.
[itex]\langle y(x) \rangle[/itex] is supposed to be the mean. Sorry, I know it sounds silly, but I didn't realize the mean and ensemble average were different until you mentioned it. (I am really very new to this sort of thing.)
After looking into it I think that you're right and ##\langle \xi(t_1)\xi(t_2) \rangle## is supposed to be the covariance function, but then what would, for example, ##\langle \xi(t_1)\xi(t_2)\xi(t_3)\rangle## be?
As stated, ##\xi(t)## is a delta-correlated Gaussian process, which I think is equivalent to a Wiener process, correct? This would give us the PDF ##p(x,t)=\frac{1}{\sqrt{2\pi t}}e^{-x^2/2t}## for ##\xi(t)##.
I did have a look at page 3 and onwards of the document, and the simpler solution they give is what I'm after, but it seems that to obtain it I need the expression they used for the ##\langle \xi(t_1)\xi(t_2)\xi(t_3)\dots\xi(t_n)\rangle## terms, which I have no idea how to obtain.
Thanks again for your valuable reply and your help, and sorry that lots of my terminology is a bit vague; I'm still very new to statistical mechanics and probability theory in general.

Ray Vickson said:
The random variable ##Y(t) = \exp(\int_0^t \xi(\tau) \, d\tau)## has a lognormal distribution. It is essentially elementary to obtain the mean and variance of a lognormal R.V. in terms of the mean and variance of the underlying normal. Google "lognormal distribution".
This is a really clever idea, thanks.
 
  • #9
JohnFrum said:
As stated, ##\xi(t)## is a delta correlated Gaussian process, which I think is equivalent to a Wiener process, correct? This would give us the PDF ##p(x,t)=\frac{1}{\sqrt{2\pi t}}e^{-x^2/2t}## for ##\xi(t)##.
Yes: ##\xi(t)## is Gaussian white noise, whose integral is a Wiener process, so you can use properties of the Wiener process, one of which is ##\int_{0}^{x}\xi(t)dt \sim N(0, x)##. That is a little different from what you have stated above. So now you can use @RayVickson's idea directly without calculating any covariances.
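Putting those two facts together gives the answer directly. A sketch with arbitrary example values ##y(0)=1.5## and ##x=2## (neither taken from the thread):

```python
import numpy as np

# The integral of the noise over [0, x] is N(0, x), so
# y = y0 * exp(N(0, x)) is lognormal and <y> = y0 * exp(x/2)
# by the lognormal mean formula.
rng = np.random.default_rng(2)
x, y0 = 2.0, 1.5
z = rng.normal(0.0, np.sqrt(x), size=500_000)   # samples of the integral
mc_mean = (y0 * np.exp(z)).mean()               # Monte Carlo estimate of <y>
exact = y0 * np.exp(x / 2.0)                    # closed form y0 * e^{x/2}

print(mc_mean, exact)
```

The two numbers agree to Monte Carlo accuracy, consistent with ##\langle y(x)\rangle = y(0)e^{x/2}##.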
 

1. What does the expectation value of a stochastic quantity represent?

The expectation value of a stochastic quantity represents the average value that would be obtained from multiple measurements of that quantity in a given system.

2. How is the expectation value calculated?

The expectation value is calculated by multiplying each possible value of the stochastic quantity by its corresponding probability and then summing all of these products.

3. What is the significance of the expectation value in probability theory?

The expectation value is a fundamental concept in probability theory, as it represents the long-run average outcome of a random event or process; it is not necessarily the single most likely outcome.

4. Can the expectation value be negative?

Yes, the expectation value can be negative if the stochastic quantity can take negative values and these outweigh the positive ones on average. Probabilities themselves are always non-negative; it is the values of the quantity that may be negative.

5. How is the expectation value related to the variance?

The variance is a measure of the spread of the possible values of a stochastic quantity. It is related to the expectation value by the formula: variance = expectation value of (stochastic quantity)^2 - (expectation value of stochastic quantity)^2.
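The two formulas above can be checked on a small discrete distribution (the values and probabilities below are an arbitrary illustrative choice):

```python
import numpy as np

# Expectation: sum of value * probability.
# Variance: E[X^2] - (E[X])^2, per the formula above.
values = np.array([-1.0, 0.0, 2.0])
probs = np.array([0.2, 0.5, 0.3])    # must sum to 1

mean = (values * probs).sum()                 # E[X]   = 0.4
second_moment = (values**2 * probs).sum()     # E[X^2] = 1.4
variance = second_moment - mean**2            # 1.4 - 0.16 = 1.24

print(mean, variance)
```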
