Question on problem 7 on July Challenge

  • Thread starter BWV

BWV

Summary: Question on problem 7 on July Challenge
Trying to follow and learn from the solution and did not want to clutter up the original thread

Fubini allows us to change order of integration, so we get
[tex]
\mathbb EX = \mathbb E \left ( \int _0^t W_s^2ds\right ) = \int _0^t \mathbb E(W_s^2)ds = \int_0^t sds = \frac{t^2}{2}
[/tex]
My naive question is why doesn't Jensen's Inequality prevent this step?

You are swapping the expectation of a function for the function applied to the expectation; according to the inequality, the two expressions are equal only if the function is linear, which ##x \mapsto x^2## is not.
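As a sanity check on the Fubini computation above, here is a quick Monte Carlo sketch (my own illustration, not from the challenge thread): simulate Brownian paths on a time grid, form the Riemann sum of ##W_s^2## over ##[0, t]## on each path, and average over paths. The grid size, path count, and seed are arbitrary choices.

```python
import numpy as np

# Monte Carlo sketch: estimate E[ int_0^t W_s^2 ds ] and compare it
# with the claimed value t^2 / 2. Grid size, path count and seed are
# arbitrary illustration choices, not from the thread.
rng = np.random.default_rng(0)
t, n_steps, n_paths = 1.0, 400, 5000
dt = t / n_steps

# Brownian increments are N(0, dt); cumulative sums give W on the grid.
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

# Riemann sum of W_s^2 over [0, t] for each path, then average over paths.
X = (W ** 2).sum(axis=1) * dt
estimate = X.mean()

print(estimate)  # should land near t^2 / 2 = 0.5
```

The sample mean fluctuates from seed to seed (the random variable ##\int_0^1 W_s^2\, ds## has variance ##1/3##), but with 5000 paths the estimate sits close to 0.5, consistent with the ##t^2/2## answer.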
 

Math_QED

Science Advisor
Homework Helper
My naive question is why doesn't Jensen's Inequality prevent this step?


You are swapping the expectation of a function for the function applied to the expectation; according to the inequality, the two expressions are equal only if the function is linear, which ##x \mapsto x^2## is not.
I don't really understand your question. What is ##\varphi## here? Is it ##\varphi(x) = x^2##? In that case, Jensen says that

$$E\left[\left(\int_0^t W_s^2 ds\right)^2\right]= E[X^2] \geq (EX)^2 = \left(E\int_0^t W_s^2 ds\right)^2$$

and I don't see how that contradicts the statement you mention.
 
Are you sure this is correct? By definition, if ##Y## is a random variable on a probability space ##(\Omega, \mathcal{F}, P)##, then


$$E(Y) := \int_\Omega Y(\omega)\, P(d\omega)$$
Agreed, my initial post didn't make much sense at all. Post-edit it should be ok.

I don't really understand your question. What is ##\varphi## here?
My guess is he assumed I used Jensen's inequality to arrive at whatever I arrived at; the [itex]\varphi[/itex] is supposed to be a convex map, though which one, I can't say.
 
Now it is correct, though I'm not sure if this integral exists if and only if ##\int_\Omega Y dP## exists. It is definitely true that if the expectation exists and is finite, then your integral exists and is equal to the expectation.
Right right. The equality holds precisely when [itex]Y[/itex] has a probability density function. My probability theory has several layers of rust on it by now :(
 

Math_QED

Science Advisor
Homework Helper
Right right. The equality holds precisely when [itex]Y[/itex] has a probability density function. My probability theory has several layers of rust on it by now :(
Now that I think about it, my previous reply was incorrect. Isn't the correct formula

##E(Y) = \int_{\mathbb{R}} yf_Y(y) dy##

where ##f_Y## is a density function of ##Y## (assuming it exists!)? The formula you listed in post #3 can't be correct, for the simple reason that ##Y(s)## with ##s \in \mathbb{R}## does not make sense: ##Y## is not necessarily defined on the real numbers. I suggest you edit the post with the correct definition ##EY = \int_\Omega Y dP## and then also the rest of the answer.

Indeed, using this definition

$$E(X) = \int_\Omega X dP = \int_\Omega \left(\int_0^t W_s^2 ds\right) dP = \int_0^t \left( \int_\Omega W_s^2 dP\right) ds = \int_0^t E[W_s^2] ds$$

and the third equality follows by Fubini.
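The finite analogue of that Fubini step is just swapping the order of a double sum, which may make the interchange feel less mysterious. A toy check (my own illustration; the random matrix stands in for the map ##(\omega, s) \mapsto W_s^2(\omega)##):

```python
import numpy as np

# Finite Fubini: for a table of values indexed by (omega, s), summing
# over omega first or over s first gives the same total. Rows play the
# role of sample points omega, columns the time grid s.
rng = np.random.default_rng(1)
table = rng.normal(size=(6, 4))

sum_omega_first = table.sum(axis=0).sum()  # "integrate" over omega, then s
sum_s_first = table.sum(axis=1).sum()      # "integrate" over s, then omega

print(np.isclose(sum_omega_first, sum_s_first))  # True
```

The measure-theoretic theorem is exactly the statement that this interchange survives the passage to infinite (sigma-finite) index sets, given integrability or non-negativity.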
 
@Math_QED Agreed again. My technique is garbage and sloppy. Will try to mend my ways.
 

BWV

I don't really understand your question. What is ##\varphi## here? Is it ##\varphi(x) = x^2##? In that case, Jensen says that

$$E\left[\left(\int_0^t W_s^2 ds\right)^2\right]= E[X^2] \geq (EX)^2 = \left(E\int_0^t W_s^2 ds\right)^2$$

and I don't see how that contradicts the statement you mention.
Ok forget it then, thought I could learn something but too far outside my pay grade
 

Math_QED

Science Advisor
Homework Helper
Ok forget it then, thought I could learn something but too far outside my pay grade
Please do ask. I'm just trying to understand your confusion so I can give you my best answer :).
 

BWV

The question was basic - why doesn't Jensen apply when moving the expectation inside / outside of the integral?

I'm sure my confusion stems from what is getting integrated over what: the 'function' being moved here is an integral over time (and since ##E[W_s^2] = s##, time here is also the variance).

Not sure how to interpret integrating ##X## over ##t##; I have some experience with stochastic calculus, but it's always derivatives of ##X##, as in Itô's Lemma.
 

StoneTemplePython

Science Advisor
Gold Member
Ok forget it then, thought I could learn something but too far outside my pay grade
maybe the way to think of it is define ##g## by
##g(W_s)= W_s^2##

so ##g## is strictly convex but

##h_t(X) = \int_0^t X_s \, ds##
is linear -- i.e. homogeneous with respect to scaling and additive. Linear functions are both convex and concave.


So the problem is

##E\big[ h_t(g(W_s))\big]= h_t(E\big[g(W_s))\big]##
by linearity (subject to convergence considerations -- I'm fairly confident you could also justify the interchange via dominated convergence)

- - - -
what Jensen says is
## h_t\Big(E\big[g(W_s)\big]\Big) \gt h_t\Big(g(E\big[W_s\big])\Big)##
because ##g## is strictly convex (and ##W_s## isn't constant with probability 1)

or more simply, for any ##s \gt 0##
##E\big[g(W_s)\big] \gt g(E\big[W_s\big])##
then we sum or integrate over this point-wise bound to recover the above
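A quick numeric check of that point-wise bound (my own sketch; the value ##s = 0.7## and the sample size are arbitrary): draw ##W_s \sim N(0, s)## and compare ##E[g(W_s)]## against ##g(E[W_s])##.

```python
import numpy as np

# Pointwise Jensen check with g(x) = x^2: E[W_s^2] = s, while
# (E[W_s])^2 = 0, so the strict inequality is visible directly.
rng = np.random.default_rng(2)
s = 0.7
samples = rng.normal(0.0, np.sqrt(s), size=200_000)

lhs = np.mean(samples ** 2)   # estimates E[g(W_s)], i.e. s
rhs = np.mean(samples) ** 2   # estimates g(E[W_s]), i.e. 0

print(lhs > rhs)  # True: strict, since W_s is not a.s. constant
```

The gap ##lhs - rhs \approx s## is exactly the variance of ##W_s##, which is the standard "Jensen gap" for the squaring map.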
 

BWV

Thanks - so because integration is linear, it meets the equality condition of Jensen's inequality and can be moved inside or outside of the expectation operator
 

StoneTemplePython

Science Advisor
Gold Member
Thanks - so because integration is linear, it meets an equality condition of Jensen & it can be moved in or outside of the expectation operator
if we don't worry too much about convergence issues then I suppose you can say this. The more basic idea is Linearity of Expectations. (Then bring up Jensen for functions that aren't linear/affine but are known to be convex or concave)

I'm told that Brownian motion is Riemann integrable, and in any case an important way to think about very complicated objects is to work with countable sets when possible (not always possible, but you should try), and then to replace countable sets with finite ones wherever possible.

So write out the Riemann sum, which I assume converges to the integral:

##S_n := \frac{t}{n}\sum_{i=1}^n g(W_{\frac{it}{n}}) \approx \int_{0}^t g(W_s) ds##

and by linearity of expectations, for any natural number ##n##, you always have

##E\big[S_n\big] = E\big[\frac{t}{n}\sum_{i=1}^n g(W_{\frac{it}{n}}) \big] = \frac{t}{n}\sum_{i=1}^n E\big[g(W_{\frac{it}{n}}) \big]##
(this always holds, irrespective of whether the terms are independent -- something a lot of people stumble on).

The convergence issue that can come up is that when you pass limits, in general
##\lim_{n \to \infty} E\big[X_n\big] \neq E\big[\lim_{n \to \infty} X_n\big]##
though dominated convergence can frequently be used to alleviate situations like this.
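To see how little randomness the expectation of the Riemann sum actually involves: once linearity has pushed ##E## inside, each term is just ##E[W_{it/n}^2] = it/n##, so ##E[S_n]## can be computed exactly, with no simulation at all (a sketch assuming cell width ##t/n##):

```python
# Exact E[S_n] by linearity of expectation: since E[W_u^2] = u,
# E[S_n] = (t/n) * sum_{i=1}^n (i t / n) = t^2 (n + 1) / (2 n),
# which tends to t^2 / 2 as n grows.
t = 2.0

def expected_riemann_sum(n: int, t: float) -> float:
    return (t / n) * sum(i * t / n for i in range(1, n + 1))

for n in (10, 100, 1000):
    print(n, expected_riemann_sum(n, t))  # approaches t^2 / 2 = 2.0
```

For ##t = 2## the values are 2.2, 2.02, 2.002: the deterministic limit ##t^2/2##, with the ##(n+1)/n## factor as the only discretization error.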
 
