Question on problem 7 on July Challenge

  • Context: Undergrad
  • Thread starter: BWV
  • Tags: Challenge
SUMMARY

The discussion centers on the application of Jensen's Inequality in stochastic calculus, specifically regarding the expectation of the integral of the square of a Brownian motion, \( W_s^2 \). Participants clarify that while Jensen's Inequality constrains non-linear functions, the step in question only interchanges the expectation with a time integral, and integration is linear. The key takeaway is that the equality condition of Jensen's Inequality is met for linear maps, so the integral can be moved inside or outside the expectation operator without violating the inequality.

PREREQUISITES
  • Understanding of Brownian motion and its properties
  • Familiarity with stochastic calculus concepts
  • Knowledge of Jensen's Inequality and its applications
  • Basic principles of integration and expectation in probability theory
NEXT STEPS
  • Study the properties of Brownian motion and its integrability
  • Learn about the application of Jensen's Inequality in various contexts
  • Explore the concept of linearity of expectations in probability theory
  • Investigate the Dominated Convergence Theorem and its implications in stochastic calculus
USEFUL FOR

Mathematicians, statisticians, and students of stochastic calculus seeking to deepen their understanding of expectations, integrals, and the application of Jensen's Inequality in probability theory.

BWV
Trying to follow and learn from the solution and did not want to clutter up the original thread

nuuskur said:
Fubini allows us to change order of integration, so we get
$$\mathbb{E}X = \mathbb{E} \left( \int_0^t W_s^2\, ds \right) = \int_0^t \mathbb{E}(W_s^2)\, ds = \int_0^t s\, ds = \frac{t^2}{2}$$

My naive question is why doesn't Jensen's Inequality prevent this step?



You are swapping the expectation of a function for the function applied to the expectation; according to the inequality, the two expressions are equal only if the function is linear, which ##W^2## is not.
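As a sanity check on the quoted computation, here is a minimal Monte Carlo sketch in Python that estimates ##E\left[\int_0^t W_s^2\, ds\right]## and recovers ##t^2/2##. All parameter values (step counts, path counts, seed) are illustrative choices, not part of the thread.

```python
import numpy as np

rng = np.random.default_rng(0)
t, n_steps, n_paths = 1.0, 1000, 20000
dt = t / n_steps

# Brownian paths on [0, t]: W_0 = 0, independent N(0, dt) increments
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

# X = integral_0^t W_s^2 ds, approximated by a Riemann sum along each path
X = (W**2).sum(axis=1) * dt

print(X.mean())  # should be close to t^2 / 2 = 0.5
```

The sample mean lands near ##0.5## even though ##x \mapsto x^2## is convex, which is the point of the thread: the interchange happens across the time integral, not through the square.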
 
It's @Math_QED's question. Let's ask him.
 
BWV said:
Summary: Question on problem 7 on July Challenge

Trying to follow and learn from the solution and did not want to clutter up the original thread
My naive question is why doesn't Jensen's Inequality prevent this step?


You are swapping the expectation of a function for the function applied to the expectation; according to the inequality, the two expressions are equal only if the function is linear, which ##W^2## is not.

I don't really understand your question. What is ##\varphi## here? Is it ##\varphi(x) = x^2##? In that case, Jensen says that

$$E\left[\left(\int_0^t W_s^2\, ds\right)^2\right] = E[X^2] \geq (EX)^2 = \left(E\int_0^t W_s^2\, ds\right)^2$$

and I don't see how that contradicts the statement you mention.
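What Jensen does guarantee here, namely ##E[X^2] \geq (EX)^2## for ##X = \int_0^t W_s^2\, ds##, can be checked numerically. A minimal sketch (all parameter values are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)
t, n_steps, n_paths = 1.0, 500, 20000
dt = t / n_steps

# Brownian paths; X = integral_0^t W_s^2 ds along each path
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
X = (W**2).sum(axis=1) * dt

e_x2 = np.mean(X**2)    # E[X^2]
ex_2 = np.mean(X)**2    # (E X)^2
print(e_x2, ex_2)       # first is >= second, as Jensen requires
```

Note the sample version of the inequality holds exactly, since ##\overline{X^2} - \bar{X}^2## is a sample variance and hence non-negative.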
 
Math_QED said:
Are you sure this is correct? By definition, if ##Y## is a random variable on a probability space ##(\Omega, \mathcal{F}, P)##, then

$$E(Y) := \int_\Omega Y(\omega)\, P(d\omega)$$
Agreed, my initial post didn't make much sense at all. Post-edit it should be ok.

Math_QED said:
I don't really understand your question. What is ##\varphi## here?
My guess is he assumed I used Jensen's inequality to arrive at whatever I arrived at; the ##\varphi## is supposed to be a convex map, though which one I can't say.
 
Math_QED said:
Now it is correct, though I'm not sure if this integral exists if and only if ##\int_\Omega Y dP## exists. It is definitely true that if the expectation exists and is finite, then your integral exists and is equal to the expectation.
Right right. The equality holds precisely when Y has a probability density function. My probability theory has several layers of rust on it by now :(
 
nuuskur said:
Right right. The equality holds precisely when Y has a probability density function. My probability theory has several layers of rust on it by now :(

Now that I think about it, my previous reply was incorrect. Isn't the correct formula

##E(Y) = \int_{\mathbb{R}} yf_Y(y) dy##

where ##f_Y## is a density function of ##Y## (assuming it exists!). The formula you listed in post #3 can't be correct for the simple reason that ##Y(s)## with ##s \in \mathbb{R}## does not make sense because ##Y## is not necessarily defined on the real numbers. I suggest you edit the post with the correct definition ##EY = \int_\Omega Y dP## and then also the rest of the answer.

Indeed, using this definition

$$E(X) = \int_\Omega X dP = \int_\Omega \left(\int_0^t W_s^2 ds\right) dP = \int_0^t \left( \int_\Omega W_s^2 dP\right) ds = \int_0^t E[W_s^2] ds$$

and the third equality follows by Fubini.
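At the level of a discretized simulation, the Fubini step above is just a reordering of a finite double sum over (path, time), so the two orders agree up to floating-point rounding. A minimal sketch (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t, n_steps, n_paths = 1.0, 500, 10000
dt = t / n_steps
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)

sq = W**2
# E[ integral_0^t W_s^2 ds ]: integrate each path over time, then average over omega
lhs = sq.sum(axis=1).mean() * dt
# integral_0^t E[W_s^2] ds: average over omega at each time, then integrate over time
rhs = sq.mean(axis=0).sum() * dt

print(lhs, rhs)  # identical up to floating-point rounding
```

Fubini's theorem is what licenses the same reordering in the continuum limit, where the sums become integrals over ##\Omega## and ##[0, t]##.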
 
@Math_QED Agreed again. My technique is garbage and sloppy. Will try to mend my ways.
 
Math_QED said:
I don't really understand your question. What is ##\varphi## here? Is it ##\varphi(x) = x^2##? In that case, Jensen says that

$$E\left[\left(\int_0^t W_s^2\, ds\right)^2\right] = E[X^2] \geq (EX)^2 = \left(E\int_0^t W_s^2\, ds\right)^2$$

and I don't see how that contradicts the statement you mention.
Ok, forget it then; I thought I could learn something, but it's too far outside my pay grade.
 
BWV said:
Ok, forget it then; I thought I could learn something, but it's too far outside my pay grade.

Please do ask. I'm just trying to understand your confusion so I can give you my best answer :).
 
The question was basic: why doesn't Jensen apply when moving the expectation inside/outside of the integral?

I'm sure my confusion stems from what is getting integrated over what. The 'function' being moved here is an integral over time (and for ##W_s^2## the expectation at time ##s## is the variance, ##s##).

I'm not sure how to interpret integrating ##X## over ##t##; I have some experience with stochastic calculus, but it's always with differentials of ##X##, as in Ito's Lemma.
 
BWV said:
Ok, forget it then; I thought I could learn something, but it's too far outside my pay grade.

Maybe the way to think of it is: define ##g## by
##g(W_s) = W_s^2##

so ##g## is strictly convex, but

##h_t(X) = \int_0^t X_s\, ds##
is linear, i.e. homogeneous with respect to scaling and additive. Linear functions are both convex and concave. So the point is

##E\big[ h_t(g(W_s))\big] = h_t\big(E\big[g(W_s)\big]\big)##
by linearity (subject to convergence considerations; I'm fairly confident you could justify the interchange via dominated convergence as well).

- - - -
what Jensen says is
## h_t\Big(E\big[g(W_s)\big]\Big) \gt h_t\Big(g(E\big[W_s\big])\Big)##
because ##g## is strictly convex (and ##W_s## isn't constant with probability 1)

or more simply, for any ##s \gt 0##
##E\big[g(W_s)\big] \gt g(E\big[W_s\big])##
then we sum or integrate this pointwise bound to recover the above.
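The strict pointwise inequality ##E[g(W_s)] > g(E[W_s])## for fixed ##s > 0## is easy to see numerically with ##g(x) = x^2##; since ##W_s \sim N(0, s)##, the left side is ##s## while the right side is ##0##. A minimal sketch (the value of ##s## and the sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
s, n = 0.7, 200000                           # a fixed time s > 0
W_s = rng.normal(0.0, np.sqrt(s), size=n)    # W_s ~ N(0, s)

g = lambda x: x**2                           # strictly convex
lhs = g(W_s).mean()                          # E[g(W_s)], close to s
rhs = g(W_s.mean())                          # g(E[W_s]), close to 0
print(lhs, rhs)
```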
 
Thanks. So because integration is linear, it meets the equality condition of Jensen's inequality, and the expectation operator can be moved inside or outside the integral.
 
BWV said:
Thanks. So because integration is linear, it meets the equality condition of Jensen's inequality, and the expectation operator can be moved inside or outside the integral.

If we don't worry too much about convergence issues, then I suppose you can say this. The more basic idea is linearity of expectations. (Then bring up Jensen for functions that aren't linear/affine but are known to be convex or concave.)

I'm told that Brownian motion is Riemann integrable, and in any case an important way to think about very complicated objects is to work with countable sets when possible (not always possible, but you should try), and then to replace countable sets with finite ones wherever possible.

So write out a Riemann sum for the integral (which I assume exists),

##S_n := \frac{t}{n}\sum_{i=1}^n g(W_{\frac{it}{n}}) \approx \int_{0}^t g(W_s)\, ds##

and by linearity of expectations, for any natural number ##n##, you always have

##E\big[S_n\big] = E\big[\frac{t}{n}\sum_{i=1}^n g(W_{\frac{it}{n}}) \big] = \frac{t}{n}\sum_{i=1}^n E\big[g(W_{\frac{it}{n}}) \big]##
(this always holds, irrespective of whether they are independent, something a lot of people stumble on.)

The convergence issue that can come up is that when you pass limits, in general
##\lim_{n \to \infty} E\big[X_n\big] \neq E\big[\lim_{n \to \infty} X_n\big]##
though dominated convergence can frequently be used to alleviate situations like this.
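The finite-##n## linearity identity for ##E[S_n]## holds regardless of the strong dependence between the grid values ##W_{it/n}##, and in a discretized simulation it is exact up to rounding, being just a reordering of sums. A minimal sketch with ##t = 1## (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
t, n, n_paths = 1.0, 50, 100000
# W at grid points it/n, i = 1..n; the n values on each path are highly dependent
W = np.cumsum(rng.normal(0.0, np.sqrt(t / n), size=(n_paths, n)), axis=1)

g = lambda x: x**2
S_n = (t / n) * g(W).sum(axis=1)             # Riemann sum S_n per path

lhs = S_n.mean()                             # E[S_n]
rhs = (t / n) * g(W).mean(axis=0).sum()      # average the expectations E[g(W_{it/n})]
print(lhs, rhs)  # agree up to rounding, despite the dependence
```

The convergence question in the post (passing ##n \to \infty## inside the expectation) is the only place where anything beyond this finite reordering is needed.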
 
