BWV
#7
Isn't #7 just the variance of W?
Var(W_t)=E[W_t^2]-E^2[W_t]
E[W_t]=E^2[W_t]=0
So
Var(W_t)=E[W_t^2]=t
lpetrich said:Problem 8, partial solution
It is necessary to find the Galois group for the splitting field of ##x^4 - 2x^2 - 2## over Q.
The roots of that equation are ##\pm \sqrt{1 \pm \sqrt{3} } ##. Their four symmetry operations are (identity), (reverse outer square-root sign), (reverse inner square-root sign), and (reverse both square-root signs). The group of these operations is rather obviously ##Z_2 \times Z_2##.
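A quick numerical check of the claimed roots (my own addition, not part of the original post); note that since ##1 - \sqrt{3} < 0##, one pair of roots is purely imaginary:

```python
import numpy as np

def f(z):
    """The polynomial from Problem 8: z^4 - 2z^2 - 2."""
    return z**4 - 2*z**2 - 2

s3 = np.sqrt(3.0)
# the four claimed roots ±sqrt(1 ± sqrt(3)); the pair with 1 - sqrt(3) < 0
# under the outer root is purely imaginary: ±i*sqrt(sqrt(3) - 1)
roots = [np.sqrt(1 + s3), -np.sqrt(1 + s3),
         1j * np.sqrt(s3 - 1), -1j * np.sqrt(s3 - 1)]

print(all(abs(f(z)) < 1e-9 for z in roots))   # True
```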
nuuskur said:Not really my area, but here goes nothing. This is based on the characterisation of Brownian motion under the section titled "Mathematics".
Put [itex]X := \int _0^t W_s^2 ds[/itex]. Based on conditions 1 and 4 of the characterisation we have [itex]W_s \sim \mathcal N(0,s)[/itex], where [itex]s\geq 0[/itex]. Therefore
[tex]
s = \mbox{var}(W_s) = \mathbb E (W_s - \mathbb EW_s)^2 = \mathbb E(W_s^2).
[/tex]
Fubini's theorem allows us to change the order of integration, so we get
[tex]
\mathbb EX = \mathbb E \left ( \int _0^t W_s^2ds\right ) = \int _0^t \mathbb E(W_s^2)ds = \int_0^t sds = \frac{t^2}{2}
[/tex]
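As a numerical sanity check of ##\mathbb E X = t^2/2## (my addition, not part of the argument), a small Monte Carlo simulation; the path count and step size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
t, n_steps, n_paths = 1.0, 500, 20_000
dt = t / n_steps

# Brownian increments and paths sampled on the grid dt, 2dt, ..., t
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

# Riemann-sum approximation of X = int_0^t W_s^2 ds for each path
X = (W ** 2).sum(axis=1) * dt

print(X.mean())   # should be close to t^2/2 = 0.5
```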
Initially, P7 asked for variance, which turns out to be
[tex]
\mbox{var}(X) = \mathbb E (X - \mathbb EX)^2 = \mathbb E\left (X - \frac{t^2}{2}\right )^2 = \mathbb E(X^2) - \frac{t^4}{4}
[/tex]
So we need to calculate the second moment of [itex]X[/itex]. I'm not sure at the moment what's happening here.
nuuskur said:It gets weird, there has to be some kind of algebraic trick involved, which I can't think of.
We could try
[tex]
\mathbb E(W_uW_v)^2 = \mbox{var}(W_uW_v) + \mathbb E^2 (W_uW_v)
[/tex]
This brings more complications, though. By linearity we get
[tex]
v\leq u \implies u-v=\mbox{var}(W_u-W_v) = \mathbb E(W_u-W_v)^2 = u - 2\mathbb E(W_uW_v) + v
[/tex]
from which
[tex]
v\leq u \implies \mathbb E(W_uW_v) = v
[/tex]
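The covariance identity ##v\leq u \implies \mathbb E(W_uW_v) = v## is easy to check numerically (my addition; the values of ##u, v## are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
u, v, n = 2.0, 0.5, 200_000          # v <= u

# build W_u from W_v plus an independent increment: W_u = W_v + (W_u - W_v)
Wv = rng.normal(0.0, np.sqrt(v), size=n)
Wu = Wv + rng.normal(0.0, np.sqrt(u - v), size=n)

print(np.mean(Wu * Wv))              # should be close to min(u, v) = 0.5
```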
not sure how that's helpful, though.
nuuskur said:Here's what comes to mind
For independent random variables, [itex]E(XY) = E(X)E(Y)[/itex]. First expand the expression inside the expected value, writing [itex]W_u = (W_u - W_v) + W_v[/itex]:
[tex]
\begin{align*}
&[(W_u-W_v)^2 + 2(W_u-W_v)W_v + W_v^2]W_v^2 \\
=&(W_u-W_v)^2W_v^2 + 2(W_uW_v -W_v^2)W_v^2 + W_v^4 \\
=&(W_u-W_v)^2W_v^2 + 2W_uW_v^3 - W_v^4
\end{align*}
[/tex]
To get the mgf of Brownian motion, one computes
[tex]
M_W(x) = \mathbb E(e^{xW_s}) = \mbox{exp}\left ( \frac{1}{2}x^2s\right )
[/tex]
The fourth derivative (w.r.t. ##x##) at [itex]x=0[/itex] gives us the fourth moment, which is (if I calculated correctly) [itex]\mathbb E(W_v^4) = 3v^2[/itex].
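The fourth-moment claim can be verified symbolically (a sketch with SymPy; the symbol names are mine):

```python
import sympy as sp

x, s = sp.symbols('x s', positive=True)

# mgf of W_s ~ N(0, s)
M = sp.exp(x**2 * s / 2)

# the fourth derivative at x = 0 is the fourth moment
fourth_moment = sp.simplify(sp.diff(M, x, 4).subs(x, 0))
print(fourth_moment)   # 3*s**2
```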
Since [itex]W_u-W_v, W_v[/itex] are independent (for [itex]v\leq u[/itex]), squaring them preserves independence, so
[tex]
\mathbb E\left [(W_u-W_v)^2W_v^2\right ] = \mathbb E\left [(W_u-W_v)^2\right ]\mathbb E(W_v^2) = (u-v)v
[/tex]
Not sure what is happening with the middle part, though. Does it vanish?
nuuskur said:Hmm, does it hold that [itex]X,Y[/itex] independent implies [itex]X, f(Y)[/itex] independent (for measurable [itex]f[/itex])? I think so, now that I think about it. The sigma algebra generated by [itex]f(Y)[/itex] is contained in the one generated by [itex]Y[/itex].
nuuskur said:I may have computed something incorrectly. I get that [itex]\mbox{var}(X)[/itex] is negative.
nuuskur said:I was assuming
[tex]
\mbox{var}(X) = \mathbb E(X^2) - \frac{t^4}{4} = \int_0^t\int _0^t ((u-v)v - 3v^2) dudv - \frac{t^4}{4}
[/tex]
Having my doubts about the integrand, most likely incorrectly computed the fourth moment.
nuuskur said:*smacks forehead*
[tex]
u\geq v \implies \mathbb E(W_u-W_v)^2\mathbb E(W_v^2) = (u-v)v,
[/tex]
because the second moment of [itex]W_u-W_v[/itex] is its variance. By symmetry it should hold
[tex]v\geq u \implies \mathbb E(W_u-W_v)^2 = v-u
[/tex]
So the inner integral should work out as follows,
[tex]
\int _0^t \mathbb E(W_u^2W_v^2) du = \int _0^v ((v-u)v - 3v^2)du + \int _v^t ((u-v)v -3v^2)du
[/tex]
Still not right :/ I am braindead, making some elementary mistake.
Suppose ##u \geq v##. Then
nuuskur said:Curious, how you arrive at your expression for [itex]\mathbb E(W_u^2W_v^2)[/itex].
nuuskur said:By symmetry
[tex]
v\geq u \implies \mathbb E(W_u^2W_v^2) = (v-u)v + 3v^2 = 4v^2 -uv.
[/tex]
What am I missing?
nuuskur said:Graah. I am lost again.
[tex]
v\geq u \implies \mathbb E(W_u-W_v)^2 = \mathbb E(W_v-W_u)^2 = v-u,
[/tex]
no?
nuuskur said:Nevermind, now I understand what you are applying symmetry to.
nuuskur said:Eventually, the variance ought to be$$
\mbox{var}(X) = \mathbb E(X^2) - \frac{t^4}{4} = \frac{7t^4}{12} - \frac{t^4}{4} = \frac{t^4}{3}$$
P.S.: I just saw that one post has been edited, so maybe this is the reason for these apparently contradictory quotations. If this is the case, please mark edits as such, so that the thread remains readable and contradictions like the above can be explained to readers.
Math_QED said:But unlike you I get
$$E[X^2] = \frac{7t^4}{12}$$
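For the record, the pieces worked out above can be combined into a short sketch (my summary, not from the thread): for ##u \geq v##, ##\mathbb E(W_u^2W_v^2) = (u-v)v + 2\cdot 3v^2 - 3v^2 = uv + 2v^2##, i.e. ##\mathbb E(W_u^2W_v^2) = uv + 2\min(u,v)^2## in general, hence
[tex]
\mathbb E(X^2) = \int_0^t\int_0^t \left ( uv + 2\min(u,v)^2 \right ) du\, dv = \frac{t^4}{4} + \frac{t^4}{3} = \frac{7t^4}{12},
\qquad
\mbox{var}(X) = \frac{7t^4}{12} - \frac{t^4}{4} = \frac{t^4}{3}.
[/tex]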
nuuskur said:The polynomial is irreducible.
nuuskur said:Put [itex]\alpha := \sqrt{1-\sqrt{3}}, \beta := \sqrt{1+\sqrt{3}}[/itex]. Find
[tex]
[\mathbb Q(\alpha,\beta):\mathbb Q] = [\mathbb Q(\alpha,\beta):\mathbb Q(\beta)]\, [\mathbb Q(\beta):\mathbb Q] = 2[\mathbb Q(\alpha,\beta):\mathbb Q(\beta)]
[/tex]
where the degree of [itex]\mathbb Q(\beta) /\mathbb Q[/itex] is clearly [itex]2[/itex], since [itex]1,\beta[/itex] are linearly independent over [itex]\mathbb Q[/itex].
nuuskur said:Wait, are we supposed to compute modulo the polynomial? It's irreducible by Eisenstein (take ##p=2##).
So we'd need ##\{1,\beta,\beta ^2,\beta ^3\}## to be a basis of ##\mathbb Q(\beta)## over ##\mathbb Q##?
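Both the irreducibility and the degree can be checked with SymPy (a sketch; my own addition):

```python
import sympy as sp

x = sp.symbols('x')
beta = sp.sqrt(1 + sp.sqrt(3))

# irreducible over Q (Eisenstein at p = 2 also shows this)
p = sp.Poly(x**4 - 2*x**2 - 2, x)
print(p.is_irreducible)            # True

# the minimal polynomial of beta over Q has degree 4,
# so {1, beta, beta^2, beta^3} is a Q-basis of Q(beta)
m = sp.minimal_polynomial(beta, x)
print(sp.degree(m, x))             # 4
```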