
Wiener Process

  Oct 19, 2013 #1
    1. The problem statement, all variables and given/known data
    A Wiener Process W(t) is a stochastic process with:
    W(0) = 0
    Trajectories almost surely continuous
    Independent increments, meaning that for all t1 ≤ t2 ≤ t3 ≤ t4, (W(t2) - W(t1)) is independent of (W(t4) - W(t3))
    For t ≤ s, W(s) - W(t) follows a centered normal law with variance (s - t).
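
    Not part of the given data, but as a quick sanity check of the last two properties I simulated the increments directly (a minimal Python/numpy sketch; the values of t and s are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
t, s = 1.0, 2.5
# Build W(t) and W(s) from independent Gaussian increments, with W(0) = 0
Wt = rng.normal(0.0, np.sqrt(t), n)           # W(t) - W(0) ~ N(0, t)
Ws = Wt + rng.normal(0.0, np.sqrt(s - t), n)  # W(s) - W(t) ~ N(0, s - t)
print(np.var(Ws - Wt))  # ≈ s - t = 1.5, as the definition requires
```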

    2. Relevant equations
    a) Is the process W(t) Markovian?
    b) For s,t > 0, what is the law of the random variable W(s) + W(s+t)?
    c) For u,v,w > 0, calculate E[W(u)W(u+v)W(u+v+w)]
    d) Calculate the autocovariance function of the process exp(-t)W(exp(2t))

    3. The attempt at a solution

    a) Is the process W(t) Markovian?

    A process is Markovian if:
    P[X(tn) ∈ Bn | X(tn−1) ∈ Bn−1, ... , X(t1) ∈ B1] = P[X(tn) ∈ Bn | X(tn−1) ∈ Bn−1]

    where each Bi is a Borel set of the real numbers (an element of the Borel σ-algebra).

    By the data given:
    P[W(t2) - W(t1) ∈ B1 | W(t4) - W(t3) ∈ B2] = P[W(t2) - W(t1) ∈ B1]
    for all t1 ≤ t2 ≤ t3 ≤ t4.

    I don't know how to proceed from here.

    The answer given is yes.
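
    Not a proof, just a numerical illustration of the property (my own sketch, assuming numpy; the times 1, 2, 3 are arbitrary): the best linear predictor of W(3) from (W(1), W(2)) should put all of its weight on W(2).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# W(1), W(2), W(3) built from independent unit-variance increments
W1 = rng.normal(0.0, 1.0, n)
W2 = W1 + rng.normal(0.0, 1.0, n)
W3 = W2 + rng.normal(0.0, 1.0, n)
# Least-squares coefficients of W(3) on (W(1), W(2))
coef, *_ = np.linalg.lstsq(np.column_stack([W1, W2]), W3, rcond=None)
print(coef)  # ≈ [0, 1]: once W(2) is known, W(1) adds nothing
```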

    b) For s,t > 0, what is the law of the random variable W(s) + W(s+t)?

    W(s) and W(s+t) are jointly Gaussian (each is a sum of independent Gaussian increments), so W(s) + W(s+t) is Gaussian.

    Mean[W(s) + W(s + t)] = Mean[W(s)] + Mean[W(s+t)]

    Mean[W(s) - W(t)] = 0
    Mean[W(s)] = Mean[W(t)] which means that the mean is constant
    Mean[W(0)] = 0 → Mean[W(s)] = 0 → Mean[W(s) + W(s+t)] = 0

    Var[W(s) - W(t)] = s - t for t ≤ s
    For t = 0:
    Var[W(s) - W(0)] = s - 0
    Var[W(s)] = s

    σ² = Var[W(s) + W(s+t)]
    Since the mean is zero, σ² = E{[W(s) + W(s+t)]²}. I compute the lagged correlation E{[W(s) + W(s+t)]*[W(s+T) + W(s+t+T)]}, of which the variance is the T = 0 case:
    σ² = E[W(s)W(s+T) + W(s)W(s+t+T) + W(s+t)W(s+T) + W(s+t)W(s+t+T)]
    For T ≥ 0, E[W(s)W(s+T)] = s and E[W(s+t)W(s+t+T)] = s + t (by the increment decomposition used below), so
    σ² = 2s + t + E[W(s)W(s+t+T)] + E[W(s+t)W(s+T)]

    W(s+t+T) = [W(s+t+T) - W(s+T)] + W(s+T)
    E[W(s)W(s+t+T)]
    = E{[W(s+t+T) - W(s+T)]*W(s)} + E[W(s+T)W(s)]
    With t1 = 0, t2 = s, t3 = s+T, t4 = s+t+T we have t1 ≤ t2 ≤ t3 ≤ t4 (not sure of what I'm doing here, since T could be negative), so the two increments are independent and the first expectation factors:
    = E[W(s+t+T) - W(s+T)]*E[W(s)] + E[W(s+T)W(s)]
    = 0 + s = s, writing W(s+T) = [W(s+T) - W(s)] + W(s) in the same way

    σ² = 3s + t + E[W(s+t)W(s+T)]

    I tried similar substitutions, but they don't seem to work on this last expectation.

    The given answer is σ² = 4s + t
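
    A quick Monte Carlo check of this value (my own sketch, assuming numpy; s = 1 and t = 0.5 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
s, t = 1.0, 0.5
Ws  = rng.normal(0.0, np.sqrt(s), n)       # W(s)
Wst = Ws + rng.normal(0.0, np.sqrt(t), n)  # W(s+t) = W(s) + independent N(0, t)
print(np.var(Ws + Wst))  # ≈ 4s + t = 4.5
```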

    c) For u,v,w > 0, calculate E[W(u)W(u+v)W(u+v+w)]

    E[W(u)W(u+v)W(u+v+w)]
    W(u) = w1
    W(u+v) = w2
    W(u+v+w) = w3
    ∫ ∫ ∫ w1*w2*w3*p(w1,w2,w3) dw3 dw2 dw1

    p(w1,w2,w3) = p(w3|w2,w1)*p(w2|w1)*p(w1)
    Markov process, and u ≤ u+v ≤ u+v+w → p(w3|w2,w1) = p(w3|w2)

    ∫ ∫ ∫ w1*w2*w3*p(w3|w2)*p(w2|w1)*p(w1) dw3 dw2 dw1
    ∫ w1 ∫ w2 ∫ w3*p(w3|w2) dw3 p(w2|w1) dw2 p(w1) dw1

    ∫ w3*p(w3|w2) dw3 = E[w3 | w2]
    The increment w3 - w2 = W(u+v+w) - W(u+v) has zero mean and is independent of w2, so
    E[w3 | w2] = w2 + E[w3 - w2] = w2
    ∫ w3*p(w3|w2) dw3 = w2

    ∫ w1 ∫ w2² p(w2|w1) dw2 p(w1) dw1

    ∫ w2² p(w2|w1) dw2 = E[w2² | w1] = Var[w2|w1] + E[w2|w1]²
    = v + w1²

    ∫ w1*(v + w1²) p(w1) dw1
    ∫ w1*v p(w1) dw1 + ∫ w1³ p(w1) dw1
    v*mean[w1] + ∫ w1³ p(w1) dw1

    v*mean[w1] = 0
    ∫ w1³*p(w1) dw1 = 0 because w1³*p(w1) is an odd function

    E[W(u)W(u+v)W(u+v+w)] = 0

    The result matches the given answer, but I don't know whether there are other approaches, or whether my whole solution is correct.
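
    A numerical check of the whole computation (my own sketch, assuming numpy; the values of u, v, w are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
u, v, w = 0.7, 1.2, 0.4
Wu   = rng.normal(0.0, np.sqrt(u), n)
Wuv  = Wu + rng.normal(0.0, np.sqrt(v), n)
Wuvw = Wuv + rng.normal(0.0, np.sqrt(w), n)
print(np.mean(Wu * Wuv * Wuvw))  # ≈ 0, matching the given answer
```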

    PS: Calculations for conditional variance var[w2|w1]

    W(u+v) - W(u) ~ N(0, v)
    By independence of increments, conditioning on W(u) - W(0) = a changes nothing:
    W(u+v) - W(u) | W(u) = a ~ N(0, v)
    W(u+v) | W(u) = a ~ N(a, v), since adding the constant a (the value of W(u) here) shifts the mean but not the variance.
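
    This conditional law can also be checked numerically (my own sketch, assuming numpy; conditioning is approximated by keeping samples with W(u) close to a):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2_000_000
u, v, a = 1.0, 0.5, 0.8
Wu  = rng.normal(0.0, np.sqrt(u), n)
Wuv = Wu + rng.normal(0.0, np.sqrt(v), n)
sel = np.abs(Wu - a) < 0.01  # crude conditioning on W(u) ≈ a
print(np.mean(Wuv[sel]), np.var(Wuv[sel]))  # ≈ a = 0.8 and ≈ v = 0.5
```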

    d) Calculate the autocovariance function of the process exp(-t)W(exp(2t))

    Here I could only do it by assuming beforehand that the process is wide-sense stationary and that the conditional probabilities are all Gaussian.

    X(t) = exp(-t)W(exp(2t))
    As before, the mean is identically zero, so the autocovariance is E[X(0)X(T)], T ≥ 0
    X(0) = x1
    X(T) = x2

    ∫∫ x1*x2*p(x1,x2) dx1 dx2
    ∫∫ x1*x2*p(x2|x1)*p(x1) dx1 dx2
    ∫ x1 ∫ x2*p(x2|x1) dx2 p(x1) dx1

    p(x) = p(exp(-t)*W(exp(2t)))

    p(W(exp(2t))) = N(0,exp(2t))
    mean(a*x) = a*mean(x), variance(a*x) = a²*variance(x)
    p(x) = p(exp(-t)*W(exp(2t))) = N(0,1)

    p(x2|x1) = p( X(t2) | X(t1) = x1 )
    t2 = T, t1 = 0:
    p( X(T) | X(0) = x1 )
    p( exp(-T)*W(exp(2T)) | W(1) = x1 )

    p(W(exp(2T)) | W(1) = x1) = N(x1, exp(2T) - 1)
    p( exp(-T)*W(exp(2T)) | W(1) = x1 ) = N(x1*exp(-T), 1 - exp(-2T))

    p(x1) = N(0,1)
    p(x2|x1) = N(x1*exp(-T), 1 - exp(-2T))

    ∫ x1 ∫ x2*p(x2|x1) dx2 p(x1) dx1

    ∫ x2*p(x2|x1) dx2 = E[x2|x1] = x1*exp(-T)

    ∫ x1²*exp(-T)*p(x1) dx1
    exp(-T) ∫ x1²*p(x1) dx1 = exp(-T)*Var(x1) = exp(-T)

    The given answer is exp(-|T|), but again I don't know how to show that the autocovariance is symmetric in T.
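
    One more numerical check, for T ≥ 0 only, which is the case computed above (my own sketch, assuming numpy; t and T are arbitrary, and stationarity shows up in the result not depending on t):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
t, T = 0.3, 0.9
s1, s2 = np.exp(2 * t), np.exp(2 * (t + T))
W1 = rng.normal(0.0, np.sqrt(s1), n)            # W(exp(2t))
W2 = W1 + rng.normal(0.0, np.sqrt(s2 - s1), n)  # W(exp(2(t+T)))
X1 = np.exp(-t) * W1                            # X(t)
X2 = np.exp(-(t + T)) * W2                      # X(t+T)
print(np.mean(X1 * X2))  # ≈ exp(-T) ≈ 0.4066
```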
     