Definite integral over random interval

1. Oct 7, 2009

benjaminmar8

Hi, all,

Assume a and b are random variables whose pdfs f(a) and f(b) are known. How do I evaluate the definite integral $$v=\int\limits_{a}^{b} g(x)\, dx,$$ where g(x) is a known function of x? Equivalently, how do I find the pdf of v?

Thanks a lot.

2. Oct 7, 2009

EnumaElish

I think the answer is "use the fundamental theorem of calculus." Operationally, suppose each of a and b is Bernoulli with probabilities p and q: Pr[a = a1] = 1 - Pr[a = a2] = p, Pr[b = b1] = 1 - Pr[b = b2] = q.

Then:
v = G[b1] - G[a1] with prob. pq,
v = G[b2] - G[a1] with prob. p(1-q),
v = G[b1] - G[a2] with prob. (1-p)q,
v = G[b2] - G[a2] with prob. (1-p)(1-q),

where "the integral of g from a to b" = G[b] - G[a].
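The enumeration above can be sketched numerically. This is a minimal illustration, assuming g(x) = x² (so G(x) = x³/3) and some hypothetical two-point supports and probabilities for a and b; none of these values come from the thread.

```python
def G(x):
    return x**3 / 3.0  # antiderivative of g(x) = x**2

# Hypothetical two-point supports and probabilities (illustration only).
a_vals, p = (0.0, 1.0), 0.6   # Pr[a = 0.0] = 0.6, Pr[a = 1.0] = 0.4
b_vals, q = (2.0, 3.0), 0.7   # Pr[b = 2.0] = 0.7, Pr[b = 3.0] = 0.3

# Distribution of v = G(b) - G(a): four outcomes with product probabilities.
dist = {}
for a, pa in zip(a_vals, (p, 1 - p)):
    for b, pb in zip(b_vals, (q, 1 - q)):
        v = G(b) - G(a)
        dist[v] = dist.get(v, 0.0) + pa * pb

print(dist)  # four (v, probability) pairs summing to 1
```

The same enumeration works for any discrete a and b; the continuous case is handled in the later posts.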

3. Oct 8, 2009

bpet

Another approach is to first work out the cdf P[v <= y]. To do this (assuming a and b are independent) it might be helpful to sketch the 2d region of points (a, b) where G[b] - G[a] <= y.
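This cdf can be estimated by sampling the 2d region directly. A Monte Carlo sketch, assuming (purely for illustration) that a and b are independent standard normals and g(x) = x²:

```python
import random

def G(x):
    return x**3 / 3.0  # antiderivative of g(x) = x**2

def cdf_v(y, n=200_000, seed=0):
    """Estimate P[v <= y] = P[G(b) - G(a) <= y] by Monte Carlo."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        a = rng.gauss(0.0, 1.0)
        b = rng.gauss(0.0, 1.0)
        if G(b) - G(a) <= y:   # point (a, b) lies in the region
            hits += 1
    return hits / n

print(cdf_v(0.0))  # ~0.5, since a and b are exchangeable here
```

For distributions where the region's boundary G(b) - G(a) = y can be solved explicitly, the same double integral can of course be done analytically instead of by sampling.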

4. Oct 9, 2009

winterfors

If we define
$$y(a,b)=\int\limits_{a}^{b} g(x)\, dx$$
we can state the pdf over the joint space of variables a, b and v as
$$f(a,b,v) = \delta\left(v-y(a,b)\right)f(a)f(b)$$
where $\delta$ is the Dirac delta function.

The pdf $f(v)$ of v is then the marginal of $f(a,b,v)$ with respect to a and b
$$f(v) = \int\limits_{a=-\infty}^{\infty} \int\limits_{b=-\infty}^{\infty} \delta\left(v-y(a,b)\right) f(a)f(b) db da$$

The integral of g(x) can, as stated above, be evaluated using its antiderivative G(x):
$$\int\limits_{a}^{b} g(x)\, dx = G(b)-G(a) \quad \text{if} \quad b \geq a,$$
$$\int\limits_{a}^{b} g(x)\, dx = G(a)-G(b) \quad \text{if} \quad a \geq b$$

We thus need to account for the two possible orderings of a and b:
$$f(v) = \int\limits_{a=-\infty}^{\infty} \int\limits_{b=-\infty}^{a} \delta\left(v-G(a)+G(b)\right) f(a)f(b) db da +\int\limits_{a=-\infty}^{\infty} \int\limits_{b=a}^{\infty} \delta\left(v-G(b)+G(a)\right) f(a)f(b) db da$$
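This marginal can be checked numerically: sampling a and b from their pdfs and histogramming v is equivalent to integrating out the Dirac delta. A sketch assuming (for illustration only) that a and b are independent Uniform(0, 1) and g(x) = 2x, so G(x) = x²:

```python
import random

def G(x):
    return x * x  # antiderivative of g(x) = 2*x

def sample_v(n=100_000, seed=0):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        a = rng.random()
        b = rng.random()
        # Integral over the interval between a and b, per the two cases above
        # (G is increasing here, so this is G(b)-G(a) if b >= a, else G(a)-G(b)).
        out.append(abs(G(b) - G(a)))
    return out

vs = sample_v()
# Crude histogram estimate of f(v) on [0, 1).
bins = 10
hist = [0] * bins
for v in vs:
    if v < 1.0:
        hist[int(v * bins)] += 1
density = [h * bins / len(vs) for h in hist]
print(density)  # estimated f(v) per bin; mass concentrates at small v
```

Comparing such a histogram against an analytic evaluation of the double integral above is a useful sanity check on the case-splitting.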

Last edited: Oct 9, 2009