Recent content by jason1995

  1. Statistics: E(X) = \int_0^\infty (1 - F(x))\,dx

    \begin{align*}
    E[X] &= E\bigg[\int_0^X 1\,dx\bigg]\\
    &= E\bigg[\int_0^\infty 1_{\{X>x\}}\,dx\bigg]\\
    &= \int_0^\infty E[1_{\{X>x\}}]\,dx\\
    &= \int_0^\infty P(X > x)\,dx\\
    &= \int_0^\infty (1 - F(x))\,dx
    \end{align*}
    By the way, this formula is true no matter what kind of random...
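    The interchange of expectation and integral in the third line is justified by Tonelli's theorem, since the integrand is nonnegative. As a quick sanity check of the identity, take X exponential with rate \lambda, so that F(x) = 1 - e^{-\lambda x}:
    \begin{align*}
    \int_0^\infty (1 - F(x))\,dx = \int_0^\infty e^{-\lambda x}\,dx = \frac1\lambda = E[X].
    \end{align*}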
  2. Autocorrelation of white noise.

    White noise cannot be defined rigorously in any of these ways. White noise does not exist as a stochastic process, in the same way that the Dirac delta function does not exist as a function. There is no (measurable) continuous time stochastic process X that satisfies E[X(t)] = 0, var(X(t)) =...
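    What one can write down is the formal autocorrelation, understood in the sense of distributions, together with its discrete-time analogue, which is a perfectly honest stochastic process. A sketch, writing \sigma^2 for the noise intensity:
    \begin{align*}
    R_X(\tau) &= E[X(t)X(t+\tau)] = \sigma^2\,\delta(\tau) && \text{(continuous time, formal)}\\
    R_X(k) &= E[X_n X_{n+k}] = \sigma^2\,1_{\{k=0\}} && \text{(i.i.d.\ mean-zero sequence)}
    \end{align*}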
  3. PDF of function of 3 continuous, uniform random variables?

    We can do this the same way. If w\in(0,1), then
    \begin{align*}
    P(W\le w) &= P(XY \le w)\\
    &= E[P(XY\le w \mid Y)]\\
    &= E\left[{P\left({X\le\frac wY\mid Y}\right)}\right].
    \end{align*}
    If Y\le w, then the probability is 1; otherwise, it is w/Y. Thus,
    \begin{align*}
    P(W\le w) &=...
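    Assuming, as the setup indicates, that X and Y are independent Uniform(0,1), carrying the computation through gives
    \begin{align*}
    P(W\le w) &= P(Y\le w) + E\!\left[\frac wY\,1_{\{Y>w\}}\right]\\
    &= w + \int_w^1 \frac wy\,dy\\
    &= w - w\ln w,
    \end{align*}
    so differentiating gives the density f_W(w) = -\ln w on (0,1).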
  4. PDF of function of 3 continuous, uniform random variables?

    Fix x in (0,1). We could start with P(W^Z ≤ x) = E[P(W^Z ≤ x | W)]. Since Z and W are independent, we can calculate P(W^Z ≤ x | W) by treating W as a constant. In this case, if W > x, then the probability is 0. Otherwise, W^Z ≤ x iff Z ≥ ln(x)/ln(W), which has probability 1 - ln(x)/ln(W)...
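    Averaging over W then finishes it. If, as the previous post suggests, W = XY with density f_W(w) = -\ln w on (0,1) and Z is an independent Uniform(0,1), then
    \begin{align*}
    P(W^Z\le x) &= \int_0^x \left(1 - \frac{\ln x}{\ln w}\right)(-\ln w)\,dw\\
    &= \int_0^x (\ln x - \ln w)\,dw\\
    &= x\ln x + (x - x\ln x) = x,
    \end{align*}
    so (XY)^Z is itself Uniform(0,1).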
  5. Expectation of a function of a continuous random variable

    Yes. Consider the simplest case, where g(x) = 1 if x is in [a,b] and 0 otherwise. Then
    \begin{align*}
    E[W] &= 1\cdot P(W = 1) + 0\cdot P(W = 0)\\
    &= P(g(X) = 1)\\
    &= P(X\in [a,b])\\
    &= \int_a^b f_X(x)\,dx.
    \end{align*}
    On the other hand, the formula also gives \int_{-\infty}^\infty...
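    The general case follows by approximating g by linear combinations of such indicators (simple functions) and passing to the limit. As a concrete check, take X exponential with rate 1, so f_X(x) = e^{-x}, and g = 1_{[0,1]}; both routes give the same number:
    \begin{align*}
    E[g(X)] = P(0\le X\le 1) = 1 - e^{-1} = \int_0^1 e^{-x}\,dx = \int_{-\infty}^\infty g(x)f_X(x)\,dx.
    \end{align*}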