Conditional PDF

  1. Jun 10, 2010 #1
    Dear all,

    How can I find the following conditional probability density: [tex]f_Y(y/\gamma_f\leq \gamma_0)[/tex], where [tex]Y=\gamma_f+\min(\gamma_h,\gamma_g)[/tex] and the gammas with subscripts f, g, h are i.i.d. exponential random variables.

    Thanks in advance
     
  3. Jun 12, 2010 #2
    I would start out by working with cumulative distribution functions, and then take the derivative to get the density function. For instance, I would first find the cumulative distribution function for the random variable M = min(X_g, X_h), and then get the density from that. Then I would use the density for M together with the densities for X_f and X_0 to get the cumulative distribution function for Y given X_f <= X_0.
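
    If it helps, here is a quick Monte Carlo sketch of that plan (just a numerical sanity check, not the derivation itself). It assumes gamma_0 is exponential with the same mean as the others, as in this post; the mean and sample size are placeholder values.

[code]
# Monte Carlo sketch: simulate the gammas, keep the samples with X_f <= X_0,
# and histogram Y = X_f + min(X_g, X_h).  mean_gamma and n are placeholders.
import numpy as np

rng = np.random.default_rng(0)
mean_gamma = 1.0
n = 1_000_000

x_f = rng.exponential(mean_gamma, n)
x_g = rng.exponential(mean_gamma, n)
x_h = rng.exponential(mean_gamma, n)
x_0 = rng.exponential(mean_gamma, n)

m = np.minimum(x_g, x_h)       # M = min(X_g, X_h)
y = x_f + m                    # Y = X_f + M
y_cond = y[x_f <= x_0]         # keep only samples where X_f <= X_0

# a normalized histogram approximates the conditional density f_Y(y | X_f <= X_0)
hist, edges = np.histogram(y_cond, bins=100, density=True)
print(edges[:3], hist[:3])
[/code]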

    I'm guessing you are also doing this problem with the exponential distribution first in anticipation of doing it with more complicated distributions later. As for your other question about moment generating functions, I don't know if there is a general way to relate the MGF of order statistics to the MGF of the i.i.d. random variables. It sounds difficult.
     
  4. Jun 12, 2010 #3

    Exactly. The problem is that I have worked it out and got a result different from the one in the paper I am reading. The authors divided the function into two intervals: [tex]\gamma\leq \gamma_0[/tex] and [tex]\gamma>\gamma_0[/tex].
     
  5. Jun 12, 2010 #4
    Hmm. Did you work out an answer for the case of exponentially distributed RVs? For what it is worth, I got

    [tex]f_Y(y|\gamma_f \leq \gamma_0) = \frac{4y}{\overline{\gamma}^2}e^{-\frac{2y}{\overline{\gamma}}}[/tex]
     
  6. Jun 13, 2010 #5
    Actually, the answer in the paper is:

    [tex]f_Y\left(y/\gamma_f\leq\gamma_0\right)=
    \begin{cases}
    \frac{\text{e}^{-y/\overline{\gamma}}-\text{e}^{-y/\overline{\gamma}_f}}{(\overline{\gamma}-\overline{\gamma}_f)\,\left(1-\text{e}^{-\gamma_0/\overline{\gamma}_f}\right)} & \text{if}\,\,\, y\leq\gamma_0\\
    \frac{1-\text{e}^{-\gamma_0(1/\overline{\gamma}_f-1/\overline{\gamma})}}{(\overline{\gamma}-\overline{\gamma}_f)\,\left(1-\text{e}^{-\gamma_0/\overline{\gamma}_f}\right)}\,\text{e}^{-y/\overline{\gamma}} & \text{if}\,\,\,y>\gamma_0
    \end{cases}[/tex]

    where [tex]\overline{\gamma}[/tex] is the mean of [tex]\min(\gamma_h,\,\gamma_g)[/tex] and [tex]\overline{\gamma}_f[/tex] is the mean of [tex]\gamma_f[/tex].
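
    For reference, here is a quick Monte Carlo sketch that checks this piecewise expression numerically, treating [tex]\gamma_0[/tex] as a constant (the numerical values are placeholders only):

[code]
# Numerical sanity check of the paper's piecewise density (a sketch only).
# gamma_0 is a constant; mean_f is the mean of gamma_f, and the mean of
# min(gamma_h, gamma_g) is mean_f / 2.  All numbers below are placeholders.
import numpy as np

rng = np.random.default_rng(1)
mean_f, gamma_0 = 1.0, 0.8
mean_min = mean_f / 2.0
n = 2_000_000

g_f = rng.exponential(mean_f, n)
m = np.minimum(rng.exponential(mean_f, n), rng.exponential(mean_f, n))
y_cond = (g_f + m)[g_f <= gamma_0]    # samples of Y given gamma_f <= gamma_0

def paper_pdf(y):
    norm = (mean_min - mean_f) * (1.0 - np.exp(-gamma_0 / mean_f))
    low = (np.exp(-y / mean_min) - np.exp(-y / mean_f)) / norm
    high = ((1.0 - np.exp(-gamma_0 * (1.0 / mean_f - 1.0 / mean_min)))
            * np.exp(-y / mean_min) / norm)
    return np.where(y <= gamma_0, low, high)

hist, edges = np.histogram(y_cond, bins=200, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
ys = np.array([0.2, 0.5, 1.0, 2.0])
print(np.interp(ys, centers, hist))   # empirical conditional density
print(paper_pdf(ys))                  # paper's formula at the same points
[/code]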

    Regards
     
  7. Jun 13, 2010 #6
    *** I just thought of this: does gamma_0 have the same distribution as the other gammas? I assumed that in my answer above and in what follows below. ***

    Is that answer supposed to be for the exponential distribution? I don't know how they got that, but I'm not saying it is incorrect. Also, if these are identically distributed random variables, then [tex]\overline{\gamma}_f = \overline{\gamma}_0 = 2\overline{\gamma} [/tex], which would simplify those expressions. I also don't see why it needs to be separated into two intervals of y. Do the authors explain any of this or do they present the answer out of thin air?

    Here's how I got my answer. I am going to use X for the random variable so I don't have to keep typing out "gamma". Also, I will use the rate [tex]\lambda = 1/\overline{\gamma}[/tex], the inverse mean. So the density function for the [tex]X_i[/tex] is


    [tex]f_{X_i}(x) = \lambda e^{-\lambda x}[/tex]

    First of all, try to find the cumulative distribution function for Y given X_f <= X_0. This is

    [tex]P\{Y \leq y | X_f \leq X_0\} = \frac{P\{Y \leq y , X_f \leq X_0\}}{P\{X_f \leq X_0\}}[/tex]

    I assumed that X_f and X_0 are identically distributed, in which case the probability in the denominator is just 1/2. But if that assumption was wrong, I don't think it is too hard to modify. So basically I'm interested in finding [tex]P\{Y \leq y , X_f \leq X_0\}[/tex]. First I need to find the cumulative distribution function for the random variable M = min(X_h, X_g):

    [tex]P\{M \leq x\} = 1 - P\{M > x\} = 1 - P\{X_h>x\}P\{X_g>x\}[/tex]
    [tex]P\{M \leq x\} = 1 - e^{-2\lambda x}[/tex] using i.i.d.

    So the density for M is

    [tex]f_M(x) = 2\lambda e^{-2\lambda x}[/tex], exponential with mean half that of the X's. Now to find

    [tex]P\{X_f + M \leq y , X_f \leq X_0\}[/tex], I use the joint density for M, X_f, and X_0, which by independence is

    [tex]f_{X_0, X_f, M}(r,s,t) = f_{X_0}(r)f_{X_f}(s) f_{M}(t) = \lambda e^{-\lambda r} \lambda e^{-\lambda s} 2\lambda e^{-2\lambda t}[/tex]

    Finally, compute the integral

    [tex]P\{X_f + M \leq y , X_f \leq X_0\} = \int_{t=0}^y\int_{s=0}^{y-t}\int_{r=s}^{\infty}f_{X_0, X_f, M}(r,s,t)drdsdt = -\lambda y e^{-2\lambda y} + \frac{1}{2}(1-e^{-2\lambda y})[/tex]

    Differentiating this and multiplying by 2 (from my assumption that P{X_f <= X_0} = 1/2) gives the answer

    [tex]f_Y(y| X_f \leq X_0) = 4\lambda ^2 y e^{-2 \lambda y} [/tex]
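
    If you want to double-check the triple integral and the final density symbolically, here is a small sympy sketch (a verification aid only):

[code]
# Symbolic double-check of the triple integral and the final conditional
# density above, using sympy.
import sympy as sp

lam, r, s, t, y = sp.symbols('lambda r s t y', positive=True)

# joint density of (X_0, X_f, M) = f_{X_0}(r) f_{X_f}(s) f_M(t)
joint = lam*sp.exp(-lam*r) * lam*sp.exp(-lam*s) * 2*lam*sp.exp(-2*lam*t)

# P{X_f + M <= y, X_f <= X_0}: r from s to oo, s from 0 to y-t, t from 0 to y
prob = sp.simplify(sp.integrate(joint, (r, s, sp.oo), (s, 0, y - t), (t, 0, y)))
print(prob)   # should equal -lambda*y*exp(-2*lambda*y) + (1 - exp(-2*lambda*y))/2

# conditional density: differentiate w.r.t. y and divide by P{X_f <= X_0} = 1/2
dens = sp.simplify(2*sp.diff(prob, y))
print(dens)   # should equal 4*lambda**2*y*exp(-2*lambda*y)
[/code]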
     
  8. Jun 14, 2010 #7
    Excuse me, but [tex]\gamma_0[/tex] is just a constant, and the random variables [tex]\gamma_f,\,\gamma_h,\,\text{and}\,\gamma_g[/tex] are i.i.d., so [tex]\overline{\gamma}_f=\overline{\gamma}_h=\overline{\gamma}_g[/tex].

    I am thinking about it in another way:

    [tex]\text{Pr}\left[X_f+M\leq x,\,X_f\leq x_0\right]=\text{Pr}\left[2\,X_f+M\leq x+x_0\right][/tex]

    given that x and x_0 must be >= 0.

    I don't know if it is a valid approach or not.

    Regards
     
  9. Jun 14, 2010 #8
    Ah, okay. I was confused about that. But this makes the problem easier, I think.

    That was actually my first thought, too. But it is incorrect for the following reason: although

    [tex]X_f + M \leq y \text{ and } X_f \leq x_0[/tex] implies [tex]2X_f + M \leq y+x_0[/tex] ,

    the converse is not true. For example, you could have [tex]X_f + M = y + \epsilon[/tex], where epsilon is some positive number less than x_0. Then for values of X_f satisfying

    [tex]X_f \leq x_0 - \epsilon[/tex] ,

    it would be true that [tex]2X_f + M \leq y+x_0[/tex], even though [tex]X_f + M > y[/tex].

    Now that I know x_0 is a constant, I see why the answer is broken up into two intervals. You get the probability [tex]P\{X_f + M \leq y,\, X_f \leq x_0\}[/tex] by integrating the joint density [tex]f_{X_f, M}(r,s)[/tex] over a certain region in the r-s plane. The shape of that region depends on whether y is less than x_0 or greater than x_0. For the first case, the region is an equilateral right triangle. For the latter case, it is a quadrilateral (same triangle, but with one corner cut off).
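
    In symbols, with r for X_f and s for M, the region is

    [tex]R_y=\left\{(r,s):\ r\geq 0,\ s\geq 0,\ r+s\leq y,\ r\leq x_0\right\}[/tex]

    For [tex]y\leq x_0[/tex] the constraint [tex]r\leq x_0[/tex] is never active, so [tex]R_y[/tex] is the right triangle with legs of length y; for [tex]y> x_0[/tex] the corner beyond [tex]r=x_0[/tex] is cut off, leaving the quadrilateral.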


    EDIT: "equilateral right triangle" should be "isoscoles right triangle": two sides equal, not all three. haha :redface:
     
    Last edited: Jun 14, 2010
  10. Jun 17, 2010 #9
    Were you able to get the authors' answer?
     
  11. Jun 17, 2010 #10
    Not really. Have you?
     
  12. Jun 17, 2010 #11
    Yes. The joint density function for X_f and M is [tex]f_{X_f, M}(r,s) = \lambda_f e^{-\lambda_f r} \lambda_M e^{-\lambda_M s}[/tex], and

    [tex]P\{X_f+M \leq y, X_f \leq x_0\} = \int_{r=0}^{y}\int_{s=0}^{y-r}f_{X_f, M}(r,s)dsdr[/tex]

    for y <= x_0, and

    [tex]P\{X_f+M \leq y, X_f \leq x_0\} = \int_{r=0}^{x_0}\int_{s=0}^{y-r}f_{X_f, M}(r,s)dsdr[/tex]

    for y > x_0.

    M is also an exponential random variable, independent of X_f. When I converted back to the notation of the authors, [tex]\lambda_f = 1/\overline{\gamma}_f[/tex] and [tex]\lambda_M = 1/\overline{\gamma}[/tex], I got the same answer as you posted in post #5. EDIT: That is, of course, after taking the derivative w.r.t. y and dividing by [tex]1-e^{-\lambda_f x_0}[/tex].
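
    If it is useful, here is a sympy sketch that carries out those two integrals, differentiates with respect to y, divides by [tex]1-e^{-\lambda_f x_0}[/tex], and compares the result with the paper's piecewise answer from post #5 (the numerical rates at the end are placeholders):

[code]
# Carry out the two integrals above, differentiate in y, divide by
# 1 - exp(-lambda_f*x0), and compare with the paper's piecewise answer,
# using gamma_bar = 1/lambda_M and gamma_bar_f = 1/lambda_f.
import sympy as sp

lam_f, lam_M, r, s, y, x0 = sp.symbols('lambda_f lambda_M r s y x_0', positive=True)
joint = lam_f*sp.exp(-lam_f*r) * lam_M*sp.exp(-lam_M*s)

P_low  = sp.integrate(joint, (s, 0, y - r), (r, 0, y))    # case y <= x0
P_high = sp.integrate(joint, (s, 0, y - r), (r, 0, x0))   # case y > x0

norm   = 1 - sp.exp(-lam_f*x0)        # P{gamma_f <= gamma_0}
f_low  = sp.diff(P_low, y) / norm
f_high = sp.diff(P_high, y) / norm

# paper's answer from post #5, rewritten with the rates
gb, gbf = 1/lam_M, 1/lam_f
den = (gb - gbf) * (1 - sp.exp(-x0/gbf))
paper_low  = (sp.exp(-y/gb) - sp.exp(-y/gbf)) / den
paper_high = (1 - sp.exp(-x0*(1/gbf - 1/gb))) * sp.exp(-y/gb) / den

# the differences should vanish; check numerically at placeholder rates
num = {lam_f: 1.0, lam_M: 2.0, x0: 0.8}
print(sp.N((f_low  - paper_low ).subs(num).subs(y, 0.3)))   # ~0 (y <= x0 branch)
print(sp.N((f_high - paper_high).subs(num).subs(y, 1.5)))   # ~0 (y > x0 branch)
[/code]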

     
    Last edited: Jun 17, 2010
  13. Jun 18, 2010 #12
    It makes sense. That is great. Thanks for your help.

    Regards
     
  14. Jun 18, 2010 #13
    You're welcome. Also, if you ever discover the general approach to that problem about moment generating functions, please post it. I'm curious now...
     
  15. Jun 18, 2010 #14
    Ok, I will. Thanks
     