Proof with regards to cumulative distribution function

  1. Jul 27, 2008 #1
    Hey guys

    I'd like a steer in the right direction with this problem.
    I would like to show that
    [tex]P\{x_1\leq X \leq x_2\}=F_{X}(x_2)-F_{X}(x_1^{-})\quad(1)[/tex]

    Where:
    [tex]X[/tex] is a random variable.
    [tex]F_{X}(x) \equiv P\{X \leq x \} [/tex] is its cumulative distribution function.

    My notes only give an example (using dice) to show that this is true.

    Generally
    [tex]P\{x_1 < X \leq x_2\}=F_{X}(x_2)-F_{X}(x_1)\quad(2)[/tex]

    and

    [tex]P\{X = x_2\}=F_{X}(x_2)-F_{X}(x_2^{-})\quad (3)[/tex]
    the latter of which is easy to prove.
    I've been trying to rewrite (1) in terms of (2) & (3) but have had no success so far.
    Any ideas would be most welcomed :smile:
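    A minimal numerical check of (1) for the dice example, sketched in Python (assuming scipy is available; for an integer-valued [tex]X[/tex], [tex]F_{X}(x_1^{-})=F_{X}(x_1-1)[/tex]):

    [code]
    # Check of (1) with a fair die: P{x1 <= X <= x2} = F(x2) - F(x1^-).
    from scipy.stats import randint

    die = randint(1, 7)                                # X uniform on {1, ..., 6}
    x1, x2 = 3, 5                                      # arbitrary choice

    lhs = sum(die.pmf(k) for k in range(x1, x2 + 1))   # P{3 <= X <= 5} = 3/6
    rhs = die.cdf(x2) - die.cdf(x1 - 1)                # F(5) - F(2)    = 3/6
    print(lhs, rhs)                                    # both 0.5
    [/code]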
     
  3. Jul 27, 2008 #2

    HallsofIvy

    What definition of "cumulative distribution function" do you have?
     
  4. Jul 27, 2008 #3
    [tex]F_{X}(x) = \int_{-\infty}^{x}f_{X}(\mu) d\mu [/tex]

    The left limit in (1) is there so that the same formula holds whether we have a pdf or a pmf. With a pmf we would have a sum rather than an integral. In the discrete case it matters whether the lower bound of the probability is 'less than or equal to' or just 'less than', but in the continuous case (assuming, of course, that our cdf is differentiable everywhere) it doesn't matter, since [tex]F_{X}(x_0^{-})=F_{X}(x_0)[/tex].
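    To illustrate that jump/no-jump distinction numerically (a sketch, assuming scipy; the standard normal and a fair die are arbitrary choices):

    [code]
    # F(x) - F(x^-) is zero for a continuous X but positive at an atom of a
    # discrete X; the left limit is approximated with a small epsilon.
    from scipy.stats import norm, randint

    eps = 1e-9
    die = randint(1, 7)                          # discrete uniform on {1, ..., 6}

    print(norm.cdf(1.0) - norm.cdf(1.0 - eps))   # ~0: continuous cdf has no jump
    print(die.cdf(3) - die.cdf(3 - eps))         # 1/6: the jump is P{X = 3}, eq. (3)
    [/code]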
     
    Last edited: Jul 27, 2008
  5. Jul 27, 2008 #4

    HallsofIvy

    Okay, so from that definition,
    [tex]P(x_1< X\le x_2)= \int_{x_1}^{x_2}f_X(\mu)d\mu[/tex]
    [tex]= \int_{-\infty}^{x_2}f_X(\mu)d\mu- \int_{-\infty}^{x_1} f_X(\mu)d\mu[/tex]
    [tex]= F(x_2)- F(x_1)[/tex]
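    That chain of equalities is easy to verify numerically, e.g. with the standard normal (a sketch assuming scipy; the bounds are arbitrary):

    [code]
    # Compare the integral of the pdf over (x1, x2] with F(x2) - F(x1).
    from scipy.integrate import quad
    from scipy.stats import norm

    x1, x2 = -0.5, 1.5
    integral, _ = quad(norm.pdf, x1, x2)           # numerical integration of the pdf
    print(integral, norm.cdf(x2) - norm.cdf(x1))   # agree to ~1e-10
    [/code]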
     
  6. Jul 27, 2008 #5
    The only thing, though, is that we have not included the lower boundary [tex]x_1[/tex] in the probability, but we have included it in the integral. How does that work, especially in the discrete case?

    I know that the cdf is right-continuous, and that when we include the lower bound we evaluate the cdf at the next discrete point below [tex]x_1[/tex], call it [tex]x_0[/tex].

    That is [tex]P\{x_1 \leq X \leq x_2\}=F_{X}(x_2)-F_{X}(x_0)[/tex]

    where: [tex]F_{X}(x_0)=F_{X}(x_1^{-})=\lim_{x\rightarrow x_{1}^{-}}F_{X}(x)[/tex]
     
  7. Jul 27, 2008 #6

    HallsofIvy

    In the continuous case, it doesn't matter: the probability of any single point is 0:
    [tex]P(x_1< X\le x_2)= P(x_1\le X\le x_2)[/tex]

    In the discrete case, the two probabilities differ by the mass at [tex]x_1[/tex]:
    [tex]P(x_1< X\le x_2)= P(x_1\le X\le x_2)- P(X=x_1)[/tex]
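    Putting the pieces together for the dice example (a sketch, assuming scipy; adding [tex]P(X=x_1)[/tex] to (2) recovers (1)):

    [code]
    # Fair die: verify P{x1 < X <= x2} + P{X = x1} = F(x2) - F(x1^-).
    from scipy.stats import randint

    die = randint(1, 7)
    x1, x2 = 3, 5

    open_lower = die.cdf(x2) - die.cdf(x1)       # P{x1 <  X <= x2} = 2/6, eq. (2)
    closed     = open_lower + die.pmf(x1)        # P{x1 <= X <= x2} = 3/6
    left_limit = die.cdf(x2) - die.cdf(x1 - 1)   # F(x2) - F(x1^-)  = 3/6
    print(closed, left_limit)                    # equal, confirming (1)
    [/code]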
     
  8. Jul 28, 2008 #7
    Cool thanks! :smile:

    That last bit is exactly what I need.
     