Proof regarding the cumulative distribution function

  • Thread starter Eidos
  • #1
Eidos
Hey guys

I'd like a steer in the right direction with this problem.
I would like to show that
[tex]P\{x_1\leq X \leq x_2\}=F_{X}(x_2)-F_{X}(x_1^{-})\quad(1)[/tex]

Where:
[tex]X[/tex] is a random variable.
[tex]F_{X}(x) \equiv P\{X \leq x \} [/tex] is its cumulative distribution function.

My notes only give an example (using dice) to show that this is true.

Generally
[tex]P\{x_1 < X \leq x_2\}=F_{X}(x_2)-F_{X}(x_1)\quad(2)[/tex]

and

[tex]P\{X = x_2\}=F_{X}(x_2)-F_{X}(x_2^{-})\quad (3)[/tex]
the latter of which is easy to prove.
I've been trying to rewrite (1) in terms of (2) & (3) but have had no success so far.
Any ideas would be most welcome :smile:
 

Answers and Replies

  • #2
HallsofIvy
Science Advisor
Homework Helper
What definition of "cumulative distribution function" do you have?
 
  • #3
Eidos
[tex]F_{X}(x) = \int_{-\infty}^{x}f_{X}(\mu) d\mu [/tex]

That limit from the left in (1) is there so that the same statement holds whether we have a pdf or a pmf. With a pmf we would have a sum rather than an integral. In the discrete case it matters whether the lower bound in our probability is 'less than or equal to' or strictly 'less than', but in the continuous case (assuming, of course, that our cdf is continuous everywhere) it doesn't matter, since [tex]F_{X}(x_1^{-})=F_{X}(x_1)[/tex].
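
Concretely, in the discrete case (my own sketch, writing p_X for the pmf and x_k for the points where X has positive mass), the cdf and its left limit become sums:

[tex]F_{X}(x)=\sum_{x_k\leq x}p_{X}(x_k),\qquad F_{X}(x_1^{-})=\sum_{x_k< x_1}p_{X}(x_k)[/tex]

so including or excluding the lower endpoint changes the probability by exactly p_X(x_1).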
 
  • #4
HallsofIvy
Science Advisor
Homework Helper
Okay, so from that definition,
[tex]P(x_1< X\le x_2)= \int_{x_1}^{x_2}f_X(\mu)d\mu[/tex]
[tex]= \int_{-\infty}^{x_2}f_X(\mu)d\mu- \int_{-\infty}^{x_1} f_X(\mu)d\mu[/tex]
[tex]= F(x_2)- F(x_1)[/tex]
 
  • #5
Eidos
The only thing, though, is that we have not included the lower boundary x1 in the probability, yet it is included in the integral. How does that work, especially in the discrete case?

I know that the cdf is right-continuous, and that when we do include the lower bound we evaluate the cdf at the next discrete point below x1, which is x0.

That is [tex]P\{x_1 \leq X \leq x_2\}=F_{X}(x_2)-F_{X}(x_0)[/tex]

where [tex]F_{X}(x_0)=F_{X}(x_1^{-})=\lim_{x\rightarrow x_{1}^{-}}F_{X}(x)[/tex]
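
For example, with the fair-die example from my notes (taking x1 = 3 and x2 = 5 just as an illustration), x0 = 2 and

[tex]P\{3\leq X\leq 5\}=\tfrac{3}{6}=F_{X}(5)-F_{X}(2)=\tfrac{5}{6}-\tfrac{2}{6}[/tex]

so evaluating the cdf at x1^- amounts to evaluating it at the next discrete point below x1.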
 
  • #6
HallsofIvy
Science Advisor
Homework Helper
In the continuous case, it doesn't matter: the probability of a single data point is always 0:
[tex]P(x_1< X\le x_2)= P(x_1\le X\le x_2)[/tex]

In the discrete case, there are two different probabilities:
[tex]P(x_1< X\le x_2)= P(x_1\le X\le x_2)- P(X=x_1)[/tex]
 
  • #7
Eidos
Cool thanks! :smile:

That last bit is exactly what I need.
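
For anyone who finds this thread later, here is how the pieces fit together (just my own write-up, combining (2) and (3)): since [tex]\{X=x_1\}[/tex] and [tex]\{x_1< X\leq x_2\}[/tex] are disjoint and their union is [tex]\{x_1\leq X\leq x_2\}[/tex],

[tex]P\{x_1\leq X \leq x_2\}=P\{X=x_1\}+P\{x_1 < X \leq x_2\}[/tex]
[tex]=\left[F_{X}(x_1)-F_{X}(x_1^{-})\right]+\left[F_{X}(x_2)-F_{X}(x_1)\right]=F_{X}(x_2)-F_{X}(x_1^{-})[/tex]

And a quick numerical sanity check for a discrete case, a fair six-sided die (my own sketch; the choices x1 = 3, x2 = 5 are just illustrative, and cdf_left is a helper I define here, not a library function):

[code]
# Fair six-sided die: check P{x1 <= X <= x2} = F(x2) - F(x1^-) numerically.
from fractions import Fraction

support = [1, 2, 3, 4, 5, 6]
pmf = {k: Fraction(1, 6) for k in support}   # P{X = k}

def cdf(x):
    """F_X(x) = P{X <= x}."""
    return sum(p for k, p in pmf.items() if k <= x)

def cdf_left(x):
    """F_X(x^-) = P{X < x}, the left limit of the cdf."""
    return sum(p for k, p in pmf.items() if k < x)

x1, x2 = 3, 5
lhs = sum(p for k, p in pmf.items() if x1 <= k <= x2)  # P{x1 <= X <= x2}
rhs = cdf(x2) - cdf_left(x1)                           # F(x2) - F(x1^-)
print(lhs, rhs)  # both print 1/2
[/code]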
 
