What does the continuity of a distribution function mean?

Eclair_de_XII

Homework Statement


"Show that if ##P(X=c)>0##, for some ##c\in \mathbb{R}##, then the distribution function ##F_X(x)=P(X\leq x)## is discontinuous at ##c##. Is the converse true?"

Homework Equations


Continuity of a distribution function: ##\lim_{\epsilon \rightarrow 0}F_X(x+\epsilon)=F_X(x)##

The Attempt at a Solution


I'm thinking that the graph of ##F_X(x)## jumps up at ##x=c##. So far, I have:

##\lim_{\epsilon \rightarrow 0}F_X(c+ \epsilon)\\
=\lim_{\epsilon \rightarrow 0} P(X\leq c + \epsilon)\\
=\lim_{\epsilon \rightarrow 0}P(X<c+\epsilon)+\lim_{\epsilon \rightarrow 0}P(X=c+\epsilon)\\
=P(X\leq c)+P(X=c)>P(X\leq c)##

I'm not sure how to justify the step made in the third line, stating that: ##\lim_{\epsilon \rightarrow 0}P(X<c+\epsilon)=P(X\leq c)##.
 
Let's start with the relevant equations, as they are most important.

Eclair_de_XII said:

Homework Equations


Continuity of a distribution function: ##\lim_{\epsilon \rightarrow 0}F_X(x+\epsilon)=F_X(x)##
I don't think this is correct. By construction, the convention is for distribution functions (CDFs) to be right continuous, so
##\lim_{\epsilon \rightarrow 0^+}F_X(x+\epsilon)=F_X(x)##

but you may have (countably many) discontinuities from the left. I'd start with a discrete random variable, like say a Bernoulli, and work out an example or two.

- - - -
sometimes it isn't worth it to nitpick left vs right vs actual limits, but I think that is the point of the entire exercise...?

In fact the difference between right and left limits immediately says your attempt at a solution cannot be correct -- i.e. ##P(X=c)>0## tells you that
##\lim_{\epsilon \rightarrow 0}## doesn't exist around ##c##. It may be that you are constraining to positive numbers, i.e. ##\epsilon \gt 0## but (a) the post doesn't say this and (b) it's quite misleading to call this a general limit as it is 'merely' a limit from the right and hence you're only doing (the wrong) half of the problem. (More of a nitpick: I think it'd be better to use ##\delta## here for a typical ##\delta, \epsilon## approach to continuity. )

Long story short: CDFs are nice (and monotone non-decreasing) -- just consider the difference of taking a limit from the left vs the right, and in particular what a jump discontinuity looks like
 
StoneTemplePython said:
I'd start with a discrete random variable, like say a Bernoulli, and work out an example or two.

I worked something out with:

##f_X(x)=\begin{cases}
1-p,\text{ if } x=0\\
p, \text{ if } x=1\\
0, \text{ if otherwise} \end{cases}##

and got:

##F_X(x)=\begin{cases} 0,\text{ if } x<0\\
1-p,\text{ if } 0\leq x < 1\\
1, \text{ if } x\geq 1\end{cases}##

I drew a picture...

[Attachment: step graph of the Bernoulli CDF ##F_X##, jumping by ##1-p## at ##x=0## and by ##p## at ##x=1##]


Basically, if I derived this CDF correctly, then ##\lim_{\delta \rightarrow 0^+}F(c+\delta)= \lim_{\delta \rightarrow 0^-}F(c+\delta)+P(X=c)##. So ##\lim_{\delta \rightarrow 0^+}F(c+\delta)=\lim_{\delta \rightarrow 0^-}F(c+\delta)+P(X=c)>\lim_{\delta \rightarrow 0^-}F(c+\delta)##, and so they're not equal; continuity fails to hold at ##x=c##, as a result. I'm not sure if I'm allowed to state that a jump discontinuity of size ##P(X=c)## is there, though. I know it's there for a Bernoulli distribution, but I'm worried that I may have to show that a gap of this size actually exists for general discontinuous distribution functions.
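If it helps to sanity-check this, here is a minimal Python sketch (my own, not part of the coursework; ##p = 0.3## is an arbitrary choice) that evaluates this Bernoulli CDF just to the left and right of the jump at ##c = 1##:

```python
# Minimal sketch (p = 0.3 chosen for concreteness): evaluate the Bernoulli
# CDF just to the left and right of the jump at x = 1 and compare.

def bernoulli_cdf(x, p=0.3):
    """F_X(x) = P(X <= x) for X ~ Bernoulli(p)."""
    if x < 0:
        return 0.0
    elif x < 1:
        return 1.0 - p
    else:
        return 1.0

eps = 1e-9
left = bernoulli_cdf(1 - eps)    # approximates the left limit at c = 1
right = bernoulli_cdf(1 + eps)   # approximates the right limit at c = 1
print(left, right)               # the right limit exceeds the left by p
```

The right limit agrees with ##F_X(1)## (right continuity), while the left limit falls short by exactly ##P(X=1)=p##, which is the jump discontinuity in question.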
 

Eclair_de_XII said:
I'm not sure how to justify the step made in the third line, stating that: ##\lim_{\epsilon \rightarrow 0}P(X<c+\epsilon)=P(X\leq c)##.
In terms of ##F_X##, how would you evaluate ##P(c-\epsilon < X \leq c + \epsilon)## for small ##\epsilon > 0?## What happens if you take ##\epsilon \downarrow 0?##
 
Eclair_de_XII said:
I'm not sure if I'm allowed to state that a jump discontinuity of size ##P(X=c)## is there, though. I know it's there for a Bernoulli distribution, but I'm worried that I may have to show that a gap of this size actually exists for general discontinuous distribution functions.

I liked the picture here. Bernoullis are instructive. You can now apply this for ##P(X=c)=\alpha>0##. To do it directly: you can re-run the same argument, showing that a continuity argument breaks on more general CDFs -- i.e. you cannot find a satisfactory ##\delta \gt 0## at the point ##c## when someone selects an ##\epsilon\gt 0## that is sufficiently small (how small? relate it to ##\alpha## in some way).
 
Eclair_de_XII said:
I'm not sure how to justify the step made in the third line, stating that: ##\lim_{\epsilon \rightarrow 0}P(X<c+\epsilon)=P(X\leq c)##.

Basically, you do not need to justify such statements; they are immediate consequences of the "axioms of probability". I don't know what you were taught, but most semi-advanced treatments talk about "continuity of probability". This means that if ##\{A_n\}## is a monotone sequence of events (that is, either the ##A_n## increase --- ##A_n \subset A_{n+1}## --- or they decrease --- ##A_n \supset A_{n+1}##) then $$\lim_{n \to \infty} P(A_n) = P\left( \lim_{n \to \infty} A_n \right).$$
Here, ##\lim_n A_n = \cap_n A_n## if the ##A_n## decrease, or ##\lim_n A_n = \cup_n A_n## if the ##A_n## increase.

All axiomatic treatments of probability contain at least the notion of finite additivity:
$$P\left( \bigcup_{k=1}^n A_k \right) = \sum_{k=1}^n P(A_k) \hspace{4em}(1)$$ for any finite disjoint sequence of events ##\{A_k\}.## Some axiomatic presentations contain the axiom of countable additivity, which states that (1) holds as well for an infinite disjoint sequence, with ##n = \infty## in the union and in the sum.

Other treatments have the axiom of finite additivity plus the axiom of continuity of probability. However, how you do it does not matter, because of the following.

Theorem. The following are equivalent:
(1) Countable additivity.
(2) Finite additivity plus continuity of probability.

So, for example: ##\{X=c\} = \lim_{n \to \infty} \{c - 1/n < X \leq c\},## so continuity of probability guarantees $$P(X=c) = \lim_{n \to \infty} \left[ F_X(c) - F_X(c - 1/n) \right] = F_X(c)-F_X(c-).$$
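To see that last identity numerically, here is a small sketch with an invented mixed distribution (my own example, not from the thread): ##X## puts mass ##0.3## at ##c = 0.5## and spreads the remaining ##0.7## uniformly on ##(0,1)##, so ##F_X(c) - F_X(c - 1/n)## should tend to ##P(X=c) = 0.3##.

```python
# Hypothetical example: X = 0.5 with probability 0.3, else Uniform(0, 1).
def mixed_cdf(x):
    """F_X(x) = P(X <= x): 0.7 * Uniform(0,1) CDF plus an atom of 0.3 at 0.5."""
    if x < 0:
        return 0.0
    elif x < 0.5:
        return 0.7 * x
    elif x < 1:
        return 0.7 * x + 0.3
    else:
        return 1.0

c = 0.5
for n in (10, 1000, 100000):
    gap = mixed_cdf(c) - mixed_cdf(c - 1.0 / n)
    print(n, gap)  # gap = 0.3 + 0.7/n, shrinking toward P(X = c) = 0.3
```

The printed gaps decrease toward ##0.3##, exactly the size of the atom at ##c##, matching ##F_X(c) - F_X(c-) = P(X=c)##.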
 