The limit of a random variable is not defined

Summary
The discussion centers on proving that the limit of a sum of independent, identically distributed random variables taking the values -1 and +1 does not exist almost surely. The approach involves the Central Limit Theorem, which shows that the sum converges in distribution to a normal random variable with mean 0 and variance proportional to the number of terms. Participants explore the relationship between convergence in probability and almost sure convergence, noting that almost sure convergence is the stronger condition. The conversation also touches on the implications of random walk properties, particularly that a random walk returns to the origin infinitely often with probability 1, suggesting that the probability of diverging to infinity is zero. The conclusion emphasizes the need for a rigorous proof that the set of paths diverging to infinity has measure zero.
Mike.B
Let ##X_i## be i.i.d. random variables taking the values -1 and +1 with probability 1/2 each. How can one prove that ##\lim_{n\rightarrow\infty}{\sum_{i=1}^{n}{X_i}}## does not exist (not even as an infinite limit) almost surely?
My work:
I used a Cauchy-sequence argument to prove the sum does not converge to a real number, but I do not know how to prove it does not diverge to infinity. Can someone give hints?
 
Denote ##\sum_{k=1}^j X_k## by ##S_j##.

If the sum ##S_j## converges in probability to infinity, then
$$\forall M>0\ \forall \epsilon>0:\ \exists N>0 \textrm{ such that } j\geq N\Rightarrow Pr(S_j>M)>1-\epsilon\ \ \ \ \ (1)$$

For this not to be the case, we negate the previous line to obtain:

$$\exists M>0\ \exists \epsilon>0:\ \forall N>0\ \exists j\geq N \textrm{ such that } Pr(S_j>M)\leq 1-\epsilon\ \ \ \ \ (2)$$

That's what has to be proved. One way to do that (not necessarily the quickest) is to use the central limit theorem. Note that ##S_j=2B_j-j## where ##B_j## is binomial with parameters ##j,0.5##, which has mean ##\frac{j}{2}## and variance ##\frac{j}{4}##. The Central Limit Theorem tells us that ##\frac{B_j-j/2}{\sqrt{j/4}}## converges in distribution to a standard normal, so for large ##j## the distribution of ##S_j## is approximately that of a RV ##Z_j## with distribution ##N(0,j)##.

It's easy enough to prove that the sequence of RVs ##(Z_j)_{j=1}^\infty## satisfies (2). You then just need to put that together with the approximation of ##S_j## by ##Z_j## to get the required result.
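As a quick numerical sanity check (my own illustration, not part of the argument; the function name estimate_tail, the trial count, and M = 10 are all arbitrary choices), the tail probability ##Pr(S_j>M)## creeps up toward 1/2 but stays below it, which is exactly what (2) requires once we take ##\epsilon<1/2##:

```python
import random

def estimate_tail(j, M, trials=2000):
    """Monte Carlo estimate of Pr(S_j > M) for the +/-1 random walk."""
    hits = 0
    for _ in range(trials):
        s = sum(random.choice((-1, 1)) for _ in range(j))
        if s > M:
            hits += 1
    return hits / trials

M = 10
for j in (100, 400, 1600, 6400):
    # roughly 0.16, 0.31, 0.40, 0.45: rising toward, but bounded by, 0.5
    print(j, estimate_tail(j, M))
```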
 
andrewkirk said:
Denote ##\sum_{j=1}^n X_j## by ##S_j##

You wrote ##\sum_{j=1}^n X_j## by ##S_j##. Should that be ##S_n##?
 
Make that ##S_n##. I'll correct it above.
 
What is the relation between almost sure convergence and convergence in probability? How can it be used here?
 
That's correct. Almost sure convergence is a stronger type of convergence than convergence in probability.
 
andrewkirk said:
That's correct. Almost sure convergence is a stronger type of convergence than convergence in probability.
I'm having difficulty combining the final results. Is there an inequality involved?
 
I've got it for (2): since ##Z_j## has distribution ##N(0,j)##, ##Pr(Z_j>M)<1/2## for every ##M>0##, so (2) holds with any ##\epsilon<1/2##.
 
Hmm. On reflection, making the connection from the convergence in distribution to a normal, to a conclusion that the set of points ##\omega\in\Omega## for which ##\lim_{j\to\infty}S_j(\omega)=\infty## has zero measure is not as straightforward as I initially thought. I will think more about it when I get a decent slab of free time.
 
  • #10
Isn't this a random walk based at, say, 0 on the real line? I don't know how to describe convergence for a random walk.
 
  • #11
WWGD said:
Isn't this a random walk based at, say, 0 on the real line? I don't know how to describe convergence for a random walk.
Yes. That's how I've been thinking about it.

The formal statement of the problem, as I understand it, is a request to prove that $$Pr(\lim_{j\to\infty} S_j=\infty)=0$$

where we interpret ##\lim_{j\to\infty}S_j=\infty## to mean that ##\forall M\in\mathbb{N}\ \exists n\in\mathbb{N}\ ## such that ##j>n\Rightarrow S_j>M##. All the instances of ##S_j## in that statement should really be written ##S_j(\omega)## where ##\omega## is an element of the sample space ##\Omega## of infinite sequences, in order to clarify that the statements are about specific sequences ##(S_j(\omega))_{j\in\mathbb{N}}## rather than about the random variables ##S_j##.

If we label by ##A## the set of all sequences/walks ##S_j## that have that property, which we call 'diverging to infinity', then we are trying to prove that ##Pr(A)=0##.

I'm starting to think that my Normal Approximation suggestion won't help though.

Another possible avenue of attack would be if we could show that ##A## is countable. Then we would know it must have measure zero, since each individual sequence in ##\Omega## has probability zero. We can also think about the sequences as binary expansions of real numbers in [0,1], and I'm wondering if any known results about the measure of subsets of that interval might help.
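As an illustration of why ##Pr(A)=0## is plausible (my own sketch, not part of the thread's argument; the helper name returns_to_origin and the path lengths are arbitrary), simulated walks keep coming back to the origin rather than running off to infinity:

```python
import random

def returns_to_origin(n):
    """Simulate one +/-1 walk of n steps and count its visits to 0."""
    s, visits = 0, 0
    for _ in range(n):
        s += random.choice((-1, 1))
        if s == 0:
            visits += 1
    return visits

# Typically a few hundred returns per path of 100000 steps.
print([returns_to_origin(100_000) for _ in range(20)])
```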
 
  • #14
Mike.B said:
You mean ##\mathbb{P}(V=\infty)=1##?
Yes, that's what I meant, with ##V## the number of returns to the origin. Could that work, as ##(V=\infty) \cap A =\emptyset## (I think)?
 
  • #15
Samy_A said:
Yes, that's what I meant. Could that work, as ##(V=\infty) \cap A =\emptyset## (I think)?

Define ##B=\{\omega\in \Omega:V(\omega)=\infty\}##? And show ##A\cap B=\emptyset##?
 
  • #16
Mike.B said:
Define ##B=\{\omega\in \Omega:V(\omega)=\infty\}##? And show ##A\cap B=\emptyset##?
If ##\omega\in A## then ##\forall M\in\mathbb{N}\ \exists n\in\mathbb{N}\ ## such that ##j>n\Rightarrow S_j(\omega)>M##.
Take ##M>0##. Doesn't that imply that ##\omega## can return to 0 only within the first ##n## steps, hence at most ##n## times, so ##\omega \not\in B##?
 
  • #17
I think you are right!
 
  • #18
Isn't it also possible to use a recursion in this case, whose solution gives a closed form for the general probability?
 
  • #19
WWGD said:
Isn't it also possible to use a recursion in this case, whose solution gives a closed form for the general probability?
What do you mean by "recursion"?
 
  • #21
WWGD said:
Isn't it also possible to use a recursion in this case, whose solution gives a closed form for the general probability?

Can you tell me more?
 
  • #22
Mike.B said:
Can you tell me more?
Sure, give me some time and I will be back a bit later with more on it.
 
  • #23
Consider the recursion ##P(n,x) = [P(n-1,x-1)+P(n-1,x+1)]/2##, with ##P(0,0)=1## and ##P(0,x)=0## for ##x \neq 0##, where ##n## is the step number in the walk, ##P(n,x)## is the probability of being at ##x## on the real line at step ##n##, and ##x## ranges over the integers.
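For concreteness, here is a minimal Python implementation of that recursion (my own sketch; the function names are mine), checked against the standard closed form ##P(n,x)=\binom{n}{(n+x)/2}/2^n##, valid when ##n+x## is even and ##|x|\leq n##:

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def P(n, x):
    """Probability the walk is at x after n steps, via the recursion."""
    if n == 0:
        return 1.0 if x == 0 else 0.0
    return 0.5 * (P(n - 1, x - 1) + P(n - 1, x + 1))

def P_closed(n, x):
    """Closed form: choose (n+x)/2 up-steps out of n."""
    if (n + x) % 2 or abs(x) > n:
        return 0.0
    return comb(n, (n + x) // 2) / 2 ** n

for n, x in [(10, 0), (10, 4), (11, 3), (20, -6)]:
    print(n, x, P(n, x), P_closed(n, x))  # the last two columns agree
```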
 
  • #24
I tried the link given in post 12 but it was full of broken LaTeX and hence unusable on the computer I'm at now.
However I found the following http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter12.pdf
It proves a number of results, and in Exercise 12.1 on page 6 it proves that ##w_*=1## where
$$w_*\equiv \lim_{n\to\infty} w_n$$
and ##w_n## is the probability that the random walk has returned to the origin no later than step ##n##.

We can then proceed as follows
[Edit: I discovered subsequently that this argument is flawed. See post 26 below for corrected version]

##A\subseteq B## where ##B## is the set of paths that do not return to the origin infinitely often.

##B\subseteq J_n## where ##J_n## is the set of paths that do not return to the origin in the first ##n## steps. Note that ##Pr(J_n)=1-w_n##.

Hence
$$Pr(A)\leq Pr(B)\leq \inf\{Pr(J_n)\ |\ n\in\mathbb{N}\}\leq \inf\{1-w_n\ |\ n\in\mathbb{N}\}=1-\sup\{w_n\ |\ n\in\mathbb{N}\}=1-w_*=0$$

The proof in the link that ##w_*=1## is far from trivial though, involving a generating function and requiring a proof of an interim result, at the bottom of p4 (attributed to Wilf), which is set as an exercise for the reader in Exercise 1. I have not done that exercise.
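Here is a small exact dynamic-programming sketch of ##w_n## (my own illustration, assuming Python; the helper name w is arbitrary): paths are absorbed at their first return to the origin, and the absorbed mass is exactly ##w_n##. The values climb slowly toward 1, consistent with ##w_*=1##:

```python
from collections import defaultdict

def w(n):
    """Exact probability of a return to the origin within the first n steps."""
    dist = defaultdict(float)  # dist[x]: Pr(at x now, no return to 0 so far)
    dist[0] = 1.0
    for _ in range(n):
        nxt = defaultdict(float)
        for x, p in dist.items():
            for step in (-1, 1):
                y = x + step
                if y != 0:  # mass reaching 0 is absorbed; it contributes to w_n
                    nxt[y] += 0.5 * p
        dist = nxt
    return 1.0 - sum(dist.values())

for n in (10, 100, 1000):
    print(n, w(n))  # 0.7539..., 0.9204..., 0.9748...: increasing toward 1
```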
 
  • #25
andrewkirk said:
##B\subseteq J_n## where ##J_n## is the set of paths that do not return to the origin in the first ##n## steps.
I don't understand this step: how do you know that ##B\subseteq J_n##?
 
  • #26
@Samy_A You're right, that's not valid. Let me fix that.
First, change the definition of ##A## slightly to be the union of the sets of paths that diverge to positive and negative infinity respectively.

##B=\bigcup_{k=0}^\infty C_k## where ##C_k## is the set of paths that are zero at time ##k## and never thereafter. Note that this is a disjoint union.
Then

$$Pr(A)\leq Pr(B)=\sum_{k=0}^\infty Pr(C_k)
=\sum_{k=0}^\infty Pr(D_k)\,Pr(C_k|D_k)$$

where ##D_k## is the set of paths that are zero at time ##k##.

By the independence of the ##X_j##s, we have that ##\forall k\geq 0:\ Pr(C_k|D_k)=Pr(C_0)##.
And if, as before, we define ##J_k## to be the set of paths that do not return to the origin in the first ##k## steps, then

$$Pr(C_0)=Pr\Big(\bigcap_{k=1}^\infty J_k\Big)
=\lim_{k\to\infty}Pr(J_k)=\lim_{n\to\infty} (1-w_n)
=1-\lim_{n\to\infty} w_n
=1-w_*
=1-1
=0
$$

(the second step uses continuity from above, since the ##J_k## are decreasing).

So $$Pr(A)\leq
\sum_{k=0}^\infty Pr(D_k)\cdot Pr(C_0)=\sum_{k=0}^\infty (Pr(D_k)\cdot 0)=0$$
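As a numerical check of the key limit (my own illustration, using the classical identity ##Pr(J_{2m})=\binom{2m}{m}/4^m##, a standard random-walk fact not proved in this thread), ##Pr(J_k)## visibly decreases to 0, which is what makes ##Pr(C_0)=0##:

```python
from math import comb

# Pr(no return to the origin in the first 2m steps) = C(2m, m) / 4^m
for m in (5, 50, 500, 5000):
    print(2 * m, comb(2 * m, m) / 4 ** m)  # ~0.246, ~0.0796, ~0.0252, ~0.0080
```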
 
