Why Does Yₙ Converge to Zero for q < 1/2 in a Random Walk?


Discussion Overview

The discussion centers on the convergence of the sequence \( Y_n = e^{S_n} \) in a random walk defined by independent, identically distributed random variables \( X_j \) with probabilities \( P(X_j = 1) = q \) and \( P(X_j = -1) = 1 - q \). Participants explore why \( Y_n \) converges to zero for \( q < \frac{1}{2} \) and the implications for other values of \( q \), including \( q = \frac{1}{2} \) and \( q = 1 \). The conversation includes technical reasoning, mathematical arguments, and challenges regarding the assumptions made in the analysis.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant argues, using the law of large numbers, that \( \log Y_n \) converges almost surely to \( -\infty \) when \( q < \frac{1}{2} \), so that \( Y_n \) converges to 0 almost surely.
  • The same post offers a martingale argument: when \( Y_n \) is a non-negative martingale it converges almost surely, and since the ratio \( Y_{n+1}/Y_n = e^{X_{n+1}} \) takes only the values \( e \) and \( 1/e \) and does not converge, the almost-sure limit can only be zero.
  • The case \( q = \frac{1}{2} \) is also raised: there \( S_n \) has mean zero, the law of large numbers only gives \( S_n/n \to 0 \), and the symmetric walk is recurrent, so \( Y_n \) has no almost-sure limit.
  • Participants express doubts about the correctness of the initial analysis, particularly regarding the case when \( q = 1 \), where \( Y_n \) diverges to infinity.
  • There is a mention of a specific case where \( q = \frac{1}{e+1} \) and its relevance to the martingale property of \( Y_n \).

Areas of Agreement / Disagreement

Participants do not reach a consensus, as there are multiple competing views regarding the behavior of \( Y_n \) for different values of \( q \). Some participants agree on the implications for \( q < \frac{1}{2} \), while others raise concerns about the analysis for \( q = 1 \) and the overall correctness of the claims made.

Contextual Notes

Participants highlight potential limitations in the analysis, particularly regarding the assumptions made about the values of \( q \) and the implications for convergence. The discussion remains open-ended with unresolved mathematical steps and conditions.

WMDhamnekar
Let ##X_1, X_2, \dots## be independent, identically distributed random variables with

##P\{X_j = 1\} = q, \quad P\{X_j = -1\} = 1 - q.##

Let ##S_0 = 0## and, for ##n \ge 1##, ##S_n = X_1 + X_2 + \dots + X_n##. Let ##Y_n = e^{S_n}## and let ##Y_{\infty} = \lim\limits_{n\to\infty} Y_n##. Explain why ##Y_{\infty} = 0##. (Hint: there are at least two ways to show this. One is to consider ##\log{Y_n}## and use the law of large numbers. Another is to note that with probability one ##Y_{n+1}/Y_n## does not converge.)

My answer:

When ##q < \frac{1}{2}##, we can use the law of large numbers to show that ##\log Y_n## converges almost surely to ##-\infty##, which implies that ##Y_n## converges almost surely to 0.
First, note that ##Y_n = e^{S_n}## is strictly positive, so its logarithm ##\log Y_n## is well-defined. Using the definition of ##Y_n##, we have:
##\log{Y_n}= S_n =\displaystyle\sum_{j=1}^n X_j##

By the law of large numbers, we know that the sample mean ##\frac{1}{n}\sum_{j=1}^{n}X_j## converges almost surely to the expected value ##E[X_j] = q - (1-q) = 2q-1##. Therefore, we have:
##\frac{1}{n}\displaystyle\sum_{j=1}^n X_j \to (2q-1)## almost surely

Using this result, we can describe the almost-sure behavior of ##\log{Y_n}## as follows:
$$\begin{align*}
\log Y_n &= \sum_{j=1}^{n} X_j \\
&= n\cdot\frac{1}{n}\sum_{j=1}^{n} X_j,
\end{align*}$$
and the average in the last expression tends almost surely to ##2q-1##. Hence, almost surely,
$$\log Y_n \to \begin{cases} +\infty, & \text{if } q > \tfrac{1}{2}, \\ -\infty, & \text{if } q < \tfrac{1}{2}. \end{cases}$$
When ##q < \frac{1}{2}##, we have ##\log{Y_n} \to -\infty## almost surely, which implies that ##Y_{\infty} = 0## almost surely.
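As a quick numerical sanity check (not part of the proof; the values ##q = 0.3##, ##n = 10000## and the random seed below are arbitrary illustrative choices), a short simulation shows ##\frac{1}{n}\log Y_n## settling near ##2q-1## and ##Y_n## collapsing towards zero:

```python
import numpy as np

# Illustrative parameters (not from the problem): q = 0.3 < 1/2, n = 10_000 steps.
rng = np.random.default_rng(0)
q, n = 0.3, 10_000

# X_j = +1 with probability q and -1 with probability 1 - q.
steps = rng.choice([1, -1], size=n, p=[q, 1 - q])
S = np.cumsum(steps)      # S_n = X_1 + ... + X_n
log_Y_n = S[-1]           # log Y_n = S_n

print("(1/n) log Y_n :", log_Y_n / n)             # close to 2q - 1 = -0.4
print("Y_n = exp(S_n):", np.exp(float(log_Y_n)))  # so small it underflows to 0.0
```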
Alternatively, we can use the martingale structure of ##Y_n##. Since ##X_{n+1}## is independent of ##X_1,\ldots,X_n##, we have ##E[Y_{n+1} \mid X_1,\ldots,X_n] = E[e^{S_n + X_{n+1}} \mid X_1,\ldots,X_n] = e^{S_n}\,E[e^{X_{n+1}}]##, so ##Y_n## is a martingale exactly when ##E[e^{X_{n+1}}] = qe + (1-q)e^{-1} = 1##. In that case ##Y_n## is a non-negative martingale, and by the martingale convergence theorem it converges almost surely to some limit ##Y_{\infty}##. If ##Y_{\infty}## were strictly positive on an event of positive probability, then on that event ##Y_{n+1}/Y_n## would converge to 1. However, with probability one ##Y_{n+1}/Y_n## does not converge at all, so we must have ##Y_{\infty} = 0## almost surely.
To see why ##Y_{n+1}/Y_n## does not converge almost surely, note that:
$$\begin{align*}
\frac{Y_{n+1}}{Y_n} &= \frac{e^{S_{n+1}}}{e^{S_n}} \\
&= e^{S_{n+1}-S_n} \\
&= e^{X_{n+1}}.
\end{align*}$$
Since the ##X_{n+1}## are independent and each takes the values 1 and -1 with positive probability, the events ##\{X_{n+1} = 1\}## and ##\{X_{n+1} = -1\}## each occur infinitely often with probability one (second Borel–Cantelli lemma). Hence ##Y_{n+1}/Y_n## equals ##e## infinitely often and ##1/e## infinitely often, so it does not converge; in particular it cannot converge to 1, and therefore ##Y_{\infty} = 0## almost surely.

If ##q = 1/2##, what would be your answer?

If ##q = 1/2##, then ##X_j## is symmetric with ##E[X_j] = 0##, so ##S_n## is a mean-zero random walk. The law of large numbers gives ##\frac{1}{n} S_n \to 0## almost surely, but this does not imply that ##S_n## itself converges. In fact, the symmetric simple random walk is recurrent, so with probability one ##\limsup_n S_n = +\infty## and ##\liminf_n S_n = -\infty##. Consequently ##\log Y_n = S_n## has no limit, ##\limsup_n Y_n = +\infty## and ##\liminf_n Y_n = 0##, and ##Y_n## does not converge when ##q = 1/2##.
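The same kind of illustrative simulation (again with an arbitrary seed and run length) shows what happens in the symmetric case: ##S_n/n## is small, as the law of large numbers predicts, yet ##S_n## itself keeps swinging through large positive and negative values, so ##Y_n = e^{S_n}## has no limit:

```python
import numpy as np

# Illustrative run of the symmetric case q = 1/2 (seed and length are arbitrary).
rng = np.random.default_rng(1)
n = 100_000
steps = rng.choice([1, -1], size=n)   # uniform choice of +1/-1, i.e. q = 1/2
S = np.cumsum(steps)

print("S_n / n          :", S[-1] / n)          # close to 0
print("max S_k, min S_k :", S.max(), S.min())   # both large in magnitude
# Y_k = exp(S_k) therefore swings between astronomically large and astronomically
# small values along the same path, so it cannot converge.
```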

Is the above answer correct?
 
Any reason it shouldn't be ?

Where's my magnifier ?

##\ ##
 
BvU said:
Any reason it shouldn't be ?

Where's my magnifier ?

##\ ##
Please open the images in new tabs and zoom them to 125% or larger. No need for a magnifier. :smile:
 
WMDhamnekar said:
Please open the images in new tabs and zoom them to 125% or larger. No need for a magnifier. :smile:
Did just that and full is my screen with just the first few lines. But already doubt creeps in:
Any limitations on ##q## ? If not, for ##q=1## disaster looms ...

Or did my magnifier disrupt something ?

##\ ##
 
BvU said:
Did just that and full is my screen with just the first few lines. But already doubt creeps in:
Any limitations on ##q## ? If not, for ##q=1## disaster looms ...

Or did my magnifier disrupt something ?

##\ ##
I have written the answer in LaTeX form now. :smile:
 
WMDhamnekar said:
I have written the answer in LaTeX form now. :smile:
But the question is lost ... and I think there's something very wrong with it ...

 
BvU said:
But the question is lost ... and I think there's something very wrong with it ...

I have now written the question in LaTeX form as well.
 
You need to address the issue that @BvU brought up in post #4. It's hard to prove something that is incorrect. When ##q=1##, ##S_n = n## and ##Y_n = e^n \rightarrow \infty##.
 
FactChecker said:
You need to address the issue that @BvU brought up in post #4. It's hard to prove something that is incorrect. When ##q=1##, ##S_n = n## and ##Y_n = e^n \rightarrow \infty##.
Sorry, I forgot to provide one additional piece of information: for ##q = \frac{1}{e+1}##, ##Y_n## is a martingale.
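For reference, ##q = \frac{1}{e+1}## is exactly the value for which the one-step factor ##e^{X_{n+1}}## has mean one, which is what the martingale property of ##Y_n## requires:
$$
E\left[e^{X_1}\right] = q\,e + (1-q)\,e^{-1} = 1
\;\Longleftrightarrow\;
q\left(e - e^{-1}\right) = 1 - e^{-1}
\;\Longleftrightarrow\;
q = \frac{e-1}{e^2-1} = \frac{1}{e+1}.
$$
Since ##\frac{1}{e+1} \approx 0.27 < \frac{1}{2}##, this case is also covered by the law-of-large-numbers argument: ##E[X_1] = 2q-1 < 0##, so ##\log Y_n \to -\infty## and ##Y_{\infty} = 0## almost surely.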
 
