I am puzzled by this question and hope someone can help me here.
Prove/disprove:
If X_n is a sequence of independent random variables such that E(|X_n|) < 5 for each n (E is the expectation value), and the strong law of large numbers does not apply to this sequence, then the same law does not apply to the sequence Y_n = \max(X_n, X_{n+1}).
Any tips on how to solve this question? I think the assertion is correct, but I am not sure how to show it. What I mean is: P(|\frac{\sum_{i=0}^{n}X_i}{n}-\frac{\sum_{i=0}^{n}E(X_i)}{n}|\geq\epsilon) does not converge to zero as n approaches infinity, so I need to show that the same also holds for Y_n, or to show that its complement does not converge to 1.
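For reference, note that the probability condition above is the weak-law criterion; the strong law in question is the stronger almost-sure statement (written here in the usual SLLN form, which I assume is the one intended):

```latex
% Strong law of large numbers: almost-sure convergence of the centered averages
P\left( \lim_{n\to\infty} \frac{1}{n}\sum_{i=0}^{n} \bigl( X_i - E(X_i) \bigr) = 0 \right) = 1
```

So "the strong law doesn't apply" means this probability is strictly less than 1, which is not the same as the displayed probability failing to converge to zero.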
So, by Chebyshev's inequality, P(|\frac{\sum_{i=0}^{n}Y_i}{n}-\frac{\sum_{i=0}^{n}E(Y_i)}{n}|<\epsilon)\geq 1-\frac{Var(\sum_{i=0}^{n}Y_i)}{n^2\epsilon^2}.
Now Var(Y_n) = E(Y_n^2) - E(Y_n)^2; here I need to use that E(|X_n|) < 5, but I am not sure how.
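One place the moment bound can enter is through a pointwise inequality for the max; a sketch (this only controls the first moment of Y_n, not its variance):

```latex
% Since max(X_n, X_{n+1}) equals either X_n or X_{n+1},
% its absolute value is at most the larger of the two absolute values:
|Y_n| = |\max(X_n, X_{n+1})| \leq \max(|X_n|, |X_{n+1}|) \leq |X_n| + |X_{n+1}|
% Taking expectations:
E(|Y_n|) \leq E(|X_n|) + E(|X_{n+1}|) < 5 + 5 = 10
```

Note, however, that a bound on E(|X_n|) alone does not bound E(Y_n^2) or Var(Y_n), which may still be infinite, so the Chebyshev approach above may need a different argument.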