Sequence of normalized random variables

SUMMARY

The discussion centers on the convergence of a sequence of normalized random variables defined as \(Y_i = X_i/E[X_i]\). While each \(Y_i\) has mean 1, convergence to a random variable with mean 1 is not guaranteed without additional restrictions on the sequence \(X_i\). Participants note that the behavior of \(X_i\) as \(i \to \infty\) can prevent convergence, so specific conditions on \(X_i\) are needed to ensure that \(Y_i\) converges. Examples include a sequence where \(X_n\) takes a very large value with very small probability, and an alternating sequence of Normal and Uniform variables, both illustrating how convergence to a mean-1 limit can fail.

PREREQUISITES
  • Understanding of random variables and their properties
  • Knowledge of expected value and its calculation
  • Familiarity with convergence concepts in probability theory
  • Basic statistics, including distributions such as Normal and Uniform
NEXT STEPS
  • Research conditions for convergence of random variables in probability theory
  • Study the implications of normalization on random variable sequences
  • Explore the behavior of sequences of random variables under different distributions
  • Learn about the Central Limit Theorem and its relation to convergence
USEFUL FOR

Mathematicians, statisticians, and students studying probability theory, particularly those interested in the convergence of random variables and their applications in statistical analysis.

batman3
Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
 
batman said:
Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?

There is something missing from your statement of the problem: the \(Y_i\)s all have mean 1, but there is no reason why they should converge without some further restriction on the \(X\)s.

CB
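To see why normalization alone forces each \(Y_i\) to have mean 1 (but says nothing about convergence), here is a minimal Python sketch, not from the thread; the distributions and parameters below are arbitrary choices for illustration:

```python
import random

random.seed(42)

def mean_Y(sample_X, EX, trials=200_000):
    # Empirical mean of Y = X / E[X]; by linearity of expectation,
    # E[Y] = E[X] / E[X] = 1 no matter what distribution X has.
    return sum(sample_X() / EX for _ in range(trials)) / trials

# Three unrelated distributions -- normalization alone gives E[Y] = 1.
print(round(mean_Y(lambda: random.expovariate(3.0), 1 / 3), 2))  # ~1.0
print(round(mean_Y(lambda: random.uniform(0, 10), 5.0), 2))      # ~1.0
print(round(mean_Y(lambda: random.gauss(7, 2), 7.0), 2))         # ~1.0
```

All three sample means sit near 1, even though the three distributions have nothing to do with each other, which is exactly why a sequence mixing them need not converge.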
 
So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?
 
batman said:
So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?

It would be better if you provided the context and/or further background for your question.

CB
 
There is no context or further background. It was just a question that came to my mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.
 
What if the X's blow up, like

$P(X_n=2^n)=2^{-n}$ and $P(X_n=0)=1-2^{-n}$
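For this example, \(E[X_n] = 2^n \cdot 2^{-n} = 1\), so \(Y_n = X_n\). Since \(\sum_n P(X_n \neq 0) = \sum_n 2^{-n} = 1 < \infty\), Borel-Cantelli gives \(Y_n \to 0\) almost surely, yet every \(Y_n\) has mean 1, so the limit has mean 0, not 1. A small Python check of these exact values (my own sketch, not from the thread):

```python
# X_n = 2^n with probability 2^{-n}, else 0, so E[X_n] = 1 and Y_n = X_n.

def mean_Y(n):
    # E[Y_n] = 2^n * 2^{-n} + 0 * (1 - 2^{-n}) = 1 for every n
    return (2.0 ** n) * (2.0 ** -n)

def prob_Y_zero(n):
    # P(Y_n = 0) = 1 - 2^{-n} -> 1, so Y_n -> 0 in probability
    return 1.0 - 2.0 ** -n

# sum_n P(Y_n != 0) = sum_n 2^{-n} = 1 < infinity, so by Borel-Cantelli
# Y_n -> 0 almost surely -- even though E[Y_n] = 1 for all n.
tail = sum(2.0 ** -n for n in range(1, 60))
print(mean_Y(30), prob_Y_zero(30), round(tail, 6))
```

So the sequence does converge here, but the mean is not preserved in the limit: convergence of means needs something extra, such as uniform integrability.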
 
batman said:
There is no context or further background. It was just a question that came to my mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.

Suppose \(X_{2k}\sim N(1,1)\) and \(X_{2k-1} \sim U(0,2),\ k=1, 2, \ldots\) Now does the sequence \(\{X_i\}\) converge (in whatever sense)?
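The alternating example above can be checked numerically: both distributions have mean 1 (so \(Y_i = X_i\)), but their variances differ (1 for \(N(1,1)\) versus \(1/3\) for \(U(0,2)\)), so the even and odd subsequences have different distributional limits and the full sequence cannot converge in distribution. A minimal Python sketch of that variance gap (assumed illustration, not from the thread):

```python
import random

random.seed(0)

def sample_X(i):
    # Even indices: N(1,1); odd indices: U(0,2). Both have mean 1.
    return random.gauss(1, 1) if i % 2 == 0 else random.uniform(0, 2)

def var(xs):
    # Plain population variance of a sample.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

N = 100_000
even = [sample_X(2) for _ in range(N)]  # ~ N(1,1), variance 1
odd = [sample_X(1) for _ in range(N)]   # ~ U(0,2), variance 1/3

print(round(var(even), 2), round(var(odd), 2))  # roughly 1.0 and 0.33
```

Both subsequences have sample means near 1, but the persistent variance gap means there is no single limiting random variable.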
 
