Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
The discussion centers on the convergence of a sequence of normalized random variables defined by \(Y_i = X_i/E[X_i]\) (assuming each \(E[X_i]\) is finite and nonzero). While each \(Y_i\) has mean 1, the sequence need not converge to a random variable with mean 1 without additional restrictions on the \(X_i\): the behavior of the \(X_i\) as \(i \to \infty\) can affect convergence, so specific conditions on the \(X_i\) are needed to ensure that the \(Y_i\) converge. Examples raised in the thread include sequences where \(X_n\) follows a distribution that may lead to divergence; see the illustration below.
PREREQUISITES
Mathematicians, statisticians, and students studying probability theory, particularly those interested in the convergence of random variables and their applications in statistical analysis.
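As an illustration of why normalization alone does not force a mean-1 limit, here is a standard counterexample of the kind the summary alludes to (supplied here for illustration; it is not a verbatim quote from the thread). Take

\[
P(X_n = n) = \frac{1}{n}, \qquad P(X_n = 0) = 1 - \frac{1}{n},
\]

so that \(E[X_n] = 1\) and \(Y_n = X_n\). Then \(P(Y_n \neq 0) = 1/n \to 0\), so \(Y_n \to 0\) in probability, yet \(E[Y_n] = 1\) for every \(n\): the limit has mean 0, not 1, because the mass escapes along a rare, increasingly large value. A quick Monte Carlo sketch of this (assuming NumPy; the variable names are ours):

    import numpy as np

    rng = np.random.default_rng(0)

    # P(X_n = n) = 1/n, P(X_n = 0) = 1 - 1/n, so E[X_n] = 1 and Y_n = X_n.
    # P(Y_n != 0) -> 0 (convergence in probability to 0), yet E[Y_n] = 1 for all n.
    N = 1_000_000  # Monte Carlo samples per value of n
    for n in (10, 100, 1_000, 10_000):
        y = np.where(rng.random(N) < 1.0 / n, n, 0.0)
        print(f"n={n:>6}  P(Y_n != 0) ~ {(y != 0).mean():.4f}   E[Y_n] ~ {y.mean():.3f}")

The printed probability shrinks toward 0 while the sample mean stays near 1, exactly the tension between convergence in probability and convergence of means.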
batman said: Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
batman said: So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?
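A simple illustration (ours, not from the thread) that divergence of the \(X_i\) by itself is no obstruction: let \(Z\) be, say, an exponential random variable with mean 1 and set \(X_n = nZ\). Then \(X_n \to \infty\) almost surely and \(E[X_n] = n \to \infty\), yet

\[
Y_n = \frac{X_n}{E[X_n]} = Z \quad \text{for every } n,
\]

so the \(Y_n\) converge (trivially, almost surely) to a random variable with mean 1 even though the \(X_n\) blow up.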
batman said: There is no further context or background. It was just a question that came to mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.
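For the "general condition" asked about, the standard sufficient one (a well-known result, offered here as a pointer rather than as an answer given in the thread) is uniform integrability: if \(Y_n \to Y\) in probability and

\[
\lim_{K \to \infty} \sup_{n} E\!\left[ |Y_n| \, \mathbf{1}_{\{|Y_n| > K\}} \right] = 0,
\]

then \(E[Y_n] \to E[Y]\) by the Vitali convergence theorem, and hence \(E[Y] = 1\) since each \(E[Y_n] = 1\). Note that convergence of the \(Y_n\) is itself a separate hypothesis; normalization guarantees neither convergence nor uniform integrability, as the counterexample above shows.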