Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
The discussion revolves around the convergence properties of a sequence of normalized random variables defined as \(Y_i = X_i/E[X_i]\). Participants explore whether this sequence converges to a random variable with mean 1 and under what conditions this might occur, considering various scenarios and examples.
Participants do not reach a consensus on the conditions required for convergence of the sequence \(Y_i\). Several competing views remain, along with uncertainty about how the sequence behaves in different scenarios.
The original question comes with little context or background, which limits discussion of what kind of convergence is intended. The thread also highlights the need to clarify how the \(X_i\) behave, especially in extreme cases such as when they grow without bound.
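As a concrete illustration of why the answer is not automatically yes, the following is a standard textbook-style construction, not taken from the thread: each \(X_n\) has mean 1, yet the normalized sequence converges to a limit with mean 0. Take
\[
P(X_n = n) = \frac{1}{n}, \qquad P(X_n = 0) = 1 - \frac{1}{n},
\qquad\text{so } E[X_n] = 1 \text{ and } Y_n = \frac{X_n}{E[X_n]} = X_n.
\]
Then for every \(0 < \varepsilon < 1\),
\[
P(|Y_n| > \varepsilon) = P(X_n = n) = \frac{1}{n} \longrightarrow 0,
\]
so \(Y_n \to 0\) in probability even though \(E[Y_n] = 1\) for every \(n\); the limit has mean 0, not 1.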
batman said: Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
batman said: So what should be the restriction on the X_i's? In particular, can the Y_i's still converge if the X_i's go to infinity?
batman said: There is no context and/or further background. It was just a question that came to my mind while studying convergence of random variables. I just thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.
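A minimal numerical sketch of the same illustrative construction given above (the specific distribution of X_n is an assumption for illustration, not something specified in the thread): it checks empirically that E[Y_n] stays at 1 for every n while the distribution of Y_n collapses onto 0.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_Y(n, size):
    """Draw `size` independent copies of Y_n = X_n / E[X_n] for the
    illustrative choice P(X_n = n) = 1/n, P(X_n = 0) = 1 - 1/n,
    which gives E[X_n] = 1 and hence Y_n = X_n."""
    hits = rng.random(size) < 1.0 / n   # event {X_n = n}, probability 1/n
    return np.where(hits, float(n), 0.0)

for n in (10, 100, 1_000, 10_000):
    y = sample_Y(n, 1_000_000)
    # E[Y_n] = 1 for every n, but P(Y_n = 0) = 1 - 1/n -> 1,
    # so Y_n -> 0 in probability and the limit has mean 0, not 1.
    print(f"n={n:>6}  empirical E[Y_n]={y.mean():.3f}  P(Y_n = 0)={(y == 0).mean():.4f}")
```

The empirical mean is noisy for large n (it is carried entirely by rare values of size n) but is centred at 1, while P(Y_n = 0) climbs toward 1, matching the convergence in probability to 0.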