MHB Sequence of normalized random variables

AI Thread Summary
The discussion centers on the convergence of a sequence of normalized random variables defined as Y_i = X_i/E[X_i], questioning whether this sequence always converges to a random variable with mean 1. Participants note that while each Y_i has mean 1, convergence is not guaranteed without additional restrictions on the X_i's. A specific example is given in which the X_i's blow up yet the normalized sequence converges to a limit with mean 0, showing that the mean need not survive the limit. The conversation also considers alternating normal and uniform distributions for the X_i's as a case where no limit exists at all. Overall, the thread emphasizes that further conditions are needed to ensure convergence of the Y_i sequence.
batman3
Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
 
batman3 said:
Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?

There is something missing from your statement of the problem: the \(Y_i\) all have mean 1, but there is no reason why they should converge without some further restriction on the \(X_i\).
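A one-line check, assuming each \(E[X_i]\) is finite and nonzero (the definition of \(Y_i\) tacitly requires this):

$$E[Y_i] = E\!\left[\frac{X_i}{E[X_i]}\right] = \frac{E[X_i]}{E[X_i]} = 1.$$

So every term is normalized to mean 1; the open question is only whether the sequence converges at all, and to what.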

CB
 
So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?
 
batman3 said:
So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?

It would be better if you provided the context and/or further background for your question.

CB
 
There is no further context or background; it was just a question that came to my mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.
 
What if the \(X_n\) blow up, like

$P(X_n=2^n)=2^{-n}$ and $P(X_n=0)=1-2^{-n}$?
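Here \(E[X_n] = 2^n \cdot 2^{-n} = 1\), so \(Y_n = X_n\), and since \(\sum_n P(X_n = 2^n) = \sum_n 2^{-n} < \infty\), Borel-Cantelli gives \(Y_n \to 0\) almost surely even though \(E[Y_n] = 1\) for every \(n\): the limit has mean 0, not 1. A minimal Monte Carlo sketch of this behaviour (the sampling code and sample size are illustrative, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000  # Monte Carlo samples per index n

# X_n = 2^n with probability 2^{-n}, else 0, so E[X_n] = 1 and Y_n = X_n.
for n in range(1, 11):
    p = 2.0 ** -n
    y = np.where(rng.random(N) < p, 2.0 ** n, 0.0)
    # E[Y_n] is exactly 1 for every n, but P(Y_n != 0) = 2^{-n} -> 0,
    # so Y_n -> 0 in probability (and almost surely, by Borel-Cantelli).
    print(f"n={n:2d}  sample mean={y.mean():.3f}  P(Y_n != 0)={(y > 0).mean():.5f}")
```

The sample means hover near 1 while the fraction of nonzero draws collapses toward 0; as \(n\) grows, the variance \(2^n - 1\) makes the sample mean increasingly noisy, which is exactly the heavy-tail behaviour that breaks the naive "mean 1 in the limit" intuition.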
 
batman3 said:
There is no further context or background; it was just a question that came to my mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.

Suppose \(X_{2k}\sim N(1,1)\) and \(X_{2k-1} \sim U(0,2)\), \(k=1, 2, \ldots\) Now does the sequence \(\{X_i\}\) converge (in whatever sense you like)?
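Both distributions here have mean 1, so \(Y_i = X_i\), yet the even and odd subsequences keep different laws, and a sequence whose subsequences have different limiting distributions cannot converge in distribution. A minimal numpy sketch (variable names and sample size are mine) comparing the two subsequences on the event \(\{Y < 0\}\):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # samples per subsequence

# Even-indexed terms ~ N(1, 1), odd-indexed terms ~ U(0, 2); both have
# mean 1, so Y_i = X_i, but the two subsequences disagree in law.
even = rng.normal(1.0, 1.0, N)    # X_{2k}
odd = rng.uniform(0.0, 2.0, N)    # X_{2k-1}

print("mean (even):", even.mean())          # ~1
print("mean (odd) :", odd.mean())           # ~1
print("P(Y < 0) even:", (even < 0).mean())  # ~0.159, the N(1,1) left tail
print("P(Y < 0) odd :", (odd < 0).mean())   # 0, since U(0,2) is nonnegative
```

Since \(P(Y_i < 0)\) oscillates between roughly \(0.159\) and \(0\), no limiting distribution exists, even though every term has mean 1.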
 