Sequence of normalized random variables


Discussion Overview

The discussion revolves around the convergence properties of a sequence of normalized random variables defined as \(Y_i = X_i/E[X_i]\). Participants explore whether this sequence converges to a random variable with mean 1 and under what conditions this might occur, considering various scenarios and examples.

Discussion Character

  • Exploratory
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants question whether the sequence \(Y_1, Y_2, ...\) always converges to a random variable with mean 1, suggesting that additional restrictions on the \(X_i\)s may be necessary.
  • One participant asks what specific restrictions on the \(X_i\)s would allow for convergence, particularly in cases where the \(X_i\)s may approach infinity.
  • Another participant gives a concrete example, specifying the probabilities of \(X_n\) explicitly, to ask how the sequence behaves when the \(X_i\)s "blow up."
  • A participant presents a specific case involving a mixture of normal and uniform distributions for the \(X_i\)s, questioning the convergence of the sequence in this context.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the conditions required for convergence of the sequence \(Y_i\). Multiple competing views and uncertainties regarding the behavior of the sequence under different scenarios remain evident.

Contextual Notes

The original question is posed without context or background, which limits discussion of the conditions under which convergence should be considered. The thread also highlights the need to clarify the behavior of the \(X_i\)s, especially in extreme cases.

batman3
Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?
 
batman3 said:
Let X_1, X_2, ... be a sequence of random variables and define Y_i = X_i/E[X_i]. Does the sequence Y_1, Y_2, ... always converge to a random variable with mean 1?

There is something missing from your statement of the problem: the \(Y_i\)s all have mean 1, but there is no reason why they should converge without some further restriction on the \(X_i\)s.

CB
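As a minimal sketch of this point (the exponential choice below is purely illustrative, not from the thread): if the \(X_i\) are i.i.d. Exponential(1), then \(E[X_i] = 1\) and \(Y_i = X_i\), so every \(Y_i\) has mean 1, yet a non-degenerate i.i.d. sequence cannot converge almost surely or in probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice (not from the thread): X_i i.i.d. Exponential(1),
# so E[X_i] = 1 and Y_i = X_i / E[X_i] = X_i.
n = 100_000
y = rng.exponential(scale=1.0, size=n)

print("empirical mean of Y_1..Y_n:", y.mean())            # close to 1
# Gaps between consecutive terms do not shrink, so the sequence cannot be
# converging almost surely or in probability to any random variable.
print("last few |Y_{i+1} - Y_i|:", np.abs(np.diff(y[-6:])))
```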
 
So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?
 
batman3 said:
So what should the restriction on the X_i's be? In particular, can the Y_i's still converge if the X_i's go to infinity?

It would be better if you provided the context and/or further background for your question.

CB
 
There is no context and/or further background. It was just a question that came to my mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.
 
What if the X's blow up, like

$P(X_n=2^n)=2^{-n}$ and $P(X_n=0)=1-2^{-n}$
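One way to work this example through (a sketch): \(E[X_n] = 2^n \cdot 2^{-n} = 1\), so \(Y_n = X_n\). Since \(\sum_n P(Y_n \neq 0) = \sum_n 2^{-n} < \infty\), Borel–Cantelli gives \(Y_n \to 0\) almost surely, even though \(E[Y_n] = 1\) for every \(n\); the normalized sequence converges, but the limit has mean 0, not 1. A small simulation sketch (the sample sizes and indices below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# P(X_n = 2**n) = 2**-n and P(X_n = 0) = 1 - 2**-n, so E[X_n] = 1 and Y_n = X_n.
def sample_Y(n, size):
    return np.where(rng.random(size) < 2.0 ** -n, 2.0 ** n, 0.0)

# For each fixed n, E[Y_n] = 1, but the chance of seeing anything other than 0
# shrinks like 2**-n; the empirical mean becomes noisy for larger n because all
# the mass sits on one rare, huge value.
for n in (3, 6, 10):
    y = sample_Y(n, size=2_000_000)
    print(f"n={n:2d}  P(Y_n != 0) ~ {np.mean(y != 0.0):.2e}   empirical E[Y_n] ~ {y.mean():.3f}")
```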
 
batman3 said:
There is no context and/or further background. It was just a question that came to my mind while studying convergence of random variables. I thought a normalized sequence would always have mean 1 in the limit, and I was wondering whether there is a general condition under which it converges.

Suppose \(X_{2k}\sim N(1,1)\) and \(X_{2k-1} \sim U(0,2),\ k=1, 2, \ldots\) Now does the sequence \(\{X_i\}\) converge (in whatever sense ...)?
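To make this example concrete (a sketch; independence between terms is an assumption added here, the post does not state it): both \(N(1,1)\) and \(U(0,2)\) have mean 1, so \(Y_i = X_i\), and the even and odd subsequences have different distributions, so the full sequence cannot converge in distribution, let alone in probability or almost surely.

```python
import numpy as np

rng = np.random.default_rng(0)

# Both distributions have mean 1, so Y_i = X_i.  Independence between terms is
# assumed here only so the two subsequences can be sampled separately.
size = 1_000_000
even = rng.normal(loc=1.0, scale=1.0, size=size)   # X_{2k}   ~ N(1, 1)
odd = rng.uniform(low=0.0, high=2.0, size=size)    # X_{2k-1} ~ U(0, 2)

# The two subsequences put different mass outside [0, 2] (about 0.32 vs 0),
# so the sequence has no single limiting distribution.
for name, x in (("N(1,1) terms", even), ("U(0,2) terms", odd)):
    print(f"{name}: mean ~ {x.mean():.3f}, P(X outside [0,2]) ~ {np.mean((x < 0) | (x > 2)):.3f}")
```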
 
