maverick280857
Hi
I have a question regarding i.i.d. random variables. Suppose X_1,X_2,\ldots is a sequence of independent and identically distributed random variables with probability density function f_{X}(x), mean \mu, and variance \sigma^2 < \infty.
Define
Y_{n} = \frac{1}{n}\sum_{i=1}^{n}X_{i}
Without knowing the form of f_{X}, how does one prove that var(Y_{n}) = \sigma^2/n?
I suppose this is a standard theorem/result, but any hints/ideas to prove this would be appreciated.
Thanks.
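
For what it's worth, here is a minimal sketch of the standard argument. It uses only the scaling property var(aZ) = a^2\,var(Z) and the fact that the variance of a sum of independent random variables is the sum of their variances (so the cross-covariance terms vanish); no knowledge of the form of f_{X} is needed:

\operatorname{var}(Y_{n}) = \operatorname{var}\!\left(\frac{1}{n}\sum_{i=1}^{n}X_{i}\right) = \frac{1}{n^{2}}\,\operatorname{var}\!\left(\sum_{i=1}^{n}X_{i}\right) = \frac{1}{n^{2}}\sum_{i=1}^{n}\operatorname{var}(X_{i}) = \frac{1}{n^{2}}\cdot n\sigma^{2} = \frac{\sigma^{2}}{n}

The third equality is where independence (in fact, pairwise uncorrelatedness suffices) enters; the other steps hold for any random variables with finite variance.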