maverick280857
Hi
I have a question regarding i.i.d. random variables. Suppose [itex]X_1,X_2,\ldots[/itex] is a sequence of independent and identically distributed random variables with probability density function [itex]f_{X}(x)[/itex], mean [itex]\mu[/itex], and variance [itex]\sigma^2 < \infty[/itex].
Define
[tex]Y_{n} = \frac{1}{n}\sum_{i=1}^{n}X_{i}[/tex]
Without knowing the form of [itex]f_{X}[/itex], how does one prove that [itex]var(Y_{n}) = \sigma^2/n[/itex]?
I suppose this is a standard theorem/result, but any hints/ideas to prove this would be appreciated.
Thanks.
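A quick Monte Carlo sanity check of the claimed identity is sketched below. The normal distribution, the parameter values, and the sample sizes are all arbitrary choices for illustration; the result [itex]var(Y_{n}) = \sigma^2/n[/itex] should hold for any distribution with finite variance.

```python
import random
import statistics

# Sanity check (illustrative, not a proof): draw many independent
# realizations of Y_n = (1/n) * sum(X_i) for i.i.d. X_i, then compare
# the empirical variance of Y_n with the claimed sigma^2 / n.
random.seed(0)
mu, sigma = 3.0, 2.0   # arbitrary example parameters
n = 25                 # number of X_i averaged in each Y_n
trials = 200_000       # number of independent realizations of Y_n

# Here X_i ~ Normal(mu, sigma); any finite-variance distribution works.
y = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
     for _ in range(trials)]

empirical = statistics.pvariance(y)   # empirical var(Y_n)
theoretical = sigma ** 2 / n          # claimed value sigma^2 / n

print(f"empirical var(Y_n):   {empirical:.4f}")
print(f"theoretical sigma^2/n: {theoretical:.4f}")
```

With these parameters both numbers come out close to [itex]\sigma^2/n = 4/25 = 0.16[/itex], consistent with the identity being asked about.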