
Query regarding Independent and Identically Distributed random variables

  1. Mar 3, 2008 #1
    Hi

    I have a question regarding i.i.d. random variables. Suppose [itex]X_1,X_2,\ldots[/itex] is a sequence of independent and identically distributed random variables with probability density function [itex]f_{X}(x)[/itex], mean [itex]\mu[/itex], and variance [itex]\sigma^2 < \infty[/itex].

    Define

    [tex]Y_{n} = \frac{1}{n}\sum_{i=1}^{n}X_{i}[/tex]

    Without knowing the form of [itex]f_{X}[/itex], how does one prove that [itex]var(Y_{n}) = \sigma^2/n[/itex]?

    I suppose this is a standard theorem/result, but any hints/ideas to prove this would be appreciated.

    Thanks.
     
  3. Mar 3, 2008 #2

    mathman

    User Avatar
    Science Advisor
    Gold Member

    [tex]Var(Y_{n}) = E(Y_{n}^2) - [E(Y_{n})]^2[/tex]

    Plug the series for [itex]Y_{n}[/itex] into this and expand, using the facts that E(sum) = sum of E's and, for independent random variables, E(product) = product of E's; it will all work out. Note that all you need is independence and that the mean and variance are the same for every [itex]X_{i}[/itex]. The distributions could have been different.
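
    Written out, the expansion sketched above goes like this (using only independence and the common mean and variance):

    [tex]E(Y_{n}) = \frac{1}{n}\sum_{i=1}^{n}E(X_{i}) = \mu[/tex]

    [tex]E(Y_{n}^2) = \frac{1}{n^2}\left[\sum_{i=1}^{n}E(X_{i}^2) + \sum_{i \neq j}E(X_{i})E(X_{j})\right] = \frac{1}{n^2}\left[n(\sigma^2 + \mu^2) + n(n-1)\mu^2\right] = \frac{\sigma^2}{n} + \mu^2[/tex]

    [tex]Var(Y_{n}) = E(Y_{n}^2) - [E(Y_{n})]^2 = \frac{\sigma^2}{n} + \mu^2 - \mu^2 = \frac{\sigma^2}{n}[/tex]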
     
  4. Mar 3, 2008 #3
    Thanks mathman :smile:
     
  5. Mar 5, 2008 #4
    Can also be done as follows:

    If [itex]T = \sum_{i=1}^{n}a_{i}X_{i}[/itex] with the [itex]X_{i}[/itex] independent, then [itex]Var(T) = \sum_{i=1}^{n}a_{i}^2Var(X_{i})[/itex]. Taking [itex]a_{i} = 1/n[/itex] gives

    [tex]Var(Y_{n}) = \sum_{i=1}^{n}\frac{1}{n^2}Var(X_{i}) = \frac{\sigma^2}{n}[/tex]

    Edit: this needs only the independence of the random variables (plus a common variance).
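
    As a quick numerical sanity check of the result (purely illustrative, not part of the proof — the function name `var_of_mean` and the choice of Exponential(1) samples, which have [itex]\sigma^2 = 1[/itex], are just for this example):

    ```python
    import random

    def var_of_mean(n, trials=20000, seed=0):
        """Estimate Var(Y_n) where Y_n is the mean of n i.i.d. Exponential(1) draws.

        Since sigma^2 = 1 for Exponential(1), the theory predicts Var(Y_n) ~ 1/n.
        """
        rng = random.Random(seed)
        samples = []
        for _ in range(trials):
            # Y_n = (1/n) * sum of n i.i.d. draws
            y = sum(rng.expovariate(1.0) for _ in range(n)) / n
            samples.append(y)
        m = sum(samples) / trials
        # unbiased sample variance of the Y_n values
        return sum((y - m) ** 2 for y in samples) / (trials - 1)
    ```

    For n = 10 the estimate comes out close to 0.1, and for n = 50 close to 0.02, matching [itex]\sigma^2/n[/itex].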
     
    Last edited: Mar 5, 2008