
Proving a normal distribution

  1. Aug 22, 2010 #1
    Z=(-1/sqrt(n)) * sum from k=1 to n of [1+log(1-Fk)]

    Fk is a cumulative distribution function which is continuous and strictly increasing.

    Show that as n->infinity, Z converges in distribution to a normal distribution with mean 0 and variance 1.


    From the Taylor series, log(1-x) = -sum from m=1 to infinity of (x^m)/m, but I don't see how this can help at the moment.
    I've been looking for anything on the summation of c.d.f.s but haven't found anything, so I think I'm unaware of a few theorems which are essential to solving this. Any help appreciated. I've been working on this for hours with no success.
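
    (Not from the thread: a quick Monte Carlo sanity check of the claim. The family of cdfs used here - Fk exponential with rate k - is an arbitrary assumption chosen purely for illustration, and this is only a numerical sketch, not a proof.)

    import numpy as np

    # Illustrative sanity check; the cdfs Fk below are an assumption made up for this
    # sketch, not part of the problem. Take Xk ~ Exponential(rate k), so
    # Fk(x) = 1 - exp(-k*x) is continuous and strictly increasing, then form
    # Z = (-1/sqrt(n)) * sum_k [1 + log(1 - Fk(Xk))] and look at its sample mean/variance.
    rng = np.random.default_rng(0)
    n, trials = 1000, 5000
    rates = np.arange(1, n + 1)                               # rate of the k-th cdf
    x = rng.exponential(scale=1.0 / rates, size=(trials, n))  # Xk ~ Exp(rate k)
    F = 1.0 - np.exp(-rates * x)                              # Fk(Xk)
    Z = -(1.0 / np.sqrt(n)) * (1.0 + np.log(1.0 - F)).sum(axis=1)
    print(Z.mean(), Z.var())                                  # should come out near 0 and 1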
  3. Aug 22, 2010 #2


    Homework Helper

    can you explain what you mean by Fk, and how it depends on k?

    I understand it's a cumulative distribution function, continuous & strictly increasing...

    however I may be missing something, as I can't see where the randomness is coming into Z - otherwise I'd be thinking something along the lines of the central limit theorem...
    Last edited: Aug 22, 2010
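
    (For reference, not in the original post: the classical iid form of the central limit theorem alluded to here is: if Y1, Y2, ... are independent and identically distributed with mean mu and variance sigma^2, then
    (1/(sigma*sqrt(n))) * sum from k=1 to n of (Yk - mu) converges in distribution to N(0,1) as n -> infinity.)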
  4. Aug 22, 2010 #3
    Sorry, didn't make that clear. The Xk are a sequence of independent random variables, and Fk is the cumulative distribution function associated with Xk.

    So Fk is a sequence of cumulative distribution functions taking on values from 0 to 1. Can we simply look at it like this, and hence assume it is equivalent to a sequence of independent random variables with values in [0, 1]?
  5. Aug 22, 2010 #4
    Some identities that are useful:

    Expected value of X = integral from 0 to infinity of [1 - F(t)] dt, where F(t) is the cdf (valid for a nonnegative random variable X)
    I need to obtain exp(-x^2/2) from the sum and the log. Integrating x gives x^2/2, and taking the anti-log gives the exponential.

    Not sure how all this fits together though
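
    (A side note, not from the thread: for a nonnegative random variable X, the first identity follows by swapping expectation and integration:
    E[X] = E[ integral from 0 to X of dt ] = integral from 0 to infinity of P(X > t) dt = integral from 0 to infinity of [1 - F(t)] dt.)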
  6. Aug 23, 2010 #5
    After looking at this more, it seems the moment generating function approach is the way to go.
    By obtaining the mgf and finding the mean and variance to be 0 and 1, we prove it is a normal distribution.
    Does anyone know if this is acceptable?
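
    (For comparison, not from the thread: the mgf of the N(0,1) target is M(t) = E[exp(tZ)] = exp(t^2/2), so an mgf argument would need to show that the mgf of Z converges to exp(t^2/2) as n -> infinity, rather than only checking that the mean and variance come out as 0 and 1.)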
  7. Aug 23, 2010 #6


    Homework Helper

    i still don't understand your explanation of Fk

    is there only a single cdf F(x)?

    so does every Xk have the same cdf, F(x) = P(Xk <= x)?

    if so, then the map between Xk and Fk(Xk) is one-to-one and monotonic, and the random variable Fk(Xk) becomes a uniform random variable between 0 and 1
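
    (A one-line sketch of that last step, assuming as stated that Fk is continuous and strictly increasing so that the inverse Fk^(-1) exists: for 0 < u < 1,
    P(Fk(Xk) <= u) = P(Xk <= Fk^(-1)(u)) = Fk(Fk^(-1)(u)) = u,
    which is exactly the cdf of a uniform random variable on [0, 1].)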
  8. Aug 23, 2010 #7
    No, there is a cdf for each Xk.

    Fk(xk) for k=1 to n
    F1(x1), F2(x2) etc

    Sorry, it's the lack of LaTeX that makes it hard to show subscripts
  9. Aug 23, 2010 #8
    I looked at taking the mgf, thinking that if I can show the mean to be 0 and the variance to be 1 from this approach, I can hence prove normality

    g(t) = E[exp(tZ)] = E[ exp( (-t/sqrt(n)) * sum from k=1 to n of [1 + log(1 - Fk(Xk))] ) ]

    but this approach goes nowhere
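
    (One way the mgf could be set up - a sketch, not the thread's own working: write Ek = -log(1 - Fk(Xk)), so that Z = (1/sqrt(n)) * sum from k=1 to n of (Ek - 1). By independence,
    g(t) = E[exp(tZ)] = product from k=1 to n of E[ exp( (t/sqrt(n)) * (Ek - 1) ) ],
    which reduces the problem to n one-dimensional expectations once the distribution of each Ek is known.)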
  10. Aug 23, 2010 #9


    Homework Helper

    even if they're different, you should be able to show any Fk(Xk) represents a uniform random variable on [0,1], by the definition of the cdf - have you tried using that property?
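
    (Filling in that hint - a sketch, not from the thread: if Uk = Fk(Xk) is uniform on [0, 1], then for t >= 0,
    P(-log(1 - Uk) > t) = P(Uk > 1 - exp(-t)) = exp(-t),
    so Ek = -log(1 - Uk) is Exponential(1), with mean 1 and variance 1. Then Z = (1/sqrt(n)) * sum from k=1 to n of (Ek - 1) is a normalized sum of iid mean-1, variance-1 random variables, and the classical CLT gives convergence to N(0,1); equivalently, using E[exp(s*Ek)] = 1/(1 - s) for s < 1 in the factorization above,
    E[exp(tZ)] = ( exp(-t/sqrt(n)) / (1 - t/sqrt(n)) )^n -> exp(t^2/2) for each fixed t as n -> infinity.)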
  11. Aug 23, 2010 #10
    As n tends to infinity, the cumulative distribution function Fk(Xk) tends to F(X)

    I think this is ok. I don't see how to show this tends to a normal distribution though.

    I think I'm making this much harder than it is. I don't see anything close in any textbook or on the web!