
Derivation of Gaussian Distribution

  1. Jul 1, 2008 #1
    1. The problem statement, all variables and given/known data
    Derive the equation for the Gaussian distribution.

    2. Relevant equations

    The probability density function for the Gaussian distribution:
    [tex]f(x) = \frac{1}{\sigma \sqrt{2\pi} } e^{ -\frac{(x-\mu)^2}{2\sigma ^2} }[/tex]

    3. The attempt at a solution

    It is my understanding that the Gaussian distribution can be derived from the binomial distribution by means of the central limit theorem. The binomial distribution is given by the following:

    [tex]f(x) = \frac{n!p^{x}(1-p)^{n-x}}{x!(n-x)!}[/tex]

    If this is true, I don't know how to go about doing it.
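    As a quick sanity check on the formula above (a sketch using only Python's standard library; the values of n and p are arbitrary choices), the pmf should sum to 1 over k = 0..n:

    ```python
    import math

    def binom_pmf(n, k, p):
        # f(k) = n! * p^k * (1-p)^(n-k) / (k! * (n-k)!)
        return math.comb(n, k) * p**k * (1 - p)**(n - k)

    n, p = 20, 0.3
    total = sum(binom_pmf(n, k, p) for k in range(n + 1))
    print(total)  # a probability mass function should sum to 1
    ```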
     
  3. Jul 1, 2008 #2

    Dick

    Science Advisor
    Homework Helper

    It helps to know the relation between the 'x' in the Gaussian and the 'x' in the binomial distribution. Just to keep things clear, put k instead of x in the binomial distribution. Then k = x*sqrt(n*p*(1-p)) + np (just x times the standard deviation plus the mean). Now your binomial distribution is in terms of n and x. Let n go to infinity. I did this by taking the log of the binomial distribution and using Stirling's approximation to replace the factorials. I hope this is just a curiosity question. I did this once on a 5hr plane flight to pass the time - and I think it took me most of the flight to get everything to come out. It wasn't much fun. Maybe there's a cleverer way. But it does work.
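    A numerical sketch of this claim (in Python; the values of n and p, and the evaluation point x = 1, are arbitrary choices of mine): substitute k = np + x*sqrt(n*p*(1-p)), and the binomial pmf scaled by the standard deviation should approach the standard normal density at x. Working in log space, as suggested above, also avoids overflowing the factorials:

    ```python
    import math

    def log_binom_pmf(n, k, p):
        # ln of C(n,k) * p^k * (1-p)^(n-k), via lgamma to avoid huge factorials
        return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                + k * math.log(p) + (n - k) * math.log(1 - p))

    n, p = 10_000, 0.5
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))

    x = 1.0                    # one standard deviation above the mean
    k = round(mu + x * sigma)  # k = np + x*sqrt(n*p*(1-p))

    # sigma * P(K = k) should be close to the standard normal density at x
    approx = sigma * math.exp(log_binom_pmf(n, k, p))
    exact = math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)
    print(approx, exact)
    ```

    For n = 10,000 the two values already agree to several decimal places.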
     
  4. Jul 2, 2008 #3
    okay cool.

    What does the quantity 'k' represent? This is not actually a HW problem, I'm just trying to make a little more sense of the world.
     
  5. Jul 2, 2008 #4

    Dick


    k is the number of trials out of n that 'succeeded', i.e. the k in the combinatorial part C(n,k). What you originally had as 'x' in the binomial distribution. I just suggested you replace it because having two potentially different 'x' around didn't seem like a good idea.
     
  6. Jul 2, 2008 #5
    That's an impressive derivation. How did you manage to take the limit as n goes to infinity? I have trouble with limits of complicated functions like that.
     
  7. Jul 2, 2008 #6

    Dick


    It breaks up into a pile of simple functions which have to be rearranged in such a way as to get a finite limit. No individual part is all that bad, it's just complicated. Wanna try it?
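    One of those "simple functions" is Stirling's approximation for the factorials. A small check (Python; the n values are arbitrary) of how quickly ln n! ≈ n ln n − n + ½ ln(2πn) converges:

    ```python
    import math

    def stirling_log_factorial(n):
        # Stirling: ln n! ~ n*ln(n) - n + (1/2)*ln(2*pi*n)
        return n * math.log(n) - n + 0.5 * math.log(2 * math.pi * n)

    for n in (10, 100, 1000):
        exact = math.lgamma(n + 1)  # exact ln n!
        err = exact - stirling_log_factorial(n)
        print(n, err)               # error shrinks like 1/(12n)
    ```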
     
  8. Jul 2, 2008 #7
    I'm not sure. It looks like a very tedious process with lots of opportunities for typographical errors to occur.
     
  9. Jul 2, 2008 #8

    Dick


    Exactly.
     
  10. Jun 29, 2010 #9
    I'm not sure if this will help for intuition behind deriving the Gaussian Distribution, but here's the reason why it follows from the binomial distribution:

    Think of a collection of infinitely many particles distributed throughout space. The Gaussian distribution can be thought of as arising when we randomly choose n particles out of the infinitely many. What we are interested in (for now) is the expected total thermal energy of the n randomly chosen particles, or at least the probability distribution of a collection of n particles having a particular total thermal energy.
    After doing this a few times, we find the average thermal energy for a collection of n particles to be E, which we can now take to be constant (think of this stage as something like an induction hypothesis: we use it later to determine how the energy is distributed).

    We can now look at how that total energy E is distributed within the system of n particles:

    create a "histogram" of possible thermal energies for the n particles, with bin width "e". The histogram sorts the particles within the system of n particles: each "bin" [tex]b_i[/tex] will contain [tex]n_i[/tex] particles, so that [tex]\sum_i n_i = n[/tex].

    You can imagine that when you give a certain total energy to the system of n particles, we can (with a few assumptions) create a finite number of energy distributions within the n particles that allow the system to have energy E.

    Count unique configurations for every allowable energy value (k*e) to get the binomial distribution as a function of energy level. (for the "0" state, all the particles are in the lowest bin: this is 1 configuration; for the "1" state, one particle is in the 1st energy level while the rest are in the lowest bin: this gives n configurations; etc.).

    Here are variables to consider:

    The average energy density of particles around the n particles chosen (this gives you the amount of energy you have to work with within the system of n particles).

    The (classical) fact that you can choose an energy partition e(E) so that there are as many energy "bins" as there are particles (you then take the limit as n goes to infinity and e goes to 0; with the preceding assumption, e = E/n, where E is the "total energy" of the system given by the average energy density).
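    The configuration counts mentioned for the "0" and "1" states (1 and n respectively) match a stars-and-bars count of the ways to distribute k indistinguishable energy quanta among n distinguishable particles. That reading is my own interpretation, not necessarily the one intended above; here is a brute-force check of it in Python, with n and k kept small so full enumeration is feasible:

    ```python
    import math
    from itertools import product

    def count_configs(n, k):
        # Brute force: count assignments of quanta to n distinguishable
        # particles whose total number of quanta is exactly k.
        return sum(1 for c in product(range(k + 1), repeat=n) if sum(c) == k)

    n = 4
    for k in range(4):
        brute = count_configs(n, k)
        formula = math.comb(n + k - 1, k)  # stars and bars
        print(k, brute, formula)
    ```

    The k = 0 and k = 1 rows reproduce the 1 and n configurations described above.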
     
    Last edited: Jun 29, 2010