
Shaping probability distribution function

  1. Sep 20, 2012 #1
    1. The problem statement, all variables and given/known data

    Incoming signal has a normal distribution, with xmin equal to -sigma and xmax equal to +sigma. What is the governing equation of the nonlinearity through which the signal has to be passed in order to make its pdf uniform?

    2. Relevant equations

    http://en.wikipedia.org/wiki/Normal_distribution

    3. The attempt at a solution

    I have already found out that the signal needs to be passed through erf(x/sqrt(2)), which is closely related to the CDF of the normal distribution. The problem is that I cannot find a mathematical proof.
     
  3. Sep 21, 2012 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    It's very easy. Say you have a continuous random variable X with a strictly increasing cumulative distribution F(x) on an x-interval [a,b] (possibly a = -∞ and b = +∞). The probability density of X is f(x) = (d/dx) F(x). Now look at Y = F(X); that is, for each observation x of X we let the observation of Y be y = F(x). What is the distribution of Y? For x < X < x + dx the probability is f(x)*dx, so if y <--> x and y + dy <--> x + dx, we have
    P{y < Y < y + dy} = f(x)*dx. If g(y) is the probability density of Y, we therefore have g(y)*dy = f(x)*dx. But dy/dx = (d/dx) F(x) = f(x), so dy = f(x) dx, hence we must have g(y) = 1; that is, Y is uniform on (0,1).

    All this is very standard in Monte-Carlo simulation, where it is used to generate samples from non-uniform distributions: we generate Y uniform on (0,1), then obtain our sample of X from x = F-1(y) (at least, in those cases where the latter function is known and not too hard to compute).
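    The argument above is easy to check numerically. A minimal sketch in Python, where SciPy's norm.cdf plays the role of F (the sample size and seed are illustrative, not from the thread):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)   # observations of X ~ N(0, 1)
y = norm.cdf(x)                # Y = F(X): the probability integral transform

# If Y really is uniform on (0, 1), every decile bin should
# catch roughly 10% of the samples.
counts, _ = np.histogram(y, bins=10, range=(0.0, 1.0))
print(counts / len(y))
```

    The same check works for any continuous distribution with a strictly increasing CDF; only the norm.cdf call changes.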

    RGV
     
  4. Sep 21, 2012 #3
    RGV, thanks for your feedback. I believe I have already proved it using a nonlinearly mapped substitute variable "a" in place of the linear variable "x". The two variables can be unambiguously mapped onto one another, warping the "x" axis so that the pdf ultimately becomes uniform.

    I dare to say I understand your reasoning here, I will just need a little bit of time to absorb it.

    The integral of (pdf):

    [itex]
    f(x) = \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}(\frac{x}{\sigma})^2}
    [/itex]

    is (cdf):

    [itex]
    F(x) = \frac{1}{2}\left(1 + erf\left(\frac{x}{\sqrt{2}\sigma}\right)\right)
    [/itex]

    The bare [itex]\frac{1}{2}erf(\frac{x}{\sqrt{2}\sigma})[/itex] term is only the antiderivative; the added [itex]\frac{1}{2}[/itex] fixes the constant of integration so that F(-∞) = 0. The erf term is an odd function, zero at the zero mean. I assume the output of the mapping may still need to be scaled to match the desired uniform range. This is just a detail.

    Quite interesting info about Monte-Carlo simulations. It would make sense to transform a uniform distribution to resemble other kinds of distributions. I wonder whether MATLAB does it the same way.
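    The Monte-Carlo direction Ray describes (uniform samples pushed through F⁻¹) can be sketched in a few lines. This says nothing about MATLAB's internals; it only assumes SciPy's norm.ppf as the inverse normal CDF:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
u = rng.uniform(size=100_000)   # Y uniform on (0, 1)
x = norm.ppf(u)                 # x = F^{-1}(y): inverse-CDF sampling

# The resulting samples should look standard normal.
print(x.mean(), x.std())
```

    In practice this inverse-CDF route is used whenever F⁻¹ is cheap to evaluate; otherwise methods like Box-Muller or rejection sampling are preferred.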
     
  5. Sep 21, 2012 #4

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    I'm not sure how Matlab does it, but a common way to generate Gaussian (normal) random numbers from uniform ones is the following trick: if [itex]u_1[/itex] and [itex]u_2[/itex] are independent random variables, uniformly distributed over (0,1], then
    [tex]n_1 = \sqrt{-2 \log(u_1)} \cos(2\pi u_2)[/tex]
    and
    [tex]n_2 = \sqrt{-2 \log(u_1)} \sin(2\pi u_2)[/tex]
    are independent Gaussian random variables with zero mean and unit variance. This is the so-called Box-Muller transformation:

    http://en.wikipedia.org/wiki/Box–Muller_transform
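    A direct sketch of the transform (log is the natural logarithm; the seed, sample size, and tiny lower bound that keeps log(u1) finite are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
u1 = rng.uniform(low=1e-12, high=1.0, size=100_000)  # avoid log(0)
u2 = rng.uniform(size=100_000)

# Box-Muller: two uniforms in, two independent N(0, 1) samples out.
r = np.sqrt(-2.0 * np.log(u1))
n1 = r * np.cos(2.0 * np.pi * u2)
n2 = r * np.sin(2.0 * np.pi * u2)

# Both outputs should have roughly zero mean, unit variance,
# and negligible correlation with each other.
print(n1.mean(), n1.std(), np.corrcoef(n1, n2)[0, 1])
```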
     
  6. Sep 24, 2012 #5
    jbunniii,

    Thanks for the info about the transform. That's why I love this forum. One question brings together many ideas and perspectives.
     