
Showing a Distribution is Gaussian

  1. Feb 6, 2016 #1
    1. The problem statement, all variables and given/known data

    Consider ##f = w^T x##, where ##p(w) \sim N(w|0, \Sigma)##. Show that ##p(f|x)## is Gaussian.

    3. The attempt at a solution

    So there are apparently two approaches to this problem: using either the linearity of f in terms of w, or moment generating functions. But I'm having a lot of trouble figuring out how to proceed. I can see that we can use moment generating functions to show that the sum of two independent normal random variables is also normally distributed (since the mgf of the sum can be written as the product of the individual mgfs). But I'm a bit stumped by this. Any help is appreciated.
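    To be concrete, for independent zero-mean normals ##X_1 \sim N(0, \sigma_1^2)## and ##X_2 \sim N(0, \sigma_2^2)##, the factorisation I mean is

    $$E[e^{t(X_1+X_2)}] = E[e^{tX_1}]\,E[e^{tX_2}] = e^{\frac{1}{2}\sigma_1^2 t^2}\, e^{\frac{1}{2}\sigma_2^2 t^2} = e^{\frac{1}{2}(\sigma_1^2 + \sigma_2^2) t^2},$$

    which is the mgf of ##N(0, \sigma_1^2 + \sigma_2^2)##. But I don't see how to adapt this to ##f = w^T x##.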

    Edit: we are allowed to assume that the components of ##w##, i.e. ##(w_1, \ldots, w_d)##, are independent.
     
    Last edited: Feb 6, 2016
  2. Feb 7, 2016 #2

    andrewkirk


    There's info missing. Do you intend ##w## and ##x## to be vector random variables of the same dimension?
    What do you mean by ##p(w) \sim N(w|0, \Sigma)##?
    Did you mean ##w \sim N(0, \Sigma)##? (That is the standard notation for saying that the vector random variable ##w## is distributed as a Normal with zero mean and covariance matrix ##\Sigma##.)

    When two normal RVs are correlated you can still show the sum is normal using mgfs, but the calculation is longer because the mgf no longer factorises.
    The mgf of ##X_1+X_2## where ##X_1,X_2## are zero-mean normals with correlation ##\rho## is

    $$E[e^{t(X_1+X_2)}]=\int_{-\infty}^\infty\int_{-\infty}^\infty e^{t(x_1+x_2)} \phi(x_1,x_2,\sigma_1,\sigma_2,\rho)\,dx_1\,dx_2$$

    Here ##\phi## is the joint Gaussian pdf. Since it's exponential in form, you should be able to combine it with the ##e^{t(x_1+x_2)}## factor, do some completing the square and simplifying, and before you know it you'll have the mgf of a univariate Gaussian.
    Having done that case, you can use the fact that shifting the mean of a Gaussian still leaves it Gaussian to generalise to Gaussians with nonzero means.
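    For reference, if the completing-the-square is carried through, the result should be

    $$E[e^{t(X_1+X_2)}] = e^{\frac{1}{2}(\sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2) t^2},$$

    which is the mgf of a ##N(0, \sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2)## random variable, matching ##\operatorname{Var}(X_1+X_2) = \sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2##.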
     
    Last edited: Feb 7, 2016
  3. Feb 7, 2016 #3

    Ray Vickson


    No, you are not allowed to assume that ##w_1, w_2, \ldots, w_n## are independent; it is supposed that their variance-covariance matrix ##\Sigma## is given as input data. Making them independent would destroy the power of the conclusion.

    The problem is reasonably straightforward if you use a moment-generating-function approach; that is, if ##f = x_1 w_1 + x_2 w_2 + \cdots + x_n w_n## and ##\Sigma^{-1}## is the inverse of the matrix ##\Sigma##, you want to show that the moment-generating function of ##f##,
    [tex] M(t) = E \exp\left(t \sum_i x_i w_i \right) = \frac{\sqrt{\det(\Sigma^{-1})}}{(2 \pi)^{n/2}} \int_{\mathbb{R}^n} \exp\left(-\frac{1}{2} w^T \Sigma^{-1} w + t \sum_i x_i w_i \right) \, d^n w, [/tex]
    takes the simple form
    [tex] M(t) = e^{\frac{1}{2} \sigma^2 t^2} [/tex]
    for some positive constant ##\sigma^2##.
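    As a sketch of the key step (writing ##x## for the vector with components ##x_i##, so that ##\sum_i x_i w_i = x^T w##, and assuming ##x \neq 0##): completing the square in the exponent gives
    [tex] -\frac{1}{2} w^T \Sigma^{-1} w + t\, x^T w = -\frac{1}{2} (w - t \Sigma x)^T \Sigma^{-1} (w - t \Sigma x) + \frac{1}{2} t^2\, x^T \Sigma x, [/tex]
    and the shifted Gaussian integral exactly cancels the normalising constant, leaving
    [tex] M(t) = e^{\frac{1}{2} (x^T \Sigma x)\, t^2}, [/tex]
    so ##\sigma^2 = x^T \Sigma x##, which is positive whenever ##\Sigma## is positive definite and ##x \neq 0##.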
     