Showing a Distribution is Gaussian

SUMMARY

The discussion focuses on demonstrating that the distribution of the function f = w^T x, where p(w) ∼ N(w|0, Σ), is Gaussian. Participants suggest two approaches: utilizing the linearity of f in terms of w or employing moment-generating functions (MGFs). The consensus is that the MGF method is effective, particularly when considering the sum of independent normal distributions, and that the correlation between variables must be accounted for using the variance-covariance matrix Σ. The final goal is to express the MGF in a simplified form that confirms the Gaussian nature of p(f|x).

PREREQUISITES
  • Understanding of Gaussian distributions and properties
  • Familiarity with moment-generating functions (MGFs)
  • Knowledge of variance-covariance matrices in statistics
  • Basic linear algebra concepts related to vector random variables
NEXT STEPS
  • Study the properties of moment-generating functions in detail
  • Learn about the implications of variance-covariance matrices in multivariate distributions
  • Explore the derivation of the MGF for sums of correlated normal random variables
  • Investigate applications of Gaussian distributions in statistical modeling
USEFUL FOR

Students and professionals in statistics, data science, and machine learning who are working with Gaussian distributions and need to understand the implications of linear transformations on these distributions.

NATURE.M

Homework Statement



Consider f = w^T x, where p(w) ∼ N(w|0, Σ). Show that p(f|x) is Gaussian.

The Attempt at a Solution

So there are apparently two approaches to this problem, using either the linearity of f in terms of w or moment-generating functions, but I'm having a lot of trouble figuring out how to proceed. I can see that we can use the moment-generating function to show that the sum of two independent normal distributions is also a normal distribution (since the mgf of the sum can be written as a product of the individual mgfs). But I'm a bit stumped beyond that. Any help is appreciated.

Edit: we are allowed to assume that the components of w (##w_1, \ldots, w_d##) are independent.
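The independent-sum fact mentioned in the attempt is easy to sanity-check numerically. Below is a minimal sketch (Python/NumPy; the variances, the value of t, and the sample size are arbitrary illustrative choices, not part of the problem) comparing the empirical mgf of a sum of two independent zero-mean normals against the closed form ##e^{\frac{1}{2}(\sigma_1^2+\sigma_2^2)t^2}##:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent zero-mean normals (variances 1 and 4 are arbitrary).
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(0.0, 2.0, n)
s = x1 + x2

# If S = X1 + X2 ~ N(0, 1 + 4), its mgf is E[e^{tS}] = exp(5 t^2 / 2).
t = 0.3
emp_mgf = np.exp(t * s).mean()
theory_mgf = np.exp(0.5 * 5.0 * t**2)
print(emp_mgf, theory_mgf)  # the two values should agree closely
```

For small t the Monte Carlo estimate of the mgf is well behaved, so a few hundred thousand samples suffice to see the agreement.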
 
There's info missing. Do you intend w and x to be vector random variables of the same dimension?
What do you mean by p(w) ∼ N (w|0, Σ)?
Did you mean w ∼ N(0, Σ)? (which is standard notation for saying that the vector random variable w is distributed as a multivariate normal with zero mean and covariance matrix Σ).

I can see that we can use the moment generating function to show that the sum of two independent normal distributions is also a normal distribution (since the mgf of the sum can be written as a product of the individual mgfs). But I'm a bit stumped beyond that.
When two normal RVs are correlated, you can still show the sum is normal using mgfs, but it takes longer because the joint density doesn't factorise into a product of marginals.
The mgf of ##X_1+X_2## where ##X_1,X_2## are zero-mean normals with correlation ##\rho## is

$$E[e^{t(X_1+X_2)}]=\int_{-\infty}^\infty\int_{-\infty}^\infty e^{t(x_1+x_2)} \phi(x_1,x_2,\sigma_1,\sigma_2,\rho)\,dx_1\,dx_2$$

##\phi## is the joint Gaussian pdf. Since it's exponential in form, you can absorb the factor ##e^{t(x_1+x_2)}## into its exponent, complete the square, and simplify; before you know it, you'll have the mgf of a univariate Gaussian.
Having done that case, use the fact that shifting the mean of a Gaussian leaves it Gaussian to generalise to Gaussians with nonzero means.
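The correlated case described above can be checked numerically as well. Here is a rough sketch (Python/NumPy; the values of ##\sigma_1##, ##\sigma_2##, ##\rho##, t, and the sample size are arbitrary illustrative choices): sample ##(X_1, X_2)## from the joint Gaussian via a Cholesky factor of the covariance, then compare the empirical mgf of ##X_1+X_2## against ##\exp(t^2(\sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2)/2)##, which is what the completing-the-square calculation should produce:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400_000
sigma1, sigma2, rho = 1.0, 1.5, 0.3

# Joint covariance of (X1, X2) and its Cholesky factor,
# used to draw correlated zero-mean normals.
cov = np.array([[sigma1**2, rho * sigma1 * sigma2],
                [rho * sigma1 * sigma2, sigma2**2]])
L = np.linalg.cholesky(cov)
z = rng.standard_normal((n, 2))
x = z @ L.T                 # each row is one draw of (X1, X2)
s = x.sum(axis=1)           # X1 + X2

# Completing the square in the joint density gives
# E[e^{t(X1+X2)}] = exp(t^2 (sigma1^2 + 2 rho sigma1 sigma2 + sigma2^2) / 2).
t = 0.4
var_sum = sigma1**2 + 2 * rho * sigma1 * sigma2 + sigma2**2
emp_mgf = np.exp(t * s).mean()
theory_mgf = np.exp(0.5 * var_sum * t**2)
print(emp_mgf, theory_mgf)  # the two values should agree closely
```

Note that the variance of the sum picks up the cross term ##2\rho\sigma_1\sigma_2##, which is exactly the piece that vanishes in the independent case.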
 
NATURE.M said:

Edit: we are allowed to assume that the variables of w (w1, ...wd) are independent.
No, you are not allowed to assume that ##w_1, w_2, \ldots, w_n## are independent; it is supposed that their variance-covariance matrix ##\Sigma## is given as input data. Making them independent would destroy the power of the conclusion.

The problem is reasonably straightforward if you use a moment-generating-function approach: writing ##f = x_1 w_1 + x_2 w_2 + \cdots + x_n w_n## and letting ##M = \Sigma^{-1}## be the inverse of the covariance matrix ##\Sigma##, you want to show that the moment-generating function
$$M_f(t) = E \exp\left(t \sum_i x_i w_i \right) = \frac{\sqrt{\det(M)}}{(2 \pi)^{n/2}} \int_{\mathbb{R}^n} \exp\left(-\frac{1}{2} w^T M w + t \sum_i x_i w_i \right) \, d^n w$$
takes the simple form
$$M_f(t) = e^{\frac{1}{2} \sigma^2 t^2}$$
for some positive constant ##\sigma^2##. That is the mgf of a zero-mean Gaussian with variance ##\sigma^2##; carrying out the calculation shows ##\sigma^2 = x^T \Sigma x##.
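This claim admits a quick Monte Carlo check (Python/NumPy; the particular ##\Sigma##, x, t, and sample size below are arbitrary illustrative choices): sampling ##w \sim N(0, \Sigma)## and forming ##f = w^T x##, the variance of f and its empirical mgf should match ##\sigma^2 = x^T \Sigma x## and ##e^{\frac{1}{2}\sigma^2 t^2}## respectively:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 500_000, 3

# An arbitrary symmetric positive-definite covariance Sigma
# and a fixed input vector x (both purely illustrative).
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)
x = np.array([0.5, -1.0, 2.0])

# Draw w ~ N(0, Sigma) and form f = w^T x for each draw.
w = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
f = w @ x

# The claim M(t) = exp(sigma^2 t^2 / 2) holds with sigma^2 = x^T Sigma x.
sigma2 = x @ Sigma @ x
t = 0.1
emp_mgf = np.exp(t * f).mean()
theory_mgf = np.exp(0.5 * sigma2 * t**2)
print(emp_mgf, theory_mgf)  # the two values should agree closely
```

The same check works for any dimension d and any positive-definite ##\Sigma##; correlations between the components of w are handled automatically because the sampler draws from the full joint distribution.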
 
