Showing a Distribution is Gaussian

Summary:
The discussion centers on showing that the distribution p(f|x) is Gaussian, given ##f = w^T x## and p(w) ∼ N(w|0, Σ). Participants suggest two main approaches: using the linearity of f in w, or using moment-generating functions (mgfs). One clarification is that the components of w cannot be assumed independent, since the variance-covariance matrix Σ is given as part of the problem. The moment-generating-function approach is emphasized as effective; it generalises the standard argument that a sum of independent normal random variables is normal. The goal is to reduce the moment-generating function to a form that confirms p(f|x) is Gaussian.
NATURE.M

Homework Statement



Consider ##f = w^T x##, where ##p(w) \sim N(w \mid 0, \Sigma)##. Show that ##p(f \mid x)## is Gaussian.

The Attempt at a Solution


So there are apparently two approaches to this problem: using either the linearity of f in terms of w, or moment-generating functions. But I'm having a lot of trouble figuring out how to proceed. I can see that we can use the moment-generating function to show that the sum of two independent normal distributions is also a normal distribution (since the mgf of the sum factors into the product of the individual mgfs). But I'm a bit stumped beyond that. Any help is appreciated.

Edit: we are allowed to assume that the components of w (##w_1, \ldots, w_d##) are independent.
 
There's info missing. Do you intend w and x to be vector random variables of the same dimension?
What do you mean by ##p(w) \sim N(w \mid 0, \Sigma)##?
Did you mean ##w \sim N(0, \Sigma)##? (That is the standard notation for saying that the vector random variable w is distributed as a normal with zero mean and covariance matrix ##\Sigma##.)

NATURE.M said:
I can see that we can use the moment-generating function to show that the sum of two independent normal distributions is also a normal distribution... But I'm a bit stumped by this.
When two normal RVs are correlated, you can show the sum is normal using mgfs, but it's longer as you can't factorise.
The mgf of ##X_1+X_2##, where ##X_1,X_2## are zero-mean normals with standard deviations ##\sigma_1,\sigma_2## and correlation ##\rho##, is

$$E[e^{t(X_1+X_2)}]=\int_{-\infty}^\infty\int_{-\infty}^\infty e^{t(x_1+x_2)} \phi(x_1,x_2,\sigma_1,\sigma_2,\rho)\,dx_1\,dx_2$$

##\phi## is the joint Gaussian pdf. Since it is exponential in form, you can combine it with the ##e^{t(x_1+x_2)}## factor, complete the square, and simplify; before you know it, you'll have the mgf of a univariate Gaussian.
Having done that case, you can use the fact that shifting the mean of a Gaussian leaves it Gaussian to generalise the result to Gaussians with nonzero means.
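For reference, here is where that algebra should land in the zero-mean case (a sketch using only the quantities defined above; complete the square in ##x_1, x_2## and integrate out the resulting Gaussian):

$$E\left[e^{t(X_1+X_2)}\right] = \exp\left(\tfrac{1}{2} t^2 \left(\sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2\right)\right),$$

which is the mgf of a zero-mean normal with variance ##\operatorname{Var}(X_1+X_2) = \sigma_1^2 + 2\rho\sigma_1\sigma_2 + \sigma_2^2##, so the sum is indeed normal.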
 
NATURE.M said:
Edit: we are allowed to assume that the components of w (##w_1, \ldots, w_d##) are independent.
No, you are not allowed to assume that ##w_1, w_2, \ldots, w_n## (writing ##n## for the dimension) are independent; the problem supplies their variance-covariance matrix ##\Sigma## as input data. Assuming independence would destroy the power of the conclusion.

The problem is reasonably straightforward if you use a moment-generating-function approach; that is, if ##f = x_1 w_1 + x_2 w_2 + \cdots + x_n w_n## and ##A = \Sigma^{-1}## is the inverse matrix of ##\Sigma##, you want to show that the moment-generating function of ##f##,
$$M(t) = E\left[\exp\Big(t \sum_i x_i w_i\Big)\right] = \frac{\sqrt{\det A}}{(2\pi)^{n/2}} \int_{\mathbb{R}^n} \exp\Big(-\tfrac{1}{2} w^T A w + t \sum_i x_i w_i\Big)\, d^n w,$$
takes the simple form
$$M(t) = e^{\frac{1}{2} \sigma^2 t^2}$$
for some positive constant ##\sigma^2##.
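Completing the square does the rest (a sketch, using only the quantities defined above): substituting ##w = v + t\,\Sigma x## gives
$$-\tfrac{1}{2} w^T A w + t \sum_i x_i w_i = -\tfrac{1}{2} v^T A v + \tfrac{1}{2} t^2\, x^T \Sigma x,$$
so the remaining Gaussian integral over ##v## cancels the normalising prefactor and
$$M(t) = \exp\left(\tfrac{1}{2} t^2\, x^T \Sigma x\right),$$
i.e. ##\sigma^2 = x^T \Sigma x## and ##p(f \mid x) = N(f \mid 0,\, x^T \Sigma x)##. This agrees with the linearity argument mentioned in the original post: a linear function of a jointly Gaussian vector is Gaussian.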
 
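If a numerical sanity check helps, here is a minimal Monte Carlo sketch in Python/NumPy (the dimension, covariance matrix, and x below are arbitrary made-up inputs for illustration, not part of the problem):

import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative inputs (not from the problem statement).
n = 3
B = rng.standard_normal((n, n))
Sigma = B @ B.T                      # symmetric positive-definite covariance matrix
x = rng.standard_normal(n)

# Sample w ~ N(0, Sigma) and form f = w^T x.
w = rng.multivariate_normal(np.zeros(n), Sigma, size=200_000)
f = w @ x

# The derivation above predicts f ~ N(0, x^T Sigma x).
print("sample variance of f:", f.var())
print("x^T Sigma x         :", x @ Sigma @ x)

# Standardised third and fourth moments should be near 0 and 3 for a Gaussian.
z = (f - f.mean()) / f.std()
print("skewness (expect ~0):", (z**3).mean())
print("kurtosis (expect ~3):", (z**4).mean())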
