Undergrad Bivariate normal distribution from normal linear combination

Thread summary: The discussion revolves around proving that linear combinations of independent normal random variables are also normally distributed, using moment-generating functions (mgfs). The mgf of a normal distribution is presented, showing that the sum of independent normal variables is normal with mean equal to the sum of the individual means and variance equal to the sum of the individual variances. A participant suggests that the problem actually asks for the converse: deriving coefficients for the linear combinations from given means, variances, and covariance. They find a solution using algebraic methods and a computer algebra system, although they express skepticism about the relevance of matrix properties in this context. The conversation highlights the complexity of the proposition and the utility of computational tools in statistical proofs.
fisher garry

I can't prove this proposition. I have, however, managed to prove that the sum of independent normal rvs is also normal by looking at its mgf.

By independence,
$$E(e^{t(X_1+X_2+\cdots+X_n)})=E(e^{tX_1})E(e^{tX_2})\cdots E(e^{tX_n})$$
The mgf of a normal distribution is $$e^{\mu t}e^{\frac{t^2 \sigma^2}{2}}$$
$$E(e^{t(X_1+X_2+\cdots+X_n)})=e^{\mu_1 t}e^{\frac{t^2 \sigma_1^2}{2}}e^{\mu_2 t}e^{\frac{t^2 \sigma_2^2}{2}}\cdots e^{\mu_n t}e^{\frac{t^2 \sigma_n^2}{2}}=e^{(\mu_1+\mu_2+\cdots+\mu_n) t}e^{\frac{t^2 (\sigma_1^2+\sigma_2^2+\cdots+\sigma_n^2)}{2}}$$

This is the mgf of a normal distribution with mean $$\mu_1+\mu_2+\cdots+\mu_n$$ and variance $$\sigma_1^2+\sigma_2^2+\cdots+\sigma_n^2$$
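As a quick numerical sanity check of this mgf result (a sketch only; the parameters ##\mu_1=1, \sigma_1=2, \mu_2=3, \sigma_2=1## are chosen arbitrarily for illustration):

```python
import random
import statistics

# Sanity check: X1 ~ N(1, 2^2) and X2 ~ N(3, 1^2) independent,
# so X1 + X2 should be N(1 + 3, 2^2 + 1^2) = N(4, 5).
random.seed(0)
n = 200_000
sums = [random.gauss(1, 2) + random.gauss(3, 1) for _ in range(n)]

mean = statistics.fmean(sums)
var = statistics.pvariance(sums)
print(mean, var)  # should land close to 4 and 5
```

The sample mean and variance converge to the sums of the individual means and variances, as the mgf argument predicts.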

I know about the theory in section 5.4. Some of it is presented here
$$g(y_1,y_2)=f(x_1,x_2)\left|\frac{\partial(x_1,x_2)}{\partial(y_1,y_2)}\right|$$
Can anyone show how the proof they refer to, using section 5.4 and matrix theory, goes?
 

fisher garry said:
Can anyone show how the proof they refer to by section 5.4 and matrix theory goes?

I think you are looking at the wrong problem. The description implies that
$$
\begin{array}{rcl}
U&=& a_1 X_1 + a_2 X_2 + \cdots + a_n X_n \\
V&=& b_1 X_1 + b_2 X_2 + \cdots + b_n X_n
\end{array}
$$
Here the ##a_i## and ##b_i## are constants.

You can work out ##EU, EV, \text{Var}(U), \text{Var}(V)## and ##\text{Cov}(U,V)## in terms of the ##a_i, b_i## and the original ##\mu, \sigma## of the ##X_i##. Thus, you have a formula for the means and variance-covariance matrix in terms of ##a_i , b_i, i=1,2,\ldots, n##.

I think that what the question is asking for is the converse: given ##\mu, \sigma##, the means and the variance-covariance matrix of ##U,V##, it wants you to either find appropriate ##a_i, b_i## that will work (that is, give the right ##U,V##), or at least to show that such ##a_i, b_i## exist, even if not easy to find. Using moment-generating functions (as you did above) should be very helpful for this purpose.
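For concreteness, with iid ##X_i## of mean ##\mu## and variance ##\sigma^2##, these moments come out as ##EU=\mu\sum a_i##, ##\text{Var}(U)=\sigma^2\sum a_i^2##, and ##\text{Cov}(U,V)=\sigma^2\sum a_i b_i##. A quick simulation check of the covariance formula (a sketch; the coefficients below are made up for illustration, with ##\mu=0, \sigma=1##):

```python
import random

# Check Cov(U, V) = sigma^2 * sum(a_i * b_i) for U = sum a_i X_i,
# V = sum b_i X_i, with iid X_i ~ N(0, 1).
random.seed(1)
a, b = [1.0, 2.0], [3.0, -1.0]   # illustrative coefficients
n = 200_000

us, vs = [], []
for _ in range(n):
    x = [random.gauss(0, 1) for _ in a]
    us.append(sum(ai * xi for ai, xi in zip(a, x)))
    vs.append(sum(bi * xi for bi, xi in zip(b, x)))

mu_u = sum(us) / n
mu_v = sum(vs) / n
cov = sum((u - mu_u) * (v - mu_v) for u, v in zip(us, vs)) / n
print(cov)  # should land close to 1*3 + 2*(-1) = 1
```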
 
fisher garry said:
Can anyone show how the proof they refer to by section 5.4 and matrix theory goes?

I won't give a lot of details, but will expand a bit on my previous post.

Presumably, if you are given the means ##EU, EV## and the variance-covariance matrix of the pair ##(U,V)##, you are allowed to specify just how to make up ##U,V## in terms of some iid random variables ##X_1, X_2, \ldots, X_n##. That is, we are allowed to choose an ##n## in an attempt to prove the result.

When we are given the above information about ##U## and ##V##, we are given five items of data: two means, two variances, and one covariance. So, we will need at least five "coefficients" altogether.

I tried ##U = a_1 X_1 + a_2 X_2## and ##V = b_1 X_1 + b_2 X_2 + b_3 X_3##. The means, variances and covariance of ##(U,V)## can be expressed algebraically in terms of the ##a_i, b_j## and the underlying ##\mu, \sigma## of the ##X_k.## Thus, we obtain five equations in the five variables ##a_1, a_2, b_1, b_2,b_3##.

I managed to get a solution, thus proving the result asked for; however, it would possibly take many hours (perhaps days) of algebraic work to do it manually, so I saved myself a lot of grief by using the computer algebra package Maple to do the heavy lifting.

I don't see how matrix properties would be useful here, but maybe the person setting the problem had another method in mind.
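For what it's worth, that five-equation solve can be sketched in closed form rather than with a CAS. Below is a sketch under the ansatz above (##U = a_1X_1 + a_2X_2##, ##V = b_1X_1 + b_2X_2 + b_3X_3##, ##X_k## iid ##N(\mu,\sigma^2)##): the ##a_i## follow from a sum and a sum of squares, and after eliminating ##b_1, b_3## the remaining equation is a quadratic in ##b_2##. All target moments and the choice ##\mu=\sigma=1## are made up for illustration, and the square roots happen to be real for these numbers; they need not be for arbitrary targets.

```python
import math

# Closed-form solve of the five moment equations for the ansatz
#   U = a1*X1 + a2*X2,  V = b1*X1 + b2*X2 + b3*X3,  X_k iid N(mu, sigma^2).
# Target moments below are made up for illustration.
mu, sigma = 1.0, 1.0
m_u, var_u = 2.0, 3.0          # target E[U], Var(U)
m_v, var_v = 1.0, 4.0          # target E[V], Var(V)
cov_uv = 1.0                   # target Cov(U, V)

# E[U] = mu*(a1 + a2) and Var(U) = sigma^2*(a1^2 + a2^2) fix the sum and the
# sum of squares of a1, a2, so they follow from the quadratic formula.
s, q = m_u / mu, var_u / sigma ** 2
a1 = (s + math.sqrt(2 * q - s * s)) / 2
a2 = s - a1

# With a1, a2 fixed, the covariance equation gives b1 in terms of b2 and the
# E[V] equation gives b3; the Var(V) equation then becomes a quadratic in b2.
t, c, r = m_v / mu, cov_uv / sigma ** 2, var_v / sigma ** 2
f = lambda b2: ((c - a2 * b2) / a1) ** 2 + b2 ** 2 \
    + (t - (c - a2 * b2) / a1 - b2) ** 2 - r
A = (f(1) + f(-1)) / 2 - f(0)   # quadratic coefficients of f
B = (f(1) - f(-1)) / 2
C0 = f(0)
b2 = (-B + math.sqrt(B * B - 4 * A * C0)) / (2 * A)
b1 = (c - a2 * b2) / a1
b3 = t - b1 - b2

# Verify that all five moment equations hold.
assert abs(mu * (a1 + a2) - m_u) < 1e-9
assert abs(sigma ** 2 * (a1 ** 2 + a2 ** 2) - var_u) < 1e-9
assert abs(mu * (b1 + b2 + b3) - m_v) < 1e-9
assert abs(sigma ** 2 * (b1 ** 2 + b2 ** 2 + b3 ** 2) - var_v) < 1e-9
assert abs(sigma ** 2 * (a1 * b1 + a2 * b2) - cov_uv) < 1e-9
print(a1, a2, b1, b2, b3)
```

Nothing here uses matrix theory either; it just confirms that suitable ##a_i, b_j## exist for these targets, in line with the argument above.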
 