About the linear combination of multivariate normal distributions.

  • Thread starter mangdoo
  • #1
mangdoo
How can I prove that any linear combination of multivariate normal random
vectors is also normal?

I have a proof, but I'm not sure whether it is right or not. The idea of my
proof is as follows.

---
X and Y are random vectors of the same dimension, and each is multivariate
normal. If I compute the linear combination of X and Y,

Z = aX+bY

I can compute the components of random vector Z.

Z = [aX_1+bY_1; aX_2+bY_2; ...; aX_n+bY_n]

In order to be multivariate normal, each component of the random vector
should be univariate normal. From this point of view, each component of the
random vector Z is a linear combination of univariate normal random
variables (because X and Y are multivariate normal). Since a linear
combination of univariate normal random variables is normal, each component
of Z is also normal. Therefore Z, a linear combination of multivariate
normal random vectors, is also normal.
----

Is my proof right? If not, please tell me the wrong points. If it is right,
then how can I calculate the mean and covariance of a linear combination of
multivariate normal random vectors?

Your comments would be very helpful to me.
Thank you very much.
 

Answers and Replies

  • #2
statdad
Homework Helper
You can clean up your wording this way: for any [tex] X, Y [/tex] which are multivariate normal vectors, we know that every linear combination of their components is univariate normal. Set [tex] Z = aX + bY[/tex], where [tex] a, b [/tex] are real: you want to show [tex] Z [/tex] is multivariate normal. Argue that since any linear combination of [tex] Z [/tex]'s components is a linear combination of those of [tex] X, Y [/tex], the result holds.
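One way to make this explicit (a sketch, under the standard assumption that [tex] X [/tex] and [tex] Y [/tex] are jointly normal, so the stacked vector is multivariate normal): for any real vector [tex] t [/tex],

[tex] t'Z = t'(aX+bY) = (at)'X + (bt)'Y, [/tex]

which is a single linear combination of the components of the jointly normal stacked vector [tex] (X', Y')' [/tex], hence univariate normal. Since [tex] t [/tex] was arbitrary, every linear combination of the components of [tex] Z [/tex] is univariate normal, so [tex] Z [/tex] is multivariate normal by that characterization.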

OR

You could use the idea of multivariate moment-generating functions (use characteristic functions if you wish).
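As a sketch of that route, assuming for simplicity that [tex] X [/tex] and [tex] Y [/tex] are independent with [tex] X \sim N(\mu_x, \Sigma_x) [/tex] and [tex] Y \sim N(\mu_y, \Sigma_y) [/tex], and using the multivariate normal MGF [tex] M_X(t) = \exp(t'\mu_x + \frac{1}{2}t'\Sigma_x t) [/tex]:

[tex] M_{aX+bY}(t) = M_X(at)\,M_Y(bt) = \exp\left(t'(a\mu_x + b\mu_y) + \tfrac{1}{2}\,t'(a^2\Sigma_x + b^2\Sigma_y)\,t\right), [/tex]

which is the MGF of [tex] N(a\mu_x + b\mu_y,\ a^2\Sigma_x + b^2\Sigma_y) [/tex]. For dependent [tex] X, Y [/tex] one applies the same computation to the MGF of the stacked vector.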

The rules for finding the mean vector and covariance matrix for [tex] aX + bY[/tex]
can be found in textbooks. They are not much different from the rules for scalar random quantities.
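For instance, those scalar-like rules can be checked numerically. The matrices below are made up for illustration; X and Y are built from a common standard normal vector U so that they are dependent, giving Z = aX + bY = (aA + bB)U + a mu_x + b mu_y:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example: X = A U + mu_x and Y = B U + mu_y share the same
# standard normal vector U, so X and Y are dependent.
A = np.array([[2.0, 0.0], [1.0, 1.0]])
B = np.array([[1.0, 0.5], [0.0, 2.0]])
mu_x = np.array([1.0, -1.0])
mu_y = np.array([0.0, 3.0])
a, b = 2.0, -1.0

# Analytic rules: Z = aX + bY = (aA + bB)U + a*mu_x + b*mu_y, so
#   E[Z]   = a*mu_x + b*mu_y
#   Cov[Z] = (aA + bB)(aA + bB)^T
M = a * A + b * B
mean_Z = a * mu_x + b * mu_y
cov_Z = M @ M.T

# Monte Carlo check of both rules.
U = rng.standard_normal((200_000, 2))
Z = a * (U @ A.T + mu_x) + b * (U @ B.T + mu_y)
print(np.allclose(Z.mean(axis=0), mean_Z, atol=0.05))
print(np.allclose(np.cov(Z.T), cov_Z, atol=0.15))
```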
 
  • #3
Stephen Tashi
Science Advisor
each component in random vector Z is also normal. Therefore Z, a linear
combination of multivariate normal distribution, is also normal.
The Wikipedia article on the multivariate normal distribution says that it is possible to have a distribution where the X and Y components are normal and the distribution of (X,Y) is not bivariate normal. So what needs to be examined is your assertion that the normality of all the components proves the multivariate normality of the joint distribution.
 
  • #4
mangdoo
Thank you very much, statdad and Stephen Tashi. :)

Okay. Then how about this idea?

First, in order to handle a linear combination of dependent univariate normal random variables, I write the two random variables in this way.

[tex]X = a_{11}u_{1} + a_{12}u_{2} + \mu_{x}[/tex]
[tex]Y = a_{21}u_{1} + a_{22}u_{2} + \mu_{y}[/tex]

[tex]U_{1}[/tex] and [tex]U_{2}[/tex] are independent standard normal distributions.

So, if I take the linear combination of [tex]X[/tex] and [tex]Y[/tex], then the equation is

[tex] X+Y = (a_{11} + a_{21})u_{1} + (a_{12}+a_{22})u_{2} + \mu_{x} + \mu_{y}[/tex]

Using the characteristic function, I can easily prove that the linear combination of dependent univariate normal random variables is also normal.
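The characteristic-function step can be written out (a sketch, with [tex] u_{1}, u_{2} [/tex] independent standard normal): writing [tex] X+Y = c_{1}u_{1} + c_{2}u_{2} + \mu_{x} + \mu_{y} [/tex] with [tex] c_{1} = a_{11}+a_{21} [/tex] and [tex] c_{2} = a_{12}+a_{22} [/tex],

[tex] \varphi_{X+Y}(t) = E\left[e^{it(X+Y)}\right] = e^{it(\mu_{x}+\mu_{y})}\, e^{-\frac{1}{2}t^{2}(c_{1}^{2}+c_{2}^{2})}, [/tex]

which is the characteristic function of a univariate normal with mean [tex] \mu_{x}+\mu_{y} [/tex] and variance [tex] c_{1}^{2}+c_{2}^{2} [/tex].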


Now, I want to extend this to the multivariate normal vector.

[tex] X = AU + \mu_{x} [/tex]
[tex] Y = BU + \mu_{y} [/tex]

[tex] X[/tex], [tex]Y[/tex], [tex]\mu_{x} [/tex] and [tex] \mu_{y}[/tex] are d-dimensional vectors. [tex] A [/tex] and [tex] B [/tex] are d by d matrices. [tex] U [/tex] is a d-dimensional vector whose components are independent univariate standard normal.

So, if [tex] X [/tex] and [tex] Y [/tex] are combined linearly, the equation is,

[tex] X+Y = (A+B)U + \mu_{x} + \mu_{y} = Z[/tex]

Then I can show that the linear combination of the components in [tex] Z [/tex] is normal.

Is this right? I feel something is missing.
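One numerical sanity check of this construction (a sketch; A, B, t and the mean are made up): with Z = (A+B)U + mu, any fixed linear combination t'Z should be univariate normal with mean t'mu and variance t'(A+B)(A+B)'t, which can be compared against the standard normal CDF:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

# Made-up d = 2 example of Z = (A + B)U + mu, where mu = mu_x + mu_y.
A = np.array([[1.0, 0.0], [0.5, 1.0]])
B = np.array([[0.0, 2.0], [1.0, 0.0]])
mu = np.array([1.0, -2.0])
M = A + B

U = rng.standard_normal((200_000, 2))
Z = U @ M.T + mu

# A fixed linear combination t'Z should be N(t'mu, t' M M^T t).
t = np.array([1.0, -1.0])
w = Z @ t
s = (w - t @ mu) / np.sqrt(t @ (M @ M.T) @ t)   # standardized combination

# Compare the empirical CDF of s with the standard normal CDF Phi.
for q in (-1.0, 0.0, 1.5):
    phi = 0.5 * (1.0 + erf(q / sqrt(2.0)))
    print(abs((s <= q).mean() - phi) < 0.01)
```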
 
  • #5
Stephen Tashi
Science Advisor
Thank you very much, statdad and Stephen Tashi. :)

Okay. Then how about this idea?

First, in order to prove the linear combination of dependent univariate normal distribution, I write the two random variables in this way.
I think you mean "independent univariate normal random variables".


[tex]X = a_{11}u_{1} + a_{12}u_{2} + \mu_{x}[/tex]
[tex]Y = a_{21}u_{1} + a_{22}u_{2} + \mu_{y}[/tex]

[tex]U_{1}[/tex] and [tex]U_{2}[/tex] are independent standard normal distributions.
I think you mean that [tex]U_{1}[/tex] and [tex]U_{2}[/tex] are independent random variables with standard normal distributions.

So, if I do the linear combination of [tex]X[/tex] and [tex]Y[/tex], then the equation is like this

[tex] X+Y = (a_{11} + a_{21})u_{1} + (a_{12}+a_{22})u_{2} + \mu_{x} + \mu_{y}[/tex]

Using the characteristic function, I can easily prove the linear combination of dependent univariate normal distribution is also normal.
Again, you mean "independent univariate normal random variables".

Now, I want to expand this into the multivariate normal vector.

[tex] X = AU + \mu_{x} [/tex]
[tex] Y = BU + \mu_{y} [/tex]
Are you using [tex] X [/tex] and [tex] Y [/tex] to mean something different now?

[tex] X[/tex], [tex]Y[/tex], [tex]\mu_{x} [/tex] and [tex] \mu_{y}[/tex] are d-dimensional vectors. [tex] A [/tex] and [tex] B [/tex] are d by d matrices. [tex] U [/tex] is a d-dimensional vector whose components are independent univariate standard normal.
Are you asserting that all multivariate distributions have the above form? If so, is that from a definition or from a theorem?


So, if [tex] X [/tex] and [tex] Y [/tex] are combined linearly, the equation is,

[tex] X+Y = (A+B)U + \mu_{x} + \mu_{y} = Z[/tex]

Then I can show that the linear combination of the components in [tex] Z [/tex] is normal.
Do you mean that any linear combination of the components of [tex] Z [/tex] is a univariate normal or do you mean that it is a multivariate normal?
 
  • #6
mangdoo
I think you mean that [tex]U_{1}[/tex] and [tex]U_{2}[/tex] are independent random variables with standard normal distributions.
I'm sorry; the notation [tex] U_{1} [/tex] and [tex] U_{2} [/tex] was wrong. I want to represent the two dependent random variables [tex] X [/tex] and [tex] Y [/tex] using the standard normal random variables [tex] u_{1} [/tex] and [tex] u_{2} [/tex].

Are you are using [tex] X [/tex] and [tex] Y [/tex] to mean something different now?

Are you asserting that all multivariate distributions have the above form? If so, is that from a definition or from a theorem?
I'm sorry again. Now, [tex] X [/tex] and [tex] Y [/tex] are multivariate normal random vectors of dimension d. So I have to express each of them using a multivariate standard normal vector [tex] U [/tex] whose dimension is also d. [tex] A[/tex] and [tex] B [/tex] are the matrices that determine the covariances of [tex] X [/tex] and [tex] Y [/tex]. Is this expression wrong? If so, how can I represent a multivariate normal distribution using the multivariate standard normal distribution?

Do you mean that any linear combination of the components of [tex] Z [/tex] is a univariate normal or do you mean that it is a multivariate normal?
The question is whether [tex] Z [/tex], the result of the linear combination of multivariate normal random vectors, is also multivariate normal. I do not mean univariate normal.
 
  • #7
Stephen Tashi
Science Advisor
Let me ask you this: are you assuming that [tex]X[/tex] is an [tex] n [/tex]-dimensional multivariate normal random variable if and only if [tex] X [/tex] can be written as:

[tex]X = \sum_{i=1}^{i=n} a_i u_i + \mu [/tex]

where the [tex] a_i [/tex] are mutually orthogonal unit vectors, the [tex] u_i [/tex] are scalars that are independent normally distributed random variables with mean zero, and [tex] \mu [/tex] is a constant n-dimensional vector ?

To me, this is plausible. However, you need to cite some reason for it being true. In the articles that I have looked at, the above statement is not the definition of a multivariate normal random variable. So if you wish to use that statement as an equivalent to the definition, you need to cite some theorem.
 
  • #8
mangdoo
[tex]X = \sum_{i=1}^{i=n} a_i u_i + \mu [/tex]
That way of representing the multivariate normal distribution comes from the book 'Mathematical Statistics' (Peter J. Bickel, Kjell A. Doksum). In Chapter B (it may be the appendix), they represent the bivariate normal distribution like this.

[tex]
X = a_{11}u_{1} + a_{12}u_{2} + \mu_{x}
[/tex]
[tex]
Y = a_{21}u_{1} + a_{22}u_{2} + \mu_{y}
[/tex]

So, I just extended this form to the multivariate normal distribution. I would like to read the articles you mentioned, because I'm not confident in my representation and proof.

In addition, if you were to prove that a linear combination of multivariate normal random vectors is also normal, how would you approach the proof?

Thanks for your splendid help.
 
  • #9
Stephen Tashi
Science Advisor
So, I just extended this form to the multivariate normal distribution. I would like to read the articles you mentioned, because I'm not confident in my representation and proof.
I think you are correct that it generalizes to the case of the multivariate normal distribution. I will look in some books for a theorem. The current Wikipedia article on the multivariate normal distribution http://en.wikipedia.org/wiki/Multivariate_normal_distribution has a section called "Drawing values from the distribution". The algorithm implies that a multivariate normal is equivalent to such a sum.
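That algorithm can be sketched as follows (mu and Sigma are made up): factor Sigma = LL' with a Cholesky decomposition and set X = mu + Lu, where u is a vector of independent standard normals, which is exactly the kind of representation discussed in this thread:

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch of the "drawing values" algorithm: to sample X ~ N(mu, Sigma),
# factor Sigma = L L^T (Cholesky) and set X = mu + L u with u standard normal.
mu = np.array([1.0, 2.0, -1.0])
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 3.0, 0.2],
                  [0.5, 0.2, 2.0]])
L = np.linalg.cholesky(Sigma)

u = rng.standard_normal((500_000, 3))
X = u @ L.T + mu                      # each row is mu + L u

# The sample mean and covariance should recover mu and Sigma.
print(np.allclose(X.mean(axis=0), mu, atol=0.02))
print(np.allclose(np.cov(X.T), Sigma, atol=0.05))
```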

How would I prove it? I like the idea in your method, so I would pursue that. I don't claim to have written a proof of it myself.
 
  • #10
mangdoo
How would I prove it? I like the idea in your method, so I would pursue that. I don't claim to have written a proof of it myself.
Thank you very much. :)
 
  • #11
In general, a multivariate normal distribution has the following density, expressed in terms of the squared Mahalanobis distance:

[tex]f(x)=\frac{1}{(2\pi)^{p/2}|\Sigma|^{1/2}}\, e^{-\frac{1}{2}(x-\mu)' \Sigma^{-1}(x-\mu)}[/tex]

where x and [tex]\mu[/tex] are p-dimensional vectors, [tex]\Sigma[/tex] is a positive definite covariance matrix, and [tex]|\Sigma|[/tex] is its determinant.

http://www.stat.lsu.edu/faculty/moser/exst7037/mvnprop.pdf
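As a quick check of that density (a sketch with made-up numbers): for a diagonal Sigma it should factor into a product of univariate normal densities:

```python
import numpy as np

# Evaluate the multivariate normal density and check that, for a diagonal
# Sigma, it factors into a product of univariate normal densities.
def mvn_pdf(x, mu, Sigma):
    p = len(mu)
    diff = x - mu
    norm = (2 * np.pi) ** (p / 2) * np.sqrt(np.linalg.det(Sigma))
    return np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm

def norm_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 0.25])
x = np.array([0.5, -1.5])

joint = mvn_pdf(x, mu, Sigma)
product = norm_pdf(x[0], mu[0], 4.0) * norm_pdf(x[1], mu[1], 0.25)
print(np.isclose(joint, product))
```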
 
  • #12
Stephen Tashi
Science Advisor
I found this passage in "Statistical Pattern Recognition" by Fukunaga, p. 17:

"Also, it is always possible to find a nonsingular linear transformation which makes the new covariance matrix diagonal. Since a diagonal covariance matrix means uncorrelated variables (independent variables for a normal distribution) , we can always find for a normal distribution a set of axes such that random variables are independent in the new coordinate system. These subjects will be discussed in detail in a later section."

He uses "normal distribution" to mean multivariate normal distribution. Unfortunately, I haven't found the "later section" where anything further is discussed.
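The decorrelating transformation in that passage can be sketched numerically (Sigma and mu are made up): if Sigma = Q D Q' is the eigendecomposition, then Y = Q'(X - mu) has the diagonal covariance D, i.e. uncorrelated (hence, for a normal distribution, independent) components:

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up covariance; Q's columns are the eigenvectors of Sigma.
Sigma = np.array([[3.0, 1.2], [1.2, 2.0]])
mu = np.array([5.0, -1.0])
eigvals, Q = np.linalg.eigh(Sigma)

# Sample from N(mu, Sigma), then rotate into the eigenvector coordinates.
L = np.linalg.cholesky(Sigma)
X = rng.standard_normal((300_000, 2)) @ L.T + mu
Y = (X - mu) @ Q                      # each row is Q^T (x - mu)

# Cov(Y) = Q^T Sigma Q = D: diagonal entries are eigenvalues,
# off-diagonal entries are (approximately) zero.
cov_Y = np.cov(Y.T)
print(np.allclose(np.diag(cov_Y), eigvals, atol=0.05))
print(abs(cov_Y[0, 1]) < 0.02)
```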
 
