Sum of independent normal variables

  • #1
mathman
Science Advisor
Summary:
The sum of normally distributed random variables is normal
(I know how to prove it.) Prove that a finite sum of independent normal random variables is normal. I suspect that independence may not be necessary.
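For reference, a sketch of the characteristic-function argument for two independent summands (the finite case then follows by induction): if ##X \sim \mathcal{N}(\mu_1,\sigma_1^2)## and ##Y \sim \mathcal{N}(\mu_2,\sigma_2^2)## are independent, then
$$\varphi_{X+Y}(t) = \mathbb{E}\!\left[e^{it(X+Y)}\right] = \varphi_X(t)\,\varphi_Y(t) = e^{i\mu_1 t - \sigma_1^2 t^2/2}\, e^{i\mu_2 t - \sigma_2^2 t^2/2} = e^{i(\mu_1+\mu_2)t - (\sigma_1^2+\sigma_2^2)t^2/2},$$
which is the characteristic function of ##\mathcal{N}(\mu_1+\mu_2,\ \sigma_1^2+\sigma_2^2)##. Independence is used exactly at the factorization step.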
 

Answers and Replies

  • #2
Dale
Mentor
Independence is necessary. Suppose, for example, that ##X_1 \sim \mathcal{N}(\mu=0,\sigma=1)## and ##X_2 = -X_1##. Then ##X_2 \sim \mathcal{N}(\mu=0,\sigma=1)##, but ##X_1+X_2=0##, which is not normally distributed.
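A minimal numerical illustration of this counterexample (a hypothetical NumPy script, not from the thread):

Code:
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.standard_normal(100_000)  # X1 ~ N(0, 1)
x2 = -x1                           # X2 = -X1, still N(0, 1) marginally

s = x1 + x2                        # the sum is identically zero
print(s.min(), s.max())            # 0.0 0.0 -- a point mass, not a bell curve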
 
  • #3
mathman
Science Advisor
Dale said:
Independence is necessary. Suppose, for example, that ##X_1 \sim \mathcal{N}(\mu=0,\sigma=1)## and ##X_2 = -X_1##. Then ##X_2 \sim \mathcal{N}(\mu=0,\sigma=1)##, but ##X_1+X_2=0##, which is not normally distributed.
I was wondering about random variables which are correlated, but with ##|\rho| \lt 1##, or in general as long as they are linearly independent. In that case they could be transformed into a set of independent random variables.
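For jointly normal variables that transformation is just a linear change of coordinates (whitening). A minimal sketch, assuming NumPy and a correlation of 0.8 chosen purely for illustration:

Code:
import numpy as np

rng = np.random.default_rng(1)

# Jointly normal (X1, X2) with correlation rho = 0.8
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
L = np.linalg.cholesky(cov)            # cov = L @ L.T
z = rng.standard_normal((2, 100_000))  # independent standard normals
x = L @ z                              # correlated jointly normal pair

# Whitening: applying L^{-1} recovers uncorrelated coordinates, and for
# jointly normal vectors, uncorrelated implies independent
w = np.linalg.solve(L, x)
print(np.corrcoef(w))                  # approximately the identity matrix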
 
  • #4
StoneTemplePython
Science Advisor
Gold Member
Your original claim is true iff they are jointly normally distributed. This follows from the standard characterization that a random n-vector is multivariate Gaussian iff every linear combination of its components (every one-dimensional projection) is a normal r.v.

It depends on pedantry, but Dale's counterexample actually isn't one: it is just a case of a "normal random variable" with mean zero and variance zero (a degenerate normal). Not particularly satisfying, but it is useful as a vacuous truth.
 
  • #6
WWGD
Science Advisor
Gold Member
IIRC, you can use moment generating functions for the proof.
 
  • #7
BWV
WWGD said:
IIRC, you can use moment generating functions for the proof.
A proof with MGFs would have the same form as the characteristic-function proof in the link.
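Sketched in the same style, the MGF version for independent summands would read
$$M_{X+Y}(t) = \mathbb{E}\!\left[e^{t(X+Y)}\right] = M_X(t)\,M_Y(t) = e^{\mu_1 t + \sigma_1^2 t^2/2}\, e^{\mu_2 t + \sigma_2^2 t^2/2} = e^{(\mu_1+\mu_2)t + (\sigma_1^2+\sigma_2^2)t^2/2},$$
which is the MGF of ##\mathcal{N}(\mu_1+\mu_2,\ \sigma_1^2+\sigma_2^2)##.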
 
  • #8
Stephen Tashi
Science Advisor
Are there theorems that deal with the general question? Given a set of random variables ##\{X_1,X_2,\dots,X_M\}##, when does there exist some subspace ##S## of the vector space of random variables such that ##S## contains each ##X_i## and ##S## has a basis of mutually independent random variables?

mathman said:
I was wondering about random variables which are correlated, but with ##|\rho| \lt 1##, or in general as long as they are linearly independent.

"Linearly independent" presumably implies we are considering a set of random variables to be a vector space under the operations of multiplication by scalars and addition of the random variables.

mathman said:
In that case they could be transformed into a set of independent random variables.

Suppose we take "transformed" to mean transformed by a linear transformation in a vector space. If the vector space containing the random variables ##\{X_1,X_2,\dots,X_M\}## has a finite basis ##B_1,B_2,\dots,B_n## consisting of mutually independent random variables, then (trivially) for each ##X_k## there exists a possibly non-invertible linear transformation ##T## that transforms some linear combination of the ##B_i## into ##X_k##.

If the smallest vector space containing the ##X_i## is infinite-dimensional (e.g., the vector space of all measurable functions on the real number line), I don't know what happens.

I don't recall any texts that focus on vector spaces of random variables. Since the product of random variables is also a random variable, the topic for textbooks seems to be the algebra of random variables. But that approach downplays the concept of probability distributions.
 
  • #9
mathman
Science Advisor
From an earlier reply:
By the CLT, a normalized sum of many independent random variables from any distribution with finite variance will converge to a normal.

On the sum of 2 normals, some proofs are in Wikipedia:

https://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables
My original question was specifically about finite sums. The Wiki article gives the answer for a pair of correlated variables, which then implies it holds for any finite sum.
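The bivariate result being cited: if ##X## and ##Y## are jointly normal with correlation ##\rho##, then
$$X + Y \sim \mathcal{N}\!\left(\mu_X + \mu_Y,\ \sigma_X^2 + \sigma_Y^2 + 2\rho\,\sigma_X\sigma_Y\right).$$
As the discussion below makes clear, joint normality (not just normal marginals) is what makes this work.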
 
  • #10
BWV
mathman said:
The Wiki article gives the answer for a pair of correlated variables, which then implies it holds for any finite sum.
Yes, if that did not work, then all of Modern Portfolio Theory would fail
 
  • #11
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
Stephen Tashi said:
Are there theorems that deal with the general question? Given a set of random variables ##\{X_1,X_2,\dots,X_M\}##, when does there exist some subspace ##S## of the vector space of random variables such that ##S## contains each ##X_i## and ##S## has a basis of mutually independent random variables?

I'm not sure this is the right question. Given a single random variable, it plus itself is just 2 times itself, since you're not adding two independent copies of it. So any one-dimensional subspace works, for example.


If you are wondering about sets of random variables that are stable under adding independent copies, then I think it's just the normal random variables, since if you repeatedly add independent copies of a random variable (suitably normalized), you get something that slowly deforms into a Gaussian. I guess the space of all things deforming into Gaussians also works.
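The "deforming into a Gaussian" effect is easy to see numerically; a sketch (hypothetical script, using NumPy and SciPy) starting from a decidedly non-normal distribution:

Code:
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 100_000

def summed_copies(k):
    # sum of k independent exponential(1) draws, standardized
    s = rng.exponential(size=(k, n)).sum(axis=0)
    return (s - k) / np.sqrt(k)   # the sum has mean k and variance k

for k in (1, 5, 50):
    print(k, round(float(stats.skew(summed_copies(k))), 3))
    # skewness shrinks toward 0 (the normal value) as k grows: ~2, ~0.9, ~0.28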
 
  • #12
WWGD
Science Advisor
Gold Member
Office_Shredder said:
I'm not sure this is the right question. Given a single random variable, it plus itself is just 2 times itself, since you're not adding two independent copies of it. So any one-dimensional subspace works, for example.

If you are wondering about sets of random variables that are stable under adding independent copies, then I think it's just the normal random variables, since if you repeatedly add independent copies of a random variable (suitably normalized), you get something that slowly deforms into a Gaussian. I guess the space of all things deforming into Gaussians also works.
Don't mean to go far OT, but I am kind of curious about transformations that preserve normality.

An interesting thing is that covariance defines an inner product on the space of (mean-zero, square-integrable) random variables.
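Spelled out, for mean-zero ##X, Y \in L^2## (identifying variables that agree almost surely):
$$\langle X, Y \rangle = \operatorname{Cov}(X,Y) = \mathbb{E}[XY], \qquad \|X\|^2 = \operatorname{Var}(X),$$
which is bilinear, symmetric, and positive definite on that space; correlation is then the cosine of the angle between ##X## and ##Y##.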
 
  • #13
BWV
WWGD said:
Don't mean to go far OT, but I am kind of curious about transformations that preserve normality.

An interesting thing is that covariance defines an inner product on the space of (mean-zero, square-integrable) random variables.
Wouldn't any linear or affine transformation of a normal RV preserve normality?
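The standard fact being invoked (for ##a \neq 0##) is
$$X \sim \mathcal{N}(\mu, \sigma^2) \;\Longrightarrow\; aX + b \sim \mathcal{N}(a\mu + b,\ a^2\sigma^2),$$
which follows directly from the characteristic function, since ##\varphi_{aX+b}(t) = e^{ibt}\varphi_X(at)##.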
 
  • #14
Stephen Tashi
Science Advisor
mathman said:
The Wiki article gives the answer for a pair of correlated variables

Yes, the article gives an answer for a pair of correlated normal random variables that have a joint bivariate normal distribution, but not for the more general case of correlated normal random variables whose joint distribution is not specified.
 
  • #15
BWV
Stephen Tashi said:
Yes, the article gives an answer for a pair of correlated normal random variables that have a joint bivariate normal distribution, but not for the more general case of correlated normal random variables whose joint distribution is not specified.
So if you have the distribution of each component and the covariance, don't you also have the joint distribution?
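In general, no: marginals plus covariance do not pin down the joint law. A sketch of the classic counterexample (hypothetical script): take ##Y = WX## with ##W = \pm 1## a fair coin independent of ##X##.

Code:
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 500_000

x = rng.standard_normal(n)
w = rng.choice([-1.0, 1.0], size=n)    # fair coin, independent of x
y = w * x                              # Y is also N(0, 1) marginally

print(np.cov(x, y)[0, 1])              # ~0: same covariance as an independent pair
print(stats.kstest(y, "norm").pvalue)  # marginal of Y is standard normal
print(np.mean(np.abs(x) == np.abs(y))) # 1.0: yet (X, Y) is not bivariate normal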
 
  • #17
Stephen Tashi
Science Advisor
BWV said:
Wouldn't any linear or affine transformation of a normal RV preserve normality?

I'd say yes.

The topic
WWGD said:
but I am kind of curious about transformations that preserve normality.
could include all transformations that preserve normality, so perhaps it should be specialized to particular kinds of transformations to eliminate relatively trivial ones. For example, if ##X## has a normal distribution with mean zero and we define the transformation ##T(x) = x## except for ##x = 3## or ##x = -3##, where we define ##T(x) = -x##, have we transformed ##X## into a (technically) different normally distributed random variable?
 
  • #18
Stephen Tashi
Science Advisor
Office_Shredder said:
I'm not sure this is the right question. Given a single random variable, it plus itself is just 2 times itself, since you're not adding two independent copies of it. So any one-dimensional subspace works, for example.
The question I propose is whether each ##X_i## is an element of the same subspace ##S##. So the fact that each ##X_i## can be regarded as being in the one-dimensional subspace generated by itself doesn't answer that.
 
  • #19
mathman
Science Advisor
It appears that the joint distribution of two dependent normal variables may not be (bivariate) normal. However, it is not clear to me whether or not the sum has a normal distribution.
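With the coin-flip construction sketched earlier (##Y = WX##, ##W = \pm 1## independent of ##X \sim \mathcal{N}(0,1)##), the sum is not normal:
$$X + Y = (1+W)X = \begin{cases} 0 & \text{with probability } 1/2,\\ 2X & \text{with probability } 1/2,\end{cases}$$
a 50/50 mixture of a point mass at zero and ##\mathcal{N}(0,4)##.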
 
  • #20
WWGD
Science Advisor
Gold Member
mathman said:
It appears that the joint distribution of two dependent normal variables may not be (bivariate) normal. However, it is not clear to me whether or not the sum has a normal distribution.
Just use, e.g., for ##X \sim \mathcal{N}(0,1)##, the sum ##X + (-X)##.
 
  • #21
mathman
Science Advisor
WWGD said:
Just use, e.g., for ##X \sim \mathcal{N}(0,1)##, the sum ##X + (-X)##.
This is an artificial special case where the variance is 0. It could (arguably) be called a normal distribution.
 
  • #23
BWV
Seems bogus to me, as ##\eta## contains a discontinuous function ##\sigma##, and is only "normal" because the Bernoulli distribution is hidden until you add it to ##\xi##. I think if you wrote out the characteristic functions for the example, as in the proof here: https://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables, it would be obvious: you would have the Bernoulli characteristic function ##1 - p + pe^{it}## inside the Gaussian integral, and they are really not two normal distributions.
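Concretely, if ##\eta = \sigma\xi## with ##\sigma = \pm 1## a fair coin independent of ##\xi \sim \mathcal{N}(0,1)##, the characteristic function of the sum works out to
$$\varphi_{\xi+\eta}(t) = \mathbb{E}\!\left[e^{it(1+\sigma)\xi}\right] = \tfrac{1}{2} + \tfrac{1}{2}e^{-2t^2},$$
which is visibly not a Gaussian characteristic function.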
 
  • #24
WWGD
Science Advisor
Gold Member
It seems we can set up the copula in such a way that the sum violates normality. I will give it a try; it should be a nice exercise. Never tried it before, but it should work.
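One way this could go (a sketch, not worked out in the thread): mix the comonotone and countermonotone copulas,
$$C(u,v) = \tfrac{1}{2}\min(u,v) + \tfrac{1}{2}\max(u+v-1,\ 0),$$
with standard normal marginals. Sampling from this copula amounts to flipping an independent fair coin between ##Y = X## and ##Y = -X##, so ##X+Y## is a 50/50 mixture of ##2X## and ##0##, which is not normal.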
 
  • #25
Stephen Tashi
Science Advisor
BWV said:
Seems bogus to me, as ##\eta## contains a discontinuous function ##\sigma##

How would we state a theorem that avoids that counterexample?

If we say "Let ##\eta## be a normally distributed random variable that does not contain a discontinuous function", what would that mean?

One attempt to define "##\eta## does not contain a discontinuous function" might be: there does not exist a finite set of random variables ##\{X_1, X_2,\dots,X_n\}## such that at least one of the ##X_i## does not have a continuous distribution and such that for some function ##f## we have ##\eta = f(X_1,X_2,\dots,X_n)##. However, given such great freedom for people to choose ##f## and the ##X_i##, we might eliminate all normally distributed ##\eta## from consideration.

In practical situations the evidence about how ##\eta## can be represented as functions of other random variables would come from joint measurements of ##\eta## and those random variables or functions of those random variables. So perhaps the definition would need to be of the form "##\eta## does not contain a discontinuous function with respect to the random variable ##X## means that ......".
 
