Sum of independent normal variables

SUMMARY

The discussion confirms that a finite sum of independent normal random variables is normally distributed, while noting that independence is sufficient but not necessary: jointly normal variables, even when correlated, also have a normal sum. Counterexamples such as ##\eta=\sigma\xi##, where an independent random sign is attached to a normal variable, show that uncorrelated normal variables alone do not guarantee a normal sum. The Central Limit Theorem (CLT) is referenced, asserting that suitably normalized sums of i.i.d. random variables with finite variance converge in distribution to a normal. The conversation also touches on transformations that preserve normality and on vector spaces of random variables with bases of mutually independent variables.

PREREQUISITES
  • Understanding of normal distributions, specifically the properties of normal random variables.
  • Familiarity with the Central Limit Theorem (CLT) and its implications for sums of random variables.
  • Knowledge of linear transformations and their effects on probability distributions.
  • Basic concepts of covariance and its role in defining relationships between random variables.
NEXT STEPS
  • Study the Central Limit Theorem in detail, focusing on its applications in statistics.
  • Explore generating functions and moment-generating functions (MGFs) for proofs related to normal distributions.
  • Investigate the properties of joint distributions of correlated normal random variables.
  • Research transformations that preserve normality and their mathematical implications.
USEFUL FOR

Statisticians, data scientists, and mathematicians interested in the behavior of sums of random variables, particularly in the context of normal distributions and their applications in statistical modeling.

  • #31
BWV said:
but you have to define normal by some generating function.

A normal random variable ##\eta## can, in many different ways, be represented as the sum of two other independent normal random variables.

I don't understand how you want to restrict the types of representations of ##\eta## that are permitted.
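As a minimal numpy sketch of this non-uniqueness (the target parameters ##\mu=1##, ##\sigma^2=4## and the split parameter ##t=0.3## below are arbitrary choices, not taken from the thread): for any ##t\in(0,1)##, ##\eta\sim N(\mu,\sigma^2)## can be realized as the sum of independent ##N(t\mu,\,t\sigma^2)## and ##N((1-t)\mu,\,(1-t)\sigma^2)## variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
mu, var = 1.0, 4.0                        # target law: eta ~ N(mu, var)

t = 0.3                                   # any t in (0, 1) gives a different split
x = rng.normal(t * mu, np.sqrt(t * var), n)
y = rng.normal((1 - t) * mu, np.sqrt((1 - t) * var), n)
eta = x + y                               # means and variances add for independent normals

print(eta.mean(), eta.var())              # ~1.0 and ~4.0
```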
 
  • #32
  • #33
mathman said:
Added note: These random variables are uncorrelated.
Is that correct? The intention of the ##\eta=\sigma\xi## example is to randomly flip the sign on half of ##\xi##, which does not change the distribution, but half of ##\eta## is perfectly negatively correlated with ##\xi##, and this information is recovered in ##P(\xi+\eta=0)=P(\sigma=-1)=1/2##.

ISTM there are problems with this construction, but they are above my pay grade. ##\xi## is a function on ##\mathbb{R}##, while ##\sigma## is discrete. How would you actually flip the sign randomly on half of ##\mathbb{R}##? Flipping the sign at only countably many points would not change the distribution, and if ##\sigma## flips the sign on an uncountable set, then how is ##P(\xi+\eta=0)## not ##0##?
 
  • #34
BWV said:
How would you actually flip the sign randomly on half of ##\mathbb{R}##?
Why would that be necessary? If we grant that you can take a random sample from ##\xi##, then to realize a sample of ##\sigma\xi## you only need to decide whether to flip the sign of that particular sample, using one realization of ##\sigma##.

The probability of any particular realization of a sample from a normally distributed random variable is zero, so if zero probability events are going to be a conceptual problem, they are already present in the idea of taking a sample from ##\xi##.
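A simulation of exactly this sampling procedure (a minimal numpy sketch; the sample size and seed are arbitrary): draw ##\xi##, draw ##\sigma##, multiply. The marginal of ##\eta=\sigma\xi## is still standard normal, yet the sum ##\xi+\eta## has a point mass at ##0## and so cannot be normal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

xi = rng.normal(size=n)                  # xi ~ N(0, 1)
sigma = rng.choice([-1.0, 1.0], n)       # independent fair sign flip
eta = sigma * xi                         # also N(0, 1): a sign flip preserves the law

s = xi + eta                             # equals 0 exactly whenever sigma == -1
print(np.mean(s == 0))                   # ~0.5: point mass at 0, so the sum is not normal
```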
 
  • #35
BWV said:
Is that correct? The intention of the ##\eta=\sigma\xi## example is to randomly flip the sign on half of ##\xi##, which does not change the distribution, but half of ##\eta## is perfectly negatively correlated with ##\xi##, and this information is recovered in ##P(\xi+\eta=0)=P(\sigma=-1)=1/2##.

ISTM there are problems with this construction, but they are above my pay grade. ##\xi## is a function on ##\mathbb{R}##, while ##\sigma## is discrete. How would you actually flip the sign randomly on half of ##\mathbb{R}##? Flipping the sign at only countably many points would not change the distribution, and if ##\sigma## flips the sign on an uncountable set, then how is ##P(\xi+\eta=0)## not ##0##?
The covariance is ##E(\xi\eta)=E(\sigma\xi^2)=E(\sigma)E(\xi^2)=0##, since ##\sigma## is independent of ##\xi## and ##E(\sigma)=0##; because both means are ##0##, the covariance reduces to ##E(\xi\eta)##.
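A quick empirical check of this computation (restating the construction so the snippet is self-contained):

```python
import numpy as np

rng = np.random.default_rng(2)
xi = rng.normal(size=1_000_000)                 # xi ~ N(0, 1)
eta = rng.choice([-1.0, 1.0], xi.size) * xi     # eta = sigma * xi

# Both means are 0, so Cov(xi, eta) = E(xi * eta) = E(sigma) E(xi^2) = 0.
print(np.cov(xi, eta)[0, 1])                    # ~0
```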
 
  • #36
mathman said:
The covariance is ##E(\xi\eta)=E(\sigma\xi^2)=E(\sigma)E(\xi^2)=0##, since ##\sigma## is independent of ##\xi## and ##E(\sigma)=0##; because both means are ##0##, the covariance reduces to ##E(\xi\eta)##.
So this is a good example to show that even uncorrelated is not enough to guarantee that the sum is normal. It requires independence.
 
  • #37
FactChecker said:
So this is a good example to show that even uncorrelated is not enough to guarantee that the sum is normal. It requires independence.
Independence is not necessary. If the variables are jointly normal, even if correlated, the sum will be normal.
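A sketch of this claim (nothing here is from the thread beyond the claim itself; the correlation ##\rho=-0.7## is an arbitrary choice): sample a correlated bivariate normal and check that the sum behaves like ##N(0,\,2(1+\rho))##.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 200_000

rho = -0.7                                 # correlation between the two components
cov = np.array([[1.0, rho], [rho, 1.0]])
x, y = rng.multivariate_normal([0.0, 0.0], cov, n).T

s = x + y                                  # jointly normal, so the sum is normal
print(s.var(), 2 * (1 + rho))              # empirical variance vs. theoretical 2(1 + rho)
print(stats.normaltest(s).pvalue)          # typically large: consistent with normality
```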
 
  • #38
mathman said:
Independence is not necessary. If the variables are jointly normal, even if correlated, the sum will be normal.
I stand corrected. I should have just said that uncorrelated is not enough.
 
  • #39
Stephen Tashi said:
Are there theorems that deal with the general question? Given a set of random variables ##\{X_1, X_2, \ldots, X_M\}##, when does there exist some subspace ##S## of the vector space of random variables such that ##S## contains each ##X_i## and ##S## has a basis of mutually independent random variables?
"Linearly independent" presumably implies we are considering a set of random variables to be a vector space under the operations of multiplication by scalars and addition of the random variables.
Suppose we take "transformed" to mean transformed by a linear transformation in a vector space. If the vector space containing the random variables ##\{X_1, X_2, \ldots, X_M\}## has a finite basis ##B_1, B_2, \ldots, B_n## consisting of mutually independent random variables, then (trivially) for each ##X_k## there exists a possibly non-invertible linear transformation ##T## that transforms some linear combination of the ##B_i## into ##X_k##.

If the smallest vector space containing the ##X_i## is infinite dimensional (e.g., the vector space of all measurable functions on the real line), I don't know what happens.

I don't recall any texts that focus on vector spaces of random variables. Since the product of random variables is also a random variable, the topic for textbooks seems to be the algebra of random variables. But that approach downplays the concept of probability distributions.
It would be interesting to see whether there is a way to "decouple" a basis for such a space, so that any two ##X_i, X_j## are independent but the span is preserved.
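In the jointly Gaussian case such a decoupling is available by a plain linear change of basis, since uncorrelated jointly Gaussian variables are independent. A minimal numpy sketch of the idea (the mixing matrix ##A## below is an arbitrary invertible choice): diagonalize the covariance with an orthogonal matrix, which preserves the span.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

# Three correlated jointly Gaussian variables built from independent sources.
A = np.array([[1.0, 0.5, 0.0],
              [0.2, 1.0, 0.3],
              [0.0, 0.4, 1.0]])
x = A @ rng.normal(size=(3, n))            # rows are X1, X2, X3

# The orthogonal change of basis v.T is invertible, so the span is preserved;
# the new coordinates are uncorrelated, hence independent for a Gaussian vector.
w, v = np.linalg.eigh(np.cov(x))
b = v.T @ x                                # decoupled basis B1, B2, B3

print(np.round(np.cov(b), 3))              # ~diagonal matrix
```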
 
  • #40
WWGD said:
It would be interesting to see whether there is a way to "decouple" a basis for such a space, so that any two ##X_i, X_j## are independent but the span is preserved.
That would require a change of the independent variables to a different set.
 
