Variance of a summation of Gaussians

  • Thread starter: SeriousNoob
  • Tags: Summation, Variance
SUMMARY

The discussion focuses on the variance of a summation of Gaussian random variables, specifically the equation var(1/N ∑ w[n]) = (1/N²) ∑ var(w[n]), where w[n] is a Gaussian random variable with mean 0 and variance 1. Key principles utilized include the properties of variance for scaled and uncorrelated random variables. The participant confirms that the mean remains 0 due to the nature of Gaussian summation and acknowledges the straightforward proofs of the variance properties involved.

PREREQUISITES
  • Understanding of Gaussian random variables
  • Familiarity with variance properties
  • Knowledge of probability theory
  • Basic mathematical proof techniques
NEXT STEPS
  • Study the properties of variance in detail, particularly for random variables
  • Learn about the Central Limit Theorem and its implications for Gaussian distributions
  • Explore the concept of uncorrelated random variables and their variance
  • Review mathematical proofs related to variance and expectation in probability
USEFUL FOR

Students and professionals in statistics, data science, or any field involving probability theory, particularly those working with Gaussian distributions and variance calculations.

SeriousNoob

Homework Statement


I am trying to follow a step in the textbook, but I don't understand it:

\operatorname{var}\left(\frac{1}{N}\sum_{n=0}^{N-1}w[n]\right) = \frac{1}{N^2}\sum_{n=0}^{N-1}\operatorname{var}(w[n])
where w[n] is a Gaussian random variable with mean = 0 and variance = 1

Homework Equations



\operatorname{Var}(X) = \operatorname{E}\left[X^2\right] - \left(\operatorname{E}[X]\right)^2.

The Attempt at a Solution


The mean is 0 because a sum of Gaussians is Gaussian, and the mean of the sum is the sum of the means.
But squaring the whole expression doesn't seem right; there seems to be a trick used to go from the first line to the second.
 
Two facts are being used here:

1. If X is a random variable and c is a constant, then \text{var}(cX) = c^2\text{var}(X).
2. If X and Y are uncorrelated random variables, then \text{var}(X + Y) = \text{var}(X) + \text{var}(Y). From this, it's an easy induction to handle the sum of N uncorrelated random variables.
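Applying both facts to the sum at hand (a sketch of the missing step, with c = 1/N and the N uncorrelated variables w[0], …, w[N-1], each with variance 1):

\operatorname{var}\left(\frac{1}{N}\sum_{n=0}^{N-1}w[n]\right) = \frac{1}{N^2}\operatorname{var}\left(\sum_{n=0}^{N-1}w[n]\right) = \frac{1}{N^2}\sum_{n=0}^{N-1}\operatorname{var}(w[n]) = \frac{N}{N^2} = \frac{1}{N}.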

Both of these facts are straightforward to prove and should be found in any probability book.
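Not from the original thread, but as a quick numerical sanity check: a short NumPy simulation (the variable names `N` and `trials` are illustrative choices) should show that the sample mean of N independent standard Gaussians has variance close to 1/N.

```python
import numpy as np

# Monte Carlo check: var((1/N) * sum of N iid N(0,1) variables) should be 1/N.
rng = np.random.default_rng(0)
N = 50            # number of Gaussians per sum
trials = 200_000  # number of independent sample means

w = rng.standard_normal((trials, N))  # each w[n] ~ N(0, 1)
sample_means = w.mean(axis=1)         # (1/N) * sum_{n} w[n], one per trial

print(sample_means.var())  # close to 1/N = 0.02
```

With 200,000 trials the empirical variance lands within a few percent of the theoretical value 1/N.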
 
Thanks a lot. Haven't touched random variables for a while and the summation threw me off.

The proofs for those facts are indeed very straightforward.
 
