Variance of the sum of random independent variables

In summary: the variance of the x_i's is known, and each y is a sum of N of them; the thread works out the variance of y.
  • #1
dipole

Homework Statement



Let [itex]x_{i}[/itex] be a random variable, and let [itex] y_{j} = \sum x_{i}[/itex].

The variance of the distribution of the [itex]x_{i}[/itex]'s is known, and each y is the sum of an equal number of [itex]x_{i}[/itex]'s, say N of them.

How do I compute the variance of y in terms of [itex] \sigma^2_{x} [/itex] and N?

Homework Equations



[itex] \sigma^2_{y} = \sum\frac{(y - \mu_{y})^2}{M} [/itex]
 
  • #2
Also forgot to mention that I already know that [itex] \mu_{y} = N\mu_{x}[/itex].
 
  • #3
dipole said:

Homework Statement



Let [itex]x_{i}[/itex] be a random variable, and let [itex] y_{j} = \sum x_{i}[/itex].

The variance of the distribution of the [itex]x_{i}[/itex]'s is known, and each y is the sum of an equal number of [itex]x_{i}[/itex]'s, say N of them.

How do I compute the variance of y in terms of [itex] \sigma^2_{x} [/itex] and N?

Homework Equations



[itex] \sigma^2_{y} = \sum\frac{(y - \mu_{y})^2}{M} [/itex]

I don't understand your formula for [itex] \sigma^2_{y}[/itex], which would be false for every probability distribution I can think of. (It assumes Y is a uniformly-distributed random variable taking M distinct values.)

Start with the simple case N=2: [itex]Y = X_1 + X_2,[/itex] where [itex] X_1,\; X_2[/itex] are independent. Once you have done that case, the general case follows almost immediately.
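The N=2 hint above can be sanity-checked numerically. This is a quick simulation sketch, not part of the derivation: the distributions (two independent uniforms) and sample sizes are illustrative assumptions, chosen only so the empirical variance of Y = X1 + X2 can be compared against Var(X1) + Var(X2).

```python
# Numerical check: for independent X1, X2, the variance of Y = X1 + X2
# should equal Var(X1) + Var(X2). Uses Uniform(0,1) (variance 1/12)
# and Uniform(0,2) (variance 4/12) as arbitrary example distributions.
import random

random.seed(0)
n = 200_000
x1 = [random.uniform(0, 1) for _ in range(n)]
x2 = [random.uniform(0, 2) for _ in range(n)]
y = [a + b for a, b in zip(x1, x2)]

def variance(v):
    """Population variance: mean squared deviation from the sample mean."""
    m = sum(v) / len(v)
    return sum((t - m) ** 2 for t in v) / len(v)

print(variance(y))   # should be close to 1/12 + 4/12 = 5/12
```

With a large sample the empirical variance of y lands near 5/12 ≈ 0.417, matching the sum of the two individual variances.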

RGV
 

FAQ: Variance of the sum of random independent variables

1. What is the definition of variance of the sum of random independent variables?

The variance of the sum of independent random variables measures how much that sum varies around its expected value. It quantifies the spread of the distribution of the sum.

2. How is the variance of the sum of random independent variables calculated?

In general, the variance of a sum of random variables is the sum of the individual variances plus twice the covariance of each pair of variables; covariance measures how two variables change together. When the variables are independent, every covariance term is zero, so the variance of the sum is simply the sum of the variances. For N independent variables each with variance [itex]\sigma^2_{x}[/itex], this gives [itex]\sigma^2_{y} = N\sigma^2_{x}[/itex].

3. Why is the variance of the sum of random independent variables important?

The variance of the sum of random independent variables is important because it allows us to understand the variability of the combined effect of multiple variables. This is particularly useful in fields such as statistics, economics, and engineering, where the combined effect of multiple variables often needs to be analyzed.

4. Can the variance of the sum of random independent variables ever be negative?

No. Variance is the expected squared deviation from the mean, so it is always non-negative; this holds for the sum of independent random variables just as for any other random variable.

5. How does the central limit theorem relate to the variance of the sum of random independent variables?

The central limit theorem states that the sum of a large number of independent random variables, once centered and rescaled, tends toward a normal distribution. The variance of the raw sum does not decrease; for N independent variables it grows as [itex]N\sigma^2_{x}[/itex]. That growth is exactly why the sum must be standardized (centered at [itex]N\mu_{x}[/itex] and divided by [itex]\sqrt{N}\sigma_{x}[/itex]) to obtain a limiting distribution, and it is why the variance of a sum of independent variables is so widely used to approximate the distribution of aggregates in statistical analysis.
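The CLT statement above can be illustrated with a sketch: standardize a sum of N uniforms and check that roughly 68% of samples fall within one standard deviation, as they would for a standard normal. The choice of N = 30 and the uniform distribution are assumptions for illustration only.

```python
# CLT illustration: Y = sum of N Uniform(0,1) variables has mean N/2 and
# variance N/12. Standardizing Y gives samples that behave approximately
# like a standard normal, for which P(|Z| < 1) ≈ 0.683.
import math
import random

random.seed(2)
N, M = 30, 50_000
mu = N * 0.5            # mean of the sum
sd = math.sqrt(N / 12)  # standard deviation of the sum

z = [(sum(random.random() for _ in range(N)) - mu) / sd for _ in range(M)]
frac = sum(1 for v in z if abs(v) < 1) / M
print(frac)   # close to 0.683
```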
