Var(X_1 + X_2 + ...) = Var(X_1) + Var(X_2) + ... ?

  • Thread starter kingwinner
In summary, the question is whether the variance of an infinite series of independent random variables equals the sum of their individual variances. This is true for finite sums; for an infinite series it relies on the convergence of the sum on the right, which can be justified with a convergence theorem.
  • #1
kingwinner
Suppose the random variables Xi's are independent,
then is it always true that Var(X_1 + X_2 + X_3 +...) = Var(X_1) + Var(X_2) + Var(X_3)+...?


Note that I'm talking about the case of the infinite sum

[itex]\sum_{i=1}^{\infty} X_i[/itex]

I'm sure it's true for finite series, but how about infinite series?
I tried searching the internet, but can't find anything...

Any help is appreciated!
 
  • #2
kingwinner said:
Suppose the random variables Xi's are independent,
then is it always true that Var(X_1 + X_2 + X_3 +...) = Var(X_1) + Var(X_2) + Var(X_3)+...?
Well, it just comes down to whether [itex]\mathbb{E}\left[\left(\sum_{i=1}^{\infty}(X_i - \mathbb{E}[X_i])\right)^2\right]=\sum_{i=1}^{\infty}\mathbb{E}\left[(X_i - \mathbb{E}[X_i])^2\right][/itex], right? By plugging in the definition of variance, you can rewrite the variance of the sum as the expectation of a sum. And that comes down to whether you're allowed to move a limit (since that's what the infinite sum is, formally) out of a Lebesgue integral. For that you need a convergence theorem, e.g. dominated convergence or something like that, which is going to place some conditions on the X_i, but they'll not be too severe. I think (I'm too lazy to try to prove this) that if the sum on the right converges, that's probably enough to prove the convergence of the integral on the left to the same value.
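As a quick numerical sanity check (not part of the original thread — a sketch using NumPy, with three arbitrarily chosen independent distributions of known variance), one can compare the sample variance of the sum against the sum of the sample variances:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # number of Monte Carlo samples

# Three independent variables with known variances (arbitrary choices)
x1 = rng.normal(0.0, 2.0, n)    # Var = 2^2 = 4
x2 = rng.uniform(-1.0, 1.0, n)  # Var = (b - a)^2 / 12 = 1/3
x3 = rng.exponential(3.0, n)    # Var = scale^2 = 9

s = x1 + x2 + x3
var_sum = s.var()                          # variance of the sum
sum_var = x1.var() + x2.var() + x3.var()   # sum of the variances

print(var_sum, sum_var)  # both should be close to 4 + 1/3 + 9
```

The two estimates agree up to Monte Carlo noise because the sample covariances between independent draws are small; with dependent variables the cross terms would not vanish.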
 

1. What is the meaning of the equation Var(X_1 + X_2 + ...) = Var(X_1) + Var(X_2) + ... ?

The equation states that the variance of a sum of independent random variables (X_1, X_2, etc.) is equal to the sum of the variances of the individual random variables.

2. How is this equation derived?

This equation is derived by expanding the definition of variance: the variance of a sum splits into the individual variances plus pairwise covariance terms, and independence makes every covariance term zero.
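A brief sketch of the two-variable case (added for completeness; the general finite case follows by induction). Expanding the definition of variance gives

[itex]Var(X+Y) = \mathbb{E}\left[(X+Y)^2\right] - \left(\mathbb{E}[X+Y]\right)^2 = Var(X) + Var(Y) + 2\,Cov(X,Y)[/itex]

and independence implies [itex]Cov(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y] = 0[/itex], leaving just the sum of the two variances.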

3. Does this equation hold for all types of random variables?

Yes, this equation holds for any types of random variables, as long as they are independent and have finite variances. For an infinite series, the sum of the individual variances must also converge.

4. Can this equation be extended to more than two random variables?

Yes, this equation can be extended to any number of random variables, as long as they are independent and have finite variances.

5. How is this equation useful in statistical analysis?

This equation is useful in statistical analysis because it allows us to calculate the variance of a sum of random variables by simply summing the variances of each individual random variable, rather than having to calculate the variance of the entire sum. This simplifies the calculation and makes it easier to analyze data.
