Var(X_1 + X_2 + ...) = Var(X_1) + Var(X_2) + ...?

  • Context: Graduate
  • Thread starter: kingwinner
SUMMARY

The discussion centers on whether the variance formula for sums of independent random variables extends to infinite series, i.e. whether Var(X_1 + X_2 + X_3 + ...) = Var(X_1) + Var(X_2) + Var(X_3) + ... . The identity holds for finite sums by independence; for an infinite sum it additionally requires justifying the interchange of a limit with the Lebesgue integral defining the variance, via a convergence theorem such as the Dominated Convergence Theorem. The conversation emphasizes that understanding when limits can be moved inside Lebesgue integrals is the key to validating the variance equation for infinite sums.

PREREQUISITES
  • Understanding of variance in probability theory
  • Familiarity with independent random variables
  • Knowledge of Lebesgue integration
  • Concepts of convergence theorems, particularly Dominated Convergence Theorem
NEXT STEPS
  • Study the Dominated Convergence Theorem in detail
  • Learn about Lebesgue integration and its applications in probability
  • Explore the properties of variance for infinite series of random variables
  • Investigate convergence criteria for series of random variables
USEFUL FOR

Mathematicians, statisticians, and data scientists who are working with probability theory, particularly those dealing with the properties of random variables and their sums.

kingwinner
Suppose the random variables X_i are independent.
Is it then always true that Var(X_1 + X_2 + X_3 + ...) = Var(X_1) + Var(X_2) + Var(X_3) + ...?


Note that I'm talking about the case of

\sum_{i=1}^{\infty} X_i

I'm sure it's true for finite sums, but how about infinite series?
I tried searching the internet, but couldn't find anything...

Any help is appreciated!
 
kingwinner said:
Suppose the random variables Xi's are independent,
then is it always true that Var(X_1 + X_2 + X_3 +...) = Var(X_1) + Var(X_2) + Var(X_3)+...?
Well, it just comes down to whether Var(X_1 + X_2 + ...) = \lim_{n\to\infty} Var(X_1 + ... + X_n), right? For each finite n, independence gives Var(X_1 + ... + X_n) = Var(X_1) + ... + Var(X_n), so by plugging in the definition of variance the question becomes whether you're allowed to move a limit (since that's what the infinite sum is, formally) out of a Lebesgue integral. For that you need a convergence theorem, e.g. dominated convergence or something like that, which is going to place some conditions on the X_i, but they won't be too severe. I suspect (I'm too lazy to try to prove this) that if the sum on the right converges, that's probably enough to prove the convergence of the integral on the left to the same value.
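A quick numerical sanity check (not a proof) of the claim above: take a concrete family of independent variables whose variances form a convergent series and compare the sample variance of a long partial sum against the series of variances. The choice X_i ~ Normal(0, 2^{-i}) is purely an assumption for illustration; any independent family with summable variances would do.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: X_i ~ Normal(0, 2^-i), independent,
# so Var(X_i) = 2^-i and the variances sum to 1 - 2^-n_terms.
n_terms = 30
n_samples = 200_000
variances = 0.5 ** np.arange(1, n_terms + 1)

# Each column i holds samples of X_i; summing across columns gives
# samples of the partial sum S_n, which approximates the infinite sum.
X = rng.normal(0.0, np.sqrt(variances), size=(n_samples, n_terms))
S = X.sum(axis=1)

theoretical = variances.sum()   # sum of the individual variances
empirical = S.var()             # sample variance of the partial sum
print(theoretical, empirical)
```

With 200,000 samples the sample variance of S should land within a percent or two of the series of variances, consistent with Var of the sum equaling the sum of the Var's in this convergent case.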
 
