Independent random variables with common expectation and variance

SUMMARY

The variance of the sum of independent random variables, Sn = X1 + X2 + ... + Xn, each with a common expectation μ and variance σ², is calculated as Var[Sn] = nσ². This conclusion is derived from the properties of independent random variables, where the covariance between any two distinct variables is zero. The discussion emphasizes the importance of understanding the relationship between expectation, variance, and independence in probability theory.

PREREQUISITES
  • Understanding of independent random variables
  • Knowledge of expectation and variance in probability theory
  • Familiarity with covariance and its implications
  • Basic algebraic manipulation of equations
NEXT STEPS
  • Study the properties of independent random variables in depth
  • Learn about the Central Limit Theorem and its applications
  • Explore the concept of covariance and its role in statistics
  • Investigate the implications of variance in real-world scenarios
USEFUL FOR

Students and professionals in statistics, data science, and mathematics who are looking to deepen their understanding of random variables, their properties, and their applications in statistical analysis.

HotMintea
Homework Statement

Suppose X_1, X_2, ..., X_n are independent random variables with common expectation μ and variance σ^2. Let S_n = X_1 + X_2 + ... + X_n. Find the variance of S_n.

The attempt at a solution

Expected value:

E[S_n] = n E[X_i] = n\mu \qquad (1), by linearity of expectation.

Variance:

Var[S_n] = E[S_n^2] - E[S_n]^2 = E[S_n^2] - n^2 \mu^2 \qquad (2), substituting (1).

E[S_n^2] = E\Big[\sum_{i=1}^n X_i^2\Big] + 2 E\Big[\sum_{j=1}^{n-1} \sum_{k=j+1}^{n} X_j X_k\Big] = n E[X_i^2] + n(n-1) E[X_j X_k] \qquad (3), expanding S_n^2.
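As a concrete check of the expansion, the n = 2 case reads:

S_2^2 = (X_1 + X_2)^2 = X_1^2 + X_2^2 + 2 X_1 X_2 \ \Rightarrow\ E[S_2^2] = 2 E[X_i^2] + 2 E[X_1 X_2]

which matches (3) with n = 2.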

Var[X_i] = E[X_i^2] - E[X_i]^2 = \sigma^2 \ \Rightarrow\ E[X_i^2] = \sigma^2 + \mu^2 \qquad (4)

E[S_n^2] = n\big(\sigma^2 + \mu^2 + (n-1) E[X_j X_k]\big) \qquad (5), substituting (4) into (3).

I'm stuck here.

If I knew the covariance of X_j and X_k, then I could use the following formula:

Cov[X_j, X_k] = E[X_j X_k] - E[X_j] E[X_k]

\Rightarrow\ E[X_j X_k] = Cov[X_j, X_k] + E[X_j] E[X_k] = Cov[X_j, X_k] + \mu^2 \qquad (6)
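Substituting (6) into (5) and the result into (2), the \mu^2 terms collect, leaving an identity that holds with or without independence:

Var[S_n] = n\sigma^2 + n(n-1)\, Cov[X_j, X_k]

so everything hinges on the value of Cov[X_j, X_k].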

I suspect that "independent random variables with common expectation and variance" implies a certain relation that is necessary for this question.

Can someone give me a hint please?
 
obafgkmrns
Do independent random variables have any covariance?
 

obafgkmrns said:
Do independent random variables have any covariance?

I found a proof that E[X]E[Y] = E[XY] when X and Y are independent random variables, which basically uses P(X,Y) = P(X)P(Y): http://webpages.dcu.ie/~applebyj/ms207/RV2.pdf
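For discrete X and Y the argument is short: independence factors the joint probability, so the double sum factors as well:

E[XY] = \sum_x \sum_y x y \, P(X = x, Y = y) = \sum_x \sum_y x y \, P(X = x) P(Y = y) = \Big( \sum_x x P(X = x) \Big) \Big( \sum_y y P(Y = y) \Big) = E[X] E[Y]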

So Cov[X_j, X_k] = 0, and (6) gives E[X_j X_k] = \mu^2. Substituting into (5) and then into (2):

Var[S_n] = n(\sigma^2 + \mu^2 + (n - 1)\mu^2) - n^2 \mu^2 = n\sigma^2 + n^2 \mu^2 - n^2 \mu^2 = n\sigma^2

Thanks for the help!
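As a numerical sanity check, here is a minimal Monte Carlo sketch (assuming NumPy; the exponential distribution, n, trial count, and seed are arbitrary illustrative choices):

import numpy as np

# Monte Carlo check of Var[S_n] = n * sigma^2 for independent X_i with
# a common mean and variance. Illustrative choice: X_i ~ Exponential
# with scale 2, so mu = 2 and sigma^2 = 4.
rng = np.random.default_rng(0)
n, trials = 10, 200_000
scale = 2.0          # exponential: mean = scale, variance = scale**2
sigma2 = scale**2

# Each row is one draw of (X_1, ..., X_n); row sums realize S_n.
x = rng.exponential(scale=scale, size=(trials, n))
s_n = x.sum(axis=1)

print("empirical Var[S_n]: ", s_n.var())   # close to 40
print("predicted n*sigma^2:", n * sigma2)  # 40.0

The empirical variance lands near n*sigma^2 = 40, in line with the derivation above.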
 
