# Independent random variables with common expectation and variance

## Homework Statement

Suppose $X_1, X_2, \dots, X_n$ are independent random variables with common expectation $\mu$ and variance $\sigma^2$. Let $S_n = X_1 + X_2 + \cdots + X_n$. Find the variance of $S_n$.

## The attempt at a solution

Expected value:

$E[S_n] = n E[X_i] = n\mu$ (1)

Variance:

$Var[S_n] = E[S_n^2] - E[S_n]^2 = E[S_n^2] - n^2 \mu^2$ (2) # Substituted (1).

$\displaystyle E[S_n^2] = E\Big[\sum_{i=1}^n X_i^2\Big] + 2 E\Big[\sum_{j=1}^n\sum_{k>j}^n X_jX_k\Big] = n E[X_i^2] + n(n - 1) E[X_jX_k]$ (3) # Expanded $S_n^2$.
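The cross-term factor in (3) comes from there being $\binom{n}{2} = n(n-1)/2$ pairs $(j, k)$ with $j < k$, which the leading 2 doubles to $n(n-1)$. A quick sketch (my own addition, not part of the original post; $n = 5$ is an arbitrary example) confirms the count:

```python
n = 5

# Enumerate the pairs (j, k) with j < k that appear in the double sum in (3).
pairs = [(j, k) for j in range(1, n + 1) for k in range(j + 1, n + 1)]

print(len(pairs), n * (n - 1) // 2)  # both are C(5, 2) = 10
```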

$Var[X_i] = E[X_i^2] - E[X_i]^2 = \sigma^2\ \rightarrow\ E[X_i^2] = \sigma^2 + \mu^2$ (4)

$\displaystyle E[S_n^2] = n(\sigma^2+\mu^2 + (n - 1) E[X_jX_k])$ (5) # Substituted (4) into (3).

I'm stuck here.

If I knew the covariance of $X_j$ and $X_k$, then I could use the following formula:

$Covar[X_j, X_k] = E[X_j X_k] - E[X_j]E[X_k]$

$\rightarrow\ E[X_j X_k] = Covar[X_j, X_k] + E[X_j] E[X_k] = Covar[X_j, X_k] + \mu^2$ (6)
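For independent variables the covariance in (6) is zero, so $E[X_j X_k] = \mu^2$. A simulation illustrates this (a sketch of my own; the Exp(1) distribution, for which $\mu = 1$, and the trial count are arbitrary choices, not from the thread):

```python
import random

random.seed(1)
trials = 500_000
mu = 1.0  # mean of Exp(1), so mu^2 = 1

# Draw independent pairs (X_j, X_k); estimate E[X_j X_k] and Covar[X_j, X_k].
sum_jk = sum_j = sum_k = 0.0
for _ in range(trials):
    xj = random.expovariate(1.0)
    xk = random.expovariate(1.0)
    sum_jk += xj * xk
    sum_j += xj
    sum_k += xk

e_jk = sum_jk / trials
cov = e_jk - (sum_j / trials) * (sum_k / trials)
print(e_jk, cov)  # close to mu^2 = 1 and to 0, respectively
```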

I suspect that "independent random variables with common expectation and variance" implies a certain relation that is necessary for this question.

Can someone give me a hint please?


Do independent random variables have any covariance?


> Do independent random variables have any covariance?
I found a proof that $E[X]E[Y] = E[XY]$ for independent random variables X and Y, which basically uses $P(X,Y) = P(X)P(Y)$: http://webpages.dcu.ie/~applebyj/ms207/RV2.pdf

So, since independence gives $Covar[X_j, X_k] = 0$, (6) reduces to $E[X_jX_k] = \mu^2$. Substituting into (5) gives $E[S_n^2] = n\sigma^2 + n^2\mu^2$, and then (2) yields $Var[S_n] = n\sigma^2$.
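The conclusion $Var[S_n] = n\sigma^2$ is easy to check by Monte Carlo (again my own sketch; the values $n = 10$, $\mu = 0.5$, $\sigma = 2$, the normal distribution, and the trial count are assumptions for illustration):

```python
import random

random.seed(42)
n, trials = 10, 200_000
mu, sigma = 0.5, 2.0

# Sample S_n = X_1 + ... + X_n repeatedly and estimate its variance.
total = total_sq = 0.0
for _ in range(trials):
    s = sum(random.gauss(mu, sigma) for _ in range(n))
    total += s
    total_sq += s * s

mean = total / trials
var = total_sq / trials - mean * mean
print(var)  # close to n * sigma^2 = 40
```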

Thanks for the help!