Independent random variables with common expectation and variance

Summary
The discussion centers on finding the variance of the sum of independent random variables, each with a common expectation μ and variance σ². The expected value of the sum Sn is calculated as E[Sn] = nμ. The variance is derived using the formula Var[Sn] = E[Sn²] - E[Sn]², leading to the conclusion that Var[Sn] = nσ² for independent variables. The relationship between independent random variables indicates that their covariance is zero, simplifying the variance calculation. The final result confirms that the variance of the sum is directly proportional to the number of variables multiplied by their common variance.
HotMintea
Homework Statement

Suppose X_1, X_2, \ldots, X_n are independent random variables, with common expectation \mu and variance \sigma^2. Let S_n = X_1 + X_2 + \cdots + X_n. Find the variance of S_n.

The attempt at a solution

Expected value:

E[S_n] = n E[X_i] = n\mu \hspace{10 cm} (1)

Variance:

Var[S_n] = E[S_n^2] - E[S_n]^2 = E[S_n^2] - n^2 \mu^2 \hspace{7 cm} (2) # Substituted (1).

\displaystyle E[S_n^2] = E[\sum_{i=1}^n X_i^2] + 2 E[\sum_{j=1}^n\sum_{k\ >\ j}^n X_jX_k] = n E[X_i^2] + n(n - 1) E[X_jX_k] \hspace{1 cm} (3) # Expanded Sn.
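As an aside (not part of the original post), the counting in (3) can be sanity-checked numerically: squaring a sum of n terms gives n square terms plus each of the n(n-1)/2 distinct cross products counted twice. A minimal sketch with an assumed n = 4 and arbitrary values:

```python
import random

# Check: S_n^2 = sum of X_i^2 terms + 2 * (distinct cross products X_j X_k, j < k).
# n = 4 and uniform samples are assumptions chosen for illustration.
n = 4
xs = [random.random() for _ in range(n)]
sn = sum(xs)
square_terms = sum(x * x for x in xs)
cross_terms = sum(xs[j] * xs[k] for j in range(n) for k in range(j + 1, n))

assert abs(sn ** 2 - (square_terms + 2 * cross_terms)) < 1e-9
print(len([(j, k) for j in range(n) for k in range(j + 1, n)]))  # n(n-1)/2 = 6 cross products
```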

Var[ X_i ] = E[X_i^2] - E[X_i]^2 = \sigma^2\ \rightarrow\ E[X_i^2] = \sigma^2 + \mu^2 \hspace{5 cm} (4)

\displaystyle E[S_n] = n(\sigma^2+\mu^2 + (n - 1) E[X_jX_k]) \hspace{7 cm} (5) # Substituted (4) into (3).

I'm stuck here.

If I knew the covariance of Xj and Xk, then I could use the following formula:

Covar[X_j, X_k] = E[X_j X_k] - E[X_j]E[X_k]

\rightarrow\ E[X_j X_k] = Covar[X_j, X_k] + E[X_j] E[X_k] = Covar[X_j, X_k] + \mu^2 \hspace{1 cm} (6)

I suspect that "independent random variables with common expectation and variance" implies a certain relation that is necessary for this question.

Can someone give me a hint please?
 
obafgkmrns
Do independent random variables have any covariance?
 
HotMintea said:
\displaystyle E[S_n] = n(\sigma^2+\mu^2 + (n - 1) E[X_jX_k]) \hspace{7 cm} (5) # Substituted (4) into (3).

I meant E[S_n^2] =.

obafgkmrns said:
Do independent random variables have any covariance?

I found a proof that E[X]E[Y] = E[XY] when X and Y are independent random variables, which basically uses P(X, Y) = P(X)P(Y): http://webpages.dcu.ie/~applebyj/ms207/RV2.pdf
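That product rule is also easy to check empirically. A quick sketch (my addition; independent standard-normal samples are an assumption for illustration) estimates Covar[X, Y] = E[XY] - E[X]E[Y] from independent draws and finds it near zero:

```python
import random

# Empirical check that independent random variables have ~zero covariance.
random.seed(0)
N = 100_000
xs = [random.gauss(0, 1) for _ in range(N)]
ys = [random.gauss(0, 1) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

# Sample version of Covar[X, Y] = E[XY] - E[X]E[Y].
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(abs(cov) < 0.05)  # sample covariance is close to zero
```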

So independence gives Covar[X_j, X_k] = 0, and by (6), E[X_jX_k] = \mu^2. Substituting into (5): E[S_n^2] = n(\sigma^2 + \mu^2 + (n - 1)\mu^2) = n\sigma^2 + n^2\mu^2, and then (2) gives Var[S_n] = n\sigma^2.
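For anyone who wants a numerical confirmation of the result, here is a Monte Carlo sketch (my addition; the values n = 10, mu = 2, sigma = 3, and normal samples are arbitrary choices) checking that Var[S_n] comes out near n\sigma^2:

```python
import random

# Monte Carlo check of Var[S_n] = n * sigma^2 for a sum of n independent
# variables with common mean mu and variance sigma^2 (example values assumed).
random.seed(1)
n, mu, sigma, trials = 10, 2.0, 3.0, 50_000
sums = [sum(random.gauss(mu, sigma) for _ in range(n)) for _ in range(trials)]

m = sum(sums) / trials
var = sum((s - m) ** 2 for s in sums) / trials
print(m, var)  # m should be near n*mu = 20, var near n*sigma^2 = 90
```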

Thanks for the help :smile:
 
