Independent random variables with common expectation and variance

AI Thread Summary
The discussion centers on finding the variance of the sum of independent random variables, each with a common expectation μ and variance σ². The expected value of the sum Sn is calculated as E[Sn] = nμ. The variance is derived using the formula Var[Sn] = E[Sn²] - E[Sn]², leading to the conclusion that Var[Sn] = nσ² for independent variables. The relationship between independent random variables indicates that their covariance is zero, simplifying the variance calculation. The final result confirms that the variance of the sum is directly proportional to the number of variables multiplied by their common variance.
HotMintea
Homework Statement

Suppose ##X_1, X_2, \ldots, X_n## are independent random variables with common expectation ##\mu## and variance ##\sigma^2##. Let ##S_n = X_1 + X_2 + \cdots + X_n##. Find the variance of ##S_n##.

The attempt at a solution

Expected value:

$$E[S_n] = n E[X_i] = n\mu \qquad (1)$$

Variance:

$$Var[S_n] = E[S_n^2] - E[S_n]^2 = E[S_n^2] - n^2\mu^2 \qquad (2)$$ (substituting (1))

$$E[S_n^2] = E\Big[\sum_{i=1}^n X_i^2\Big] + 2\, E\Big[\sum_{j=1}^n \sum_{k > j}^n X_j X_k\Big] = n E[X_i^2] + n(n-1) E[X_j X_k] \qquad (3)$$ (expanding ##S_n^2##)

$$Var[X_i] = E[X_i^2] - E[X_i]^2 = \sigma^2 \ \rightarrow\ E[X_i^2] = \sigma^2 + \mu^2 \qquad (4)$$

$$E[S_n] = n(\sigma^2 + \mu^2 + (n - 1) E[X_j X_k]) \qquad (5)$$ (substituting (4) into (3))

I'm stuck here.

If I knew the covariance of ##X_j## and ##X_k##, then I could use the following formula:

$$Cov[X_j, X_k] = E[X_j X_k] - E[X_j]E[X_k]$$

$$\rightarrow\ E[X_j X_k] = Cov[X_j, X_k] + E[X_j] E[X_k] = Cov[X_j, X_k] + \mu^2 \qquad (6)$$

I suspect that "independent random variables with common expectation and variance" implies a certain relation that is necessary for this question.

Can someone give me a hint please?
 
Do independent random variables have any covariance?
 
HotMintea said:
$$E[S_n] = n(\sigma^2 + \mu^2 + (n - 1) E[X_j X_k]) \qquad (5)$$ (substituting (4) into (3))

I meant ##E[S_n^2]## on the left-hand side of (5).

obafgkmrns said:
Do independent random variables have any covariance?

I found a proof that ##E[X]E[Y] = E[XY]## when ##X## and ##Y## are independent random variables, which basically uses ##P(X, Y) = P(X)P(Y)##: http://webpages.dcu.ie/~applebyj/ms207/RV2.pdf
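
Written out: independence makes ##Cov[X_j, X_k] = 0##, so (6) gives ##E[X_j X_k] = \mu^2##, and substituting into (5) and (2):

$$E[S_n^2] = n\big(\sigma^2 + \mu^2 + (n - 1)\mu^2\big) = n\sigma^2 + n^2\mu^2$$

$$Var[S_n] = E[S_n^2] - n^2\mu^2 = n\sigma^2$$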

So now I get that ##Var[S_n] = n\sigma^2##.

Thanks for the help :smile:
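
As a quick numerical sanity check of ##Var[S_n] = n\sigma^2## (a sketch, not part of the original thread; the normal distribution and the values of ##n##, ##\mu##, ##\sigma## are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10            # number of summands
trials = 200_000  # number of simulated realizations of S_n
mu, sigma = 3.0, 2.0

# Draw i.i.d. X_i ~ Normal(mu, sigma); the derivation only uses
# independence and the common mean/variance, so any distribution
# with these moments would give the same limits.
samples = rng.normal(mu, sigma, size=(trials, n))
s_n = samples.sum(axis=1)

print(s_n.mean())  # close to n * mu = 30
print(s_n.var())   # close to n * sigma**2 = 40
```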
 