# Find mean and variance of a random vector

1. Oct 27, 2014

### Dustinsfl

1. The problem statement, all variables and given/known data
The components of a random vector $\mathbf{X} = [X_1, X_2, \ldots, X_N]^{\intercal}$ all have the same mean $E_X[X]$ and the same variance $var(X)$. The "sample mean" random variable
$$\bar{X} = \frac{1}{N}\sum_{i = 1}^NX_i$$
is formed. If the $X_i$'s are independent, find the mean and variance of $\bar{X}$. What happens to the variance as $N\to\infty$? Does this tell you anything about the PMF of $\bar{X}$ as $N\to\infty$?

2. Relevant equations

3. The attempt at a solution
By linearity of expectation (independence is not needed for this step),
$$E[\bar{X}] = \frac{1}{N}\sum_{i = 1}^NE_{X_i}[X_i]$$
but the $X_i$'s all have the same mean, so
\begin{align*}
E[\bar{X}] &= \frac{1}{N}NE[X]\\
&= E[X]
\end{align*}
Additionally, independence implies that $cov(X_i, X_j) = 0$ for $i\neq j$, so the variance is
\begin{align*}
var(\bar{X}) = var\Big(\frac{1}{N}\sum_{i = 1}^N X_i\Big)
&= \frac{1}{N^2}\sum_{i = 1}^N var(X_i)\\
&= \frac{1}{N^2}\sum_{i = 1}^N\big(E[X_i^2] - E^2[X_i]\big)\\
&= \frac{1}{N^2}\sum_{i = 1}^N E[X_i^2] - \frac{1}{N}E^2[X]
\end{align*}
Can the variance be simplified any more? I am not sure what happens to the variance as N tends to infinity either.
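A quick Monte Carlo sanity check of these two formulas (a sketch in Python; the fair-die distribution for the $X_i$'s is just an assumption for illustration, since the thread never fixes a particular PMF):

```python
import random
import statistics

random.seed(0)

N = 10          # components per random vector
TRIALS = 20000  # number of sample means to draw

# Assumed X_i ~ uniform on {1,...,6}: E[X] = 3.5, var(X) = 35/12
mean_X, var_X = 3.5, 35 / 12

# Draw many realizations of the sample mean X_bar = (1/N) * sum(X_i)
sample_means = [
    sum(random.randint(1, 6) for _ in range(N)) / N
    for _ in range(TRIALS)
]

print(statistics.mean(sample_means))      # should be close to E[X] = 3.5
print(statistics.variance(sample_means))  # should be close to var(X)/N
```

The empirical variance of the sample means comes out near $var(X)/N \approx 0.292$, not near $var(X)$ itself, which is the $1/N^2$ factor at work.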

2. Oct 27, 2014

### Ray Vickson

You are complicating things unnecessarily. If $\mu = E\, X_i$ and $\sigma^2 = \text{Var}\, X_i$ $\forall i$, then
$$E\, \sum_{i=1}^N X_i = N \mu, \; \text{Var} \, \sum_{i=1}^N X_i = N \sigma^2\\ \text{so}\\ E\, \bar{X} = \mu, \; \text{Var}\, \bar{X} = \frac{\sigma^2}{N}.$$
Leaving the variance in this last form is the usual way these things are done, and it yields the most insight into the problem's structure.
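These identities can also be verified exactly for a tiny case by brute-force enumeration, with no simulation noise (a Python sketch; the fair-coin distribution for the $X_i$'s is an assumption chosen to keep the state space small):

```python
from itertools import product
from fractions import Fraction

# Exact check of E[X_bar] = mu and Var(X_bar) = sigma^2 / N by
# enumerating all outcomes of N independent fair coin flips (X_i in {0, 1}).
N = 4
mu = Fraction(1, 2)       # E[X_i] for a fair coin
sigma2 = Fraction(1, 4)   # Var(X_i) for a fair coin

outcomes = list(product((0, 1), repeat=N))
p = Fraction(1, len(outcomes))  # each of the 2^N outcomes is equally likely

mean = sum(p * Fraction(sum(w), N) for w in outcomes)
var = sum(p * (Fraction(sum(w), N) - mean) ** 2 for w in outcomes)

print(mean)  # 1/2  == mu
print(var)   # 1/16 == sigma2 / N
```

Because `Fraction` arithmetic is exact, the result $\text{Var}\,\bar{X} = \sigma^2/N$ holds with equality here, not just approximately.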

3. Oct 28, 2014

### Dustinsfl

OK, so that means the variance goes to zero, but what does that mean for the PMF?
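One way to see what the vanishing variance says about the distribution of $\bar{X}$ is to watch its empirical spread shrink as $N$ grows (a Python sketch; the Bernoulli(1/2) choice for the $X_i$'s is an assumption for illustration):

```python
import random
import statistics

random.seed(1)

TRIALS = 5000

# Assumed X_i ~ Bernoulli(1/2): E[X] = 0.5, var(X) = 0.25
for N in (10, 100, 1000):
    sample_means = [
        sum(random.random() < 0.5 for _ in range(N)) / N
        for _ in range(TRIALS)
    ]
    # As N grows, var(X_bar) = var(X)/N -> 0: the PMF of X_bar
    # piles up more and more tightly around E[X] = 0.5
    print(N, statistics.variance(sample_means))
```

The printed variances drop roughly tenfold at each step, matching $\sigma^2/N$: the mass of the PMF concentrates around the common mean.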