Find mean and variance of a random vector

Dustinsfl

Homework Statement


The components of a random vector ##\mathbf{X} = [X_1, X_2, \ldots, X_N]^{\intercal}## all have the same mean ##E_X[X]## and the same variance ##var(X)##. The "sample mean" random variable
$$
\bar{X} = \frac{1}{N}\sum_{i = 1}^NX_i
$$
is formed. If the ##X_i##'s are independent, find the mean and variance of ##\bar{X}##. What happens to the variance as ##N\to\infty##? Does this tell you anything about the PMF of ##\bar{X}## as ##N\to\infty##?

Homework Equations

The Attempt at a Solution


By linearity of expectation (independence isn't actually needed for the mean),
$$
E[\bar{X}] = \frac{1}{N}\sum_{i = 1}^NE[X_i]
$$
but the ##X_i##'s all have the same mean, so
\begin{align*}
E[\bar{X}] &= \frac{1}{N}NE[X]\\
&= E[X]
\end{align*}
Additionally, independence implies that ##cov(X_i, X_j) = 0## for ##i\neq j## so the variance is
\begin{align*}
var(\bar{X}) = var\Big(\frac{1}{N}\sum_{i = 1}^NX_i\Big)
&= \frac{1}{N^2}\sum_{i = 1}^Nvar(X_i)\\
&= \frac{1}{N^2}\sum_{i = 1}^N\big(E[X_i^2] - E^2[X_i]\big)\\
&= \frac{1}{N^2}\Big(\sum_{i = 1}^NE[X_i^2] - NE^2[X]\Big)
\end{align*}
Can the variance be simplified any more? I am not sure what happens to the variance as N tends to infinity either.
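As a sanity check on the derivation above, here is a quick Monte Carlo sketch. The distribution is an assumption purely for illustration (the problem doesn't specify one): a fair six-sided die, so ##E[X] = 3.5## and ##var(X) = 35/12##.

```python
import numpy as np

# Monte Carlo check of the sample-mean derivation above.
# Assumed example distribution (not from the problem): a fair
# six-sided die, so E[X] = 3.5 and var(X) = 35/12.
rng = np.random.default_rng(0)
N = 10            # number of components in the random vector
trials = 200_000  # independent realizations of the vector

X = rng.integers(1, 7, size=(trials, N))  # each X_i uniform on {1,...,6}
xbar = X.mean(axis=1)                     # sample mean of each vector

print(xbar.mean())  # close to E[X] = 3.5
print(xbar.var())   # close to var(X)/N = 35/120
```

The empirical variance of ##\bar{X}## comes out near ##var(X)/N##, which suggests the sum in the last line above can indeed be collapsed further.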
 
Dustinsfl said:
Can the variance be simplified any more? I am not sure what happens to the variance as N tends to infinity either.
You are complicating things unnecessarily. If ##\mu = E\, X_i## and ##\sigma^2 = \text{Var}\, X_i## ##\forall i##, then
$$
E \sum_{i=1}^n X_i = n \mu, \qquad \text{Var} \sum_{i=1}^n X_i = n \sigma^2,
$$
so
$$
E\, \bar{X} = \mu, \qquad \text{Var}\, \bar{X} = \frac{\sigma^2}{n}.
$$
Leaving the variance in this last form is the usual way these things are done, and yields the most insight into the problem's structure.
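The closed form ##\text{Var}\, \bar{X} = \sigma^2/n## can also be seen numerically: the variance shrinks like ##1/n##, so the distribution of ##\bar{X}## piles up around ##\mu##. A minimal sketch, assuming Python with NumPy and (as an illustrative choice, not from the problem) Bernoulli(1/2) components, so ##\mu = 0.5## and ##\sigma^2 = 0.25##:

```python
import numpy as np

# Var(xbar) = sigma^2 / n in action. Assumed example: Bernoulli(1/2)
# components, so mu = 0.5 and sigma^2 = 0.25.
rng = np.random.default_rng(1)
trials = 100_000
for n in (10, 100, 1000):
    xbar = rng.integers(0, 2, size=(trials, n)).mean(axis=1)
    print(n, xbar.var())  # close to 0.25 / n, shrinking toward 0
```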
 
Ray Vickson said:
You are complicating things unnecessarily. If ##\mu = E\, X_i## and ##\sigma^2 = \text{Var}\, X_i## for all ##i##, then ##E\, \bar{X} = \mu## and ##\text{Var}\, \bar{X} = \frac{\sigma^2}{n}##.
OK, so the variance goes to zero as ##N\to\infty##. But what does that tell us about the PMF?
 