Find mean and variance of a random vector

SUMMARY

The discussion focuses on calculating the mean and variance of a sample mean random variable, denoted as ##\bar{X}##, derived from a random vector ##\mathbf{X} = [X_1, X_2, \ldots, X_N]^{\intercal}## where each component has the same mean ##E_X[X]## and variance ##var(X)##. It is established that the mean of ##\bar{X}## is ##E[\bar{X}] = E[X]##, while the variance is given by ##Var(\bar{X}) = \frac{var(X)}{N}##. As ##N## approaches infinity, the variance approaches zero, indicating that the distribution of ##\bar{X}## becomes increasingly concentrated around the mean, which implies that the probability mass function (PMF) of ##\bar{X}## converges to a point mass at ##E[X]##.

PREREQUISITES
  • Understanding of random vectors and their components
  • Knowledge of statistical concepts such as mean and variance
  • Familiarity with the properties of independent random variables
  • Basic grasp of probability mass functions (PMF)
NEXT STEPS
  • Study the Central Limit Theorem and its implications for sample means
  • Explore the concept of convergence in probability and its relation to variance
  • Learn about the Law of Large Numbers and its effect on sample statistics
  • Investigate the properties of the Normal distribution as it relates to large sample sizes
USEFUL FOR

Statisticians, data analysts, and students studying probability and statistics who are interested in understanding the behavior of sample means and their distributions as sample sizes increase.

Dustinsfl

Homework Statement


The components of a random vector ##\mathbf{X} = [X_1, X_2, \ldots, X_N]^{\intercal}## all have the same mean ##E_X[X]## and the same variance ##var(X)##. The "sample mean" random variable
$$
\bar{X} = \frac{1}{N}\sum_{i = 1}^NX_i
$$
is formed. If the ##X_i##'s are independent, find the mean and variance of ##\bar{X}##. What happens to the variance as ##N\to\infty##? Does this tell you anything about the PMF of ##\bar{X}## as ##N\to\infty##?

Homework Equations

The Attempt at a Solution


By linearity of expectation (independence is not actually needed for the mean),
$$
E[\bar{X}] = \frac{1}{N}\sum_{i = 1}^NE_{X_i}[X_i]
$$
but the ##X_i##'s all have the same mean, so
\begin{align*}
E[\bar{X}] &= \frac{1}{N}NE[X]\\
&= E[X]
\end{align*}
Additionally, independence implies that ##cov(X_i, X_j) = 0## for ##i\neq j##, so the variance is
\begin{align*}
var(\bar{X}) = var\Big(\frac{1}{N}\sum_{i = 1}^NX_i\Big)
&= \frac{1}{N^2}\sum_{i = 1}^Nvar(X_i)\\
&= \frac{1}{N^2}\sum_{i = 1}^N\big(E[X_i^2] - E^2[X_i]\big)\\
&= \frac{1}{N^2}\sum_{i = 1}^NE[X_i^2] - \frac{1}{N}E^2[X]
\end{align*}
Can the variance be simplified any more? I am not sure what happens to the variance as N tends to infinity either.
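A quick Monte-Carlo sketch of the quantities in question (not part of the original post; it assumes uniform(0, 1) components, so ##E[X] = 1/2## and ##var(X) = 1/12## — the choice of distribution is an assumption for the demo):

```python
import random
import statistics

# Monte-Carlo check: draw many sample means of N i.i.d. uniform(0, 1)
# variables and compare their empirical mean/variance with the
# theoretical values E[X] = 1/2 and var(X)/N = (1/12)/N.
random.seed(0)
N = 25           # components per random vector
trials = 20_000  # number of sample means drawn

sample_means = [
    statistics.fmean(random.random() for _ in range(N))
    for _ in range(trials)
]

print(statistics.fmean(sample_means))     # close to E[X] = 0.5
print(statistics.variance(sample_means))  # close to (1/12)/25
```

The empirical variance lands near ##var(X)/N##, which is the simplification being asked about.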
 
Dustinsfl said:
Can the variance be simplified any more? I am not sure what happens to the variance as N tends to infinity either.

You are complicating things unnecessarily. If ##\mu = E\, X_i## and ##\sigma^2 = \text{Var}\, X_i## ##\forall i##, then
$$
E\sum_{i=1}^n X_i = n \mu, \qquad \text{Var}\sum_{i=1}^n X_i = n \sigma^2,
$$
so
$$
E\,\bar{X} = \mu, \qquad \text{Var}\,\bar{X} = \frac{\sigma^2}{n}.
$$
Leaving the variance in this last form is the usual way these things are done, and yields the most insight into the problem's structure.
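A minimal numerical sketch of ##\text{Var}\,\bar{X} = \sigma^2/n## (assuming uniform(0, 1) draws, so ##\sigma^2 = 1/12##; the distribution is my assumption, not the thread's):

```python
import random
import statistics

# Illustrates Var(X_bar) = sigma^2 / n: for uniform(0, 1) draws
# (sigma^2 = 1/12), the empirical variance of the sample mean
# should shrink roughly like 1/n as n grows.
random.seed(1)
trials = 10_000
empirical = {}
for n in (1, 10, 100):
    means = [statistics.fmean(random.random() for _ in range(n))
             for _ in range(trials)]
    empirical[n] = statistics.variance(means)
    print(n, empirical[n], (1 / 12) / n)
```

Each printed empirical variance tracks ##\sigma^2/n##, so multiplying ##n## by 10 divides the variance of the sample mean by about 10.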
 
Ray Vickson said:
Ok, so that means the variance goes to zero, but what does that mean for the PMF?
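One way to see it numerically (a sketch assuming Bernoulli(1/2) components, so that ##\bar{X}## has a genuine PMF on the points ##k/N##; the thread's ##X_i## are generic): as ##N## grows, the mass of that PMF piles up inside any fixed window around ##E[X]##.

```python
import random

# For Bernoulli(p) components, X_bar takes the values k/N.  As N grows,
# the fraction of its probability mass within a fixed window around
# E[X] = p approaches 1, i.e. the PMF concentrates at the mean.
random.seed(2)
p, trials = 0.5, 20_000

near = {}
for N in (4, 64):
    hits = 0
    for _ in range(trials):
        xbar = sum(random.random() < p for _ in range(N)) / N
        if abs(xbar - p) <= 0.1:
            hits += 1
    near[N] = hits / trials
    print(N, round(near[N], 3))
```

The fraction of mass within 0.1 of ##p## rises sharply from ##N = 4## to ##N = 64##, consistent with the PMF of ##\bar{X}## converging to a point mass at ##E[X]## as ##N\to\infty##.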
 
