Find mean and variance of a random vector

In summary: in this problem the components of the random vector all share the same mean and variance, and the sample mean random variable is formed by averaging the components. If the components are independent, the mean of the sample mean equals the common component mean, and its variance equals the component variance divided by the number of components. As the number of components tends to infinity, the variance of the sample mean tends to zero, which indicates that the probability mass function of the sample mean approaches a point mass at the common mean.
  • #1
Dustinsfl

Homework Statement


The components of a random vector ##\mathbf{X} = [X_1, X_2, \ldots, X_N]^{\intercal}## all have the same mean ##E_X[X]## and the same variance ##var(X)##. The "sample mean" random variable
$$
\bar{X} = \frac{1}{N}\sum_{i = 1}^NX_i
$$
is formed. If the ##X_i##'s are independent, find the mean and variance of ##\bar{X}##. What happens to the variance as ##N\to\infty##? Does this tell you anything about the PMF of ##\bar{X}## as ##N\to\infty##?

Homework Equations

The Attempt at a Solution


Since the ##X_i##'s are independent,
$$
E[\bar{X}] = \frac{1}{N}\sum_{i = 1}^N E_{X_i}[X_i],
$$
but the ##X_i##'s all have the same mean, so
\begin{align*}
E[\bar{X}] &= \frac{1}{N}NE[X]\\
&= E[X].
\end{align*}
Additionally, independence implies that ##cov(X_i, X_j) = 0## for ##i\neq j##, so the variance of the sum is
\begin{align*}
var\Big(\sum_{i = 1}^N X_i\Big)
&= \sum_{i = 1}^N var(X_i)\\
&= \sum_{i = 1}^N \big(E[X_i^2] - E^2[X_i]\big)\\
&= \sum_{i = 1}^N E[X_i^2] - NE^2[X].
\end{align*}
Can the variance be simplified any more? I am not sure what happens to the variance as ##N## tends to infinity either.
 
  • #2

You are complicating things unnecessarily. If ##\mu = E\, X_i## and ##\sigma^2 = \text{Var}\, X_i## ##\forall i##, then
[tex] E\, \sum_{i=1}^n X_i = n \mu, \; \text{Var} \, \sum_{i=1}^n X_i = n \sigma^2\\
\text{so}\\
E\, \bar{X} = \mu, \; \text{Var}\, \bar{X} = \frac{\sigma^2}{n}.[/tex]
Leaving the variance in this last form is the usual way these things are done, and yields the most insight into the problem's structure.
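A quick numerical check of these formulas, as a minimal sketch assuming i.i.d. normal components with ##\mu = 3## and ##\sigma^2 = 4## (values chosen only for illustration):
[code]
import numpy as np

rng = np.random.default_rng(0)
mu, var = 3.0, 4.0            # assumed common mean and variance of the components

for N in (10, 100, 1000):
    # 100,000 independent realizations of the random vector X = [X_1, ..., X_N]
    X = rng.normal(loc=mu, scale=np.sqrt(var), size=(100_000, N))
    xbar = X.mean(axis=1)     # the sample-mean random variable, one value per realization
    print(N, xbar.mean(), xbar.var())   # expect roughly mu and var / N
[/code]
For ##N = 1000## the printed variance should be near ##4/1000 = 0.004##, consistent with ##\sigma^2/N##.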
 
  • #3
Ray Vickson said:
Leaving the variance in this last form is the usual way these things are done, and yields the most insight into the problem's structure.
OK, so that means the variance goes to zero. But what does that mean for the PMF?
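One standard way to make this precise is Chebyshev's inequality: for any ##\epsilon > 0##,
$$
P\big(|\bar{X} - E[X]| \ge \epsilon\big) \le \frac{var(\bar{X})}{\epsilon^2} = \frac{var(X)}{N\epsilon^2} \longrightarrow 0 \quad\text{as } N\to\infty,
$$
so in the limit the PMF of ##\bar{X}## concentrates all of its probability at the common mean ##E[X]##, i.e., it approaches a point mass there.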
 

1. What is a random vector?

A random vector is a collection of random variables grouped together as a single mathematical object, usually written as a column vector such as ##\mathbf{X} = [X_1, X_2, \ldots, X_N]^{\intercal}##; a particular realization of it is just a list of numbers.

2. How do you find the mean of a random vector?

The mean of a random vector is the vector whose entries are the expected values of its components. The "sample mean" discussed above is a single random variable formed by summing the components and dividing by their number; when the components share a common mean, its expected value equals that common mean.
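As a small illustration, here is a sketch that estimates these quantities by simulation, assuming a hypothetical 5-component random vector with i.i.d. exponential entries of mean 2 (chosen arbitrarily for the example):
[code]
import numpy as np

rng = np.random.default_rng(1)

# 100,000 realizations of a hypothetical 5-component random vector
# with i.i.d. exponential entries of mean 2.0
samples = rng.exponential(scale=2.0, size=(100_000, 5))

component_means = samples.mean(axis=0)   # estimates E[X_1], ..., E[X_5] (each near 2.0)
sample_means    = samples.mean(axis=1)   # one value of the sample mean per realization
print(component_means)
print(sample_means.mean())               # also near 2.0, the common component mean
[/code]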

3. What is the significance of finding the mean of a random vector?

The mean of a random vector is an important measure of central tendency. It can help in understanding the average value of the data and can be used for comparison with other data sets.

4. How is the variance of a random vector calculated?

The variance of each component is the expected squared deviation from its mean, ##var(X) = E[(X - E[X])^2] = E[X^2] - E^2[X]##. It measures the spread of the values and how much they deviate from the mean.
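For instance, for a component that is equally likely to take the values 1, 2, or 3 (a made-up example),
$$
E[X] = \frac{1+2+3}{3} = 2, \qquad var(X) = E[X^2] - E^2[X] = \frac{1^2+2^2+3^2}{3} - 2^2 = \frac{14}{3} - 4 = \frac{2}{3}.
$$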

5. What does a high or low variance of a random vector indicate?

A high variance of a random vector indicates that the data points are spread out over a larger range, while a low variance indicates that the data points are clustered closer to the mean. It helps in understanding the distribution of the data and the level of variability within the data set.
