Solving a Variance Problem: Computing E(\hat{\theta}) and E(\hat{\theta}^2)

  • Thread starter mnb96
  • Tags
    Variance
In summary: the mistake in computing the variance of the estimator was dropping the [itex]v_i^2[/itex] term. The correct variance is [itex]Var(\hat{\theta}) = \frac{\sigma^2}{n}[/itex].
  • #1
mnb96
Hello,
we are given [itex]n[/itex] independent random variables [itex]z_i[/itex] defined as follows:

[tex]z_i = \theta + v_i[/tex]

where the r.v. [itex]v_i[/itex] are zero-mean and normally distributed, [itex]v_i \sim N(0,\sigma^2)[/itex].

I want to compute the variance of the estimator

[tex]\hat{\theta}=\frac{1}{n}\sum_{i=1}^n z_i[/tex]

However, I can't seem to get the right result (I always get zero).

I first computed [itex]E(\hat{\theta})^2=\theta^2[/itex].

Then I tried to compute [itex]E(\hat{\theta}^2)[/itex] as follows:

[tex]\frac{1}{n^2}E\left[\left(\sum_{i=1}^n z_i \right)^2\right] = \frac{1}{n^2}E\left[\sum_{i=1}^n z_i^2 + \sum_{i=1}^n \sum_{j\neq i} z_i z_j \right][/tex]

[tex]= \frac{1}{n^2}E\left[\sum_{i=1}^n z_i^2 \right] + \frac{n-1}{n}\theta^2[/tex]

[tex]= \frac{1}{n^2}E\left[\sum_{i=1}^n (\theta^2 +2\theta v_i + v_i^2) \right] + \frac{n-1}{n}\theta^2[/tex]

[tex]= \frac{1}{n}\theta^2 + \frac{n-1}{n}\theta^2[/tex]

From this we get [tex]Var(\hat{\theta})=E(\hat{\theta}^2)-E(\hat{\theta})^2 = 0[/tex].
Where is the mistake?
 
  • #2
mnb96 said:
[tex] \frac{1}{n^2}E\left[\sum_{i=1}^n (\theta^2 +2\theta v_i + v_i^2) \right] + \frac{n-1}{n}\theta^2[/tex]

[tex]= \frac{1}{n}\theta^2 + \frac{n-1}{n}\theta^2[/tex]

This is wrong. You dropped the [itex]v_i^2[/itex] term.
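As a quick numerical sanity check (a numpy sketch, not part of the original thread; the constants [itex]\theta=2[/itex], [itex]\sigma=3[/itex], [itex]n=50[/itex] are illustrative): keeping the [itex]v_i^2[/itex] term, the variance of the sample mean comes out to [itex]\sigma^2/n[/itex], not zero.

```python
# Monte Carlo check that Var(theta_hat) = sigma^2 / n for the sample mean
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n = 2.0, 3.0, 50   # illustrative values, not from the thread
trials = 200_000

# z_i = theta + v_i with v_i ~ N(0, sigma^2); theta_hat is the mean over n samples
z = theta + rng.normal(0.0, sigma, size=(trials, n))
theta_hat = z.mean(axis=1)

print(theta_hat.mean())   # close to theta = 2.0
print(theta_hat.var())    # close to sigma**2 / n = 0.18
```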
 
  • #3
uh...that's true! Thanks.

[tex]E(v_i^2) = \frac{1}{\sqrt{2\pi \sigma^2}}\int_\mathbb{R} x^2 e^\frac{-x^2}{2\sigma^2} dx[/tex]

This integral is indeed [itex]>0[/itex].
 
  • #4
mnb96 said:
uh...that's true! Thanks.

[tex]E(v_i^2) = \frac{1}{\sqrt{2\pi \sigma^2}}\int_\mathbb{R} x^2 e^\frac{-x^2}{2\sigma^2} dx[/tex]

This integral is indeed [itex]>0[/itex].

In fact, [itex]E(v_i^2) = \sigma^2[/itex].
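This can be confirmed numerically as well (a small numpy sketch, not from the thread; [itex]\sigma=2[/itex] is an illustrative choice): the sample average of [itex]v_i^2[/itex] converges to [itex]\sigma^2[/itex].

```python
# Check that E(v_i) = 0 and E(v_i^2) = sigma^2 for v_i ~ N(0, sigma^2)
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0                      # illustrative value
v = rng.normal(0.0, sigma, size=1_000_000)

print(v.mean())        # close to 0
print((v**2).mean())   # close to sigma**2 = 4.0
```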
 
  • #5
Hello,

It seems the mistake is in your calculation of E(\hat{\theta}^2). You correctly split E(\hat{\theta}^2) into two terms, but the cross term should be \frac{1}{n^2}\sum_{i\neq j}E(z_i z_j), reflecting the products of the different z_i values. Since the z_i are independent, E(z_i z_j) = \theta^2 for i \neq j, and from the definition of z_i you get E(z_i^2) = \theta^2 + \sigma^2. Plugging both into the equation gives the correct variance of \hat{\theta}.
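Carrying the algebra through (a short completion, consistent with the corrections in the earlier replies):

[tex]E(\hat{\theta}^2) = \frac{1}{n^2}\sum_{i=1}^n E(z_i^2) + \frac{n-1}{n}\theta^2 = \frac{\theta^2+\sigma^2}{n} + \frac{n-1}{n}\theta^2 = \theta^2 + \frac{\sigma^2}{n}[/tex]

so that [itex]Var(\hat{\theta}) = E(\hat{\theta}^2) - E(\hat{\theta})^2 = \frac{\sigma^2}{n}[/itex].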
 

What is variance and why is it important to compute E(\hat{\theta}) and E(\hat{\theta}^2)?

Variance is a measure of how spread out a set of data is from its mean. It is important to compute E(\hat{\theta}) and E(\hat{\theta}^2) because they are used to estimate the true value of a parameter, \theta, from a sample of data.

How do you solve for E(\hat{\theta}) and E(\hat{\theta}^2)?

For an estimator taking discrete values, use the formulas E(\hat{\theta}) = \sum_i \hat{\theta}_i P(\hat{\theta}_i) and E(\hat{\theta}^2) = \sum_i \hat{\theta}_i^2 P(\hat{\theta}_i), where \hat{\theta}_i are the possible values of the estimator and P(\hat{\theta}_i) is the probability of each value occurring.
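The discrete formulas above can be evaluated directly. A minimal sketch, using a hypothetical three-value estimator (the values and probabilities are made up for illustration):

```python
# Hypothetical discrete estimator with values 1, 2, 3 and given probabilities
values = [1.0, 2.0, 3.0]
probs  = [0.2, 0.5, 0.3]

e_theta  = sum(t * p for t, p in zip(values, probs))      # E(theta_hat)
e_theta2 = sum(t**2 * p for t, p in zip(values, probs))   # E(theta_hat^2)
variance = e_theta2 - e_theta**2                          # Var(theta_hat)

print(e_theta, e_theta2, variance)   # 2.1, 4.9, 0.49
```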

What is the difference between E(\hat{\theta}) and E(\hat{\theta}^2)?

E(\hat{\theta}) represents the expected value or average of the estimator, while E(\hat{\theta}^2) represents the expected value of the squared estimator. Together they measure the spread of the estimator through Var(\hat{\theta}) = E(\hat{\theta}^2) - E(\hat{\theta})^2.

What is the significance of computing E(\hat{\theta}) and E(\hat{\theta}^2) in statistical analysis?

Computing E(\hat{\theta}) and E(\hat{\theta}^2) allows us to estimate the true value of a parameter from a sample of data. This is important in statistical analysis because it helps us make inferences about a population based on a smaller sample. It also allows us to compare different estimators and determine which one is the most accurate.

What are some common challenges when solving for E(\hat{\theta}) and E(\hat{\theta}^2)?

Some common challenges when solving for E(\hat{\theta}) and E(\hat{\theta}^2) include dealing with large or complex data sets, ensuring the estimator is unbiased, and determining the appropriate probability distribution to use. It is also important to consider the assumptions and limitations of the estimator being used.
