Help with finding the expectation

  • Thread starter: cse63146
  • Tags: Expectation

Homework Help Overview

The discussion revolves around finding the expectation of a specific expression involving a random sample from a normal distribution. The problem involves calculating the expectation of Y, defined as the sum of squared deviations from the sample mean.

Discussion Character

  • Exploratory, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the calculation of E(Y) by breaking down the expression for (X_i - \overline{X})^2 and discussing the expectations of various components. There are attempts to clarify the relationship between the sample mean and the individual random variables.

Discussion Status

Participants have offered guidance on how to compute individual expectations and have raised questions about the assumptions underlying the calculations. There is an ongoing exploration of the properties of the chi-square distribution and its relation to the problem, with no explicit consensus reached.

Contextual Notes

Some participants question the definitions and relationships between the moments of the random variables involved, particularly regarding the sample mean and the implications of independence among the variables.

cse63146

Homework Statement



Let X1,...,Xn denote a random sample from a [tex]N(\mu , \sigma)[/tex] distribution. Let [tex]Y = \Sigma \frac{(X_i - \overline{X})^2}{n}[/tex]

Homework Equations





The Attempt at a Solution



How would I find E(Y)?

Any help would be greatly appreciated.
 
You'll need to be able to justify each step and fill in the omitted work.

[tex] (X_i - \bar X)^2 = X_i^2 - 2X_i \bar X + {\bar X}^2[/tex]

When you compute [tex]E[(X_i - \bar X)^2][/tex]

[tex] E[X_i^2][/tex]

should be easily found, and doesn't depend on i.

[tex] 2E[X_i \bar X] = \frac 2 n E[X_i (X_1 + X_2 + \dots + X_n)][/tex]

should be easily determined (and will not depend on i). Also,

[tex] E[{\bar X}^2][/tex]

should be easy to find (since you know the distribution of the sample mean).

Work these out individually, then combine them.
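The three expectations above can be sanity-checked with a quick Monte Carlo sketch (not part of the original exchange; the values of mu, sigma, and n below are arbitrary choices for illustration):

```python
import random

# Monte Carlo check of E[X_i^2], E[X_i * Xbar], and E[Xbar^2],
# assuming (arbitrarily) mu = 2.0, sigma = 3.0, n = 5.
random.seed(0)
mu, sigma, n = 2.0, 3.0, 5
trials = 200_000

sum_xi2 = sum_xixbar = sum_xbar2 = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    sum_xi2 += xs[0] ** 2        # estimates E[X_i^2]; i = 1 w.l.o.g.
    sum_xixbar += xs[0] * xbar   # estimates E[X_i * Xbar]
    sum_xbar2 += xbar ** 2       # estimates E[Xbar^2]

e_xi2 = sum_xi2 / trials         # should be near sigma^2 + mu^2
e_xixbar = sum_xixbar / trials   # should be near sigma^2/n + mu^2
e_xbar2 = sum_xbar2 / trials     # should be near sigma^2/n + mu^2

# Combining: E[(X_i - Xbar)^2] = E[X_i^2] - 2 E[X_i Xbar] + E[Xbar^2]
print(e_xi2 - 2 * e_xixbar + e_xbar2)  # close to sigma^2 * (n-1)/n
```

With these parameters the combined value comes out near 9 * 4/5 = 7.2, which matches the hint that the pieces can be computed separately and then combined.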
 
[tex]X^2_i[/tex] is a chi-square distribution with n degrees of freedom (since there are n Xi) and its expectation would be n

[tex]\overline{X} \sim N(\mu, \frac{\sigma^2}{n})[/tex] is a noncentral chisquare distribution [tex]\frac{X^2_i}{\sigma^2 /n}[/tex] and its expectation is n*sigma^2

Kinda stuck on this one: [tex]2E[X_i \bar X] = \frac 2 n E[X_i (X_1 + X_2 + \dots + X_n)][/tex]

I was just wondering, would I be able to use the following property of the chi square distribution:

[attached image: a property of the chi-square distribution]
 
cse63146 said:
[tex]X^2_i[/tex] is a chi-square distribution with n degrees of freedom (since there are n Xi) and its expectation would be n

It would be a non-central Chi-square - but you don't need that. For any random variable what do you know about an expression for [tex]E[X^2][/tex] in terms of the first two moments?
[tex]\overline{X} \sim N(\mu, \frac{\sigma^2}{n})[/tex] is a noncentral chisquare distribution [tex]\frac{X^2_i}{\sigma^2 /n}[/tex] and its expectation is n*sigma^2
I would make a comment similar to the first one here: you know the distribution of the sample mean, what do you know about the expectation of its square in terms of the first two moments?
Kinda stuck on this one: [tex]2E[X_i \bar X] = \frac 2 n E[X_i (X_1 + X_2 + \dots + X_n)][/tex]
Write it as
[tex] \frac 2 n \left( E[X_i^2] + \sum_{j \ne i} E[X_i X_j]\right)[/tex]
and remember that for [tex]i \ne j[/tex] the Xs are independent.
I was just wondering, would I be able to use the following property of the chi square distribution:

[attached image: a property of the chi-square distribution]

You could - IF you have already obtained that result elsewhere in your class.
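For the record, the cross term is only hinted at above, never worked out in the thread; a sketch of the combination, using independence and [tex]E[X^2] = \sigma^2 + \mu^2[/tex], is:

[tex] E[X_i \bar X] = \frac 1 n \left( E[X_i^2] + (n-1)\mu^2 \right) = \frac 1 n \left( \sigma^2 + \mu^2 + (n-1)\mu^2 \right) = \frac{\sigma^2} n + \mu^2[/tex]

since [tex]E[X_i X_j] = E[X_i]E[X_j] = \mu^2[/tex] whenever [tex]i \ne j[/tex].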
 
Is this what you were talking about:

[tex]E((X_i - \overline{X})^2)= \sigma^2 = E(X)^2 - E(X^2)[/tex]

not sure how that helps
 
Actually, just the second part:
If
[tex] \sigma^2=E[X^2] - \left(E[X]\right)^2[/tex]

what does [tex]E[X^2][/tex] itself equal?
 
[tex]E(X^2_i) = \sigma^2 + E(X_i)^2 = \sigma^2 + (n \mu)^2[/tex]

Would the expectation of Xbar^2 be equal to the expectation of Xi?
 
Why do you have

[tex] E(X_i)^2 = (n\mu)^2[/tex]

Should the [tex]n[/tex] really be there?

For [tex]E[\bar X^2][/tex], remember that the sample mean is normally distributed with mean [tex]\mu[/tex] and variance [tex]\frac{\sigma^2} n[/tex].
 
Isn't it because [tex]\Sigma E[X_i] = E[X_1] + ... + E[X_n] = n \mu[/tex] and each E[X] = mu, so there are n of them?
 
I asked because you didn't have the sum. My point is that for any random variable that has a variance,

[tex] E[X^2] = \sigma^2_X + \mu^2_X[/tex]

that is - the expectation of the square equals the variance plus the square of the mean.
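Putting the pieces together (the thread stops short of this final combination, so the following is a sketch built from the identities above):

[tex] E[(X_i - \bar X)^2] = (\sigma^2 + \mu^2) - 2\left(\frac{\sigma^2} n + \mu^2\right) + \left(\frac{\sigma^2} n + \mu^2\right) = \frac{n-1} n \sigma^2[/tex]

so that

[tex] E(Y) = \frac 1 n \sum_{i=1}^n E[(X_i - \bar X)^2] = \frac{n-1} n \sigma^2[/tex]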
 
