Help with finding the expectation

  • Thread starter: cse63146
  • Tags: Expectation
AI Thread Summary
The discussion focuses on calculating the expectation E(Y) for a random sample from a normal distribution, specifically for the expression Y = Σ((X_i - X̄)²/n). Participants emphasize the importance of breaking down the components of E[(X_i - X̄)²], including E[X_i²], E[X_i X̄], and E[X̄²], using properties of the normal distribution and chi-square distribution. The expectation of the sample mean and its variance are discussed, with clarifications on how to derive E[X²] in terms of variance and mean. The conversation highlights the necessity of understanding the relationships between these statistical moments to solve for E(Y) effectively.
cse63146

Homework Statement



Let X_1, \dots, X_n denote a random sample from a N(\mu, \sigma) distribution. Let Y = \sum_{i=1}^n \frac{(X_i - \overline{X})^2}{n}

Homework Equations





The Attempt at a Solution



How would I find E(Y)?

Any help would be greatly appreciated.
 
You need to be sure you can justify these steps and fill in the omitted work:

(X_i - \bar X)^2 = X_i^2 - 2X_i \bar X + {\bar X}^2

When you compute E[(X_i - \bar X)^2]

E[X_i^2]

should be easily found, and doesn't depend on i.

2E[X_i \bar X] = \frac 2 n E[X_i (X_1 + X_2 + \dots + X_n)]

should be easily determined (and will not depend on i). Also,

E[{\bar X}^2]

should be easy to find (since you know the distribution of the sample mean).

Work these out individually, then combine them.
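
If you want a sanity check on the hand calculations, here is a quick Monte Carlo sketch (assuming NumPy; the values of n, \mu and \sigma below are arbitrary illustrations) that estimates each of the three expectations, and E(Y) itself, by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 5, 2.0, 3.0   # arbitrary illustrative values
reps = 200_000

# Each row is one sample X_1, ..., X_n drawn from N(mu, sigma^2)
X = rng.normal(mu, sigma, size=(reps, n))
Xbar = X.mean(axis=1)

# Estimates for a fixed i (column 0); by symmetry they do not depend on i
print("E[X_i^2]    ~", np.mean(X[:, 0] ** 2))
print("E[X_i Xbar] ~", np.mean(X[:, 0] * Xbar))
print("E[Xbar^2]   ~", np.mean(Xbar ** 2))

# Y = sum_i (X_i - Xbar)^2 / n, averaged over all simulated samples
Y = ((X - Xbar[:, None]) ** 2).sum(axis=1) / n
print("E[Y]        ~", Y.mean())
```

Comparing these sample averages against your closed-form answers for each term is an easy way to catch an algebra slip.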
 
X^2_i is a chi-square distribution with n degrees of freedom (since there are n X_i) and its expectation would be n

\overline{X} \sim N(\mu, \frac{\sigma^2}{n}) is a noncentral chi-square distribution \frac{X^2_i}{\sigma^2 /n} and its expectation is n\sigma^2

Kinda stuck on this one: 2E[X_i \bar X] = \frac 2 n E[X_i (X_1 + X_2 + \dots + X_n)]

I was just wondering, would I be able to use the following property of the chi square distribution:

[attached image: a property of the chi-square distribution]
 
cse63146 said:
X^2_i is a chi-square distribution with n degrees of freedom (since there are n X_i) and its expectation would be n

It would be a non-central Chi-square - but you don't need that. For any random variable what do you know about an expression for E[X^2] in terms of the first two moments?
\overline{X} \sim N(\mu, \frac{\sigma^2}{n}) is a noncentral chi-square distribution \frac{X^2_i}{\sigma^2 /n} and its expectation is n\sigma^2
I would make a comment similar to the first one here: you know the distribution of the sample mean, what do you know about the expectation of its square in terms of the first two moments?
Kinda stuck on this one: 2E[X_i \bar X] = \frac 2 n E[X_i (X_1 + X_2 + \dots + X_n)]
Write it as
\frac 2 n \left( E[X_i^2] + \sum_{j \ne i} E[X_i X_j]\right)
and remember that for i \ne j the Xs are independent.
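
Spelled out, since E[X_i X_j] = E[X_i]\,E[X_j] = \mu^2 whenever j \ne i, the sum collapses to

\frac 2 n \left( E[X_i^2] + (n-1)\mu^2 \right)

which leaves only E[X_i^2] to fill in.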
I was just wondering, would I be able to use the following property of the chi square distribution:

[attached image: a property of the chi-square distribution]

You could - IF you have already obtained that result elsewhere in your class.
 
Is this what you were talking about:

E((X_i - \overline{X})^2)= \sigma^2 = E(X)^2 - E(X^2)

not sure how that helps
 
Actually, just the second part:
If
\sigma^2 = E[X^2] - \left(E[X]\right)^2

what does E[X^2] itself equal?
 
E(X^2_i) = \sigma^2 + E(X_i)^2 = \sigma^2 + (n \mu)^2

Would the expectation of \bar X^2 be equal to the expectation of X_i^2?
 
Why do you have

E(X_i)^2 = (n\mu)^2

Should the n really be there?

For E[\bar X^2], remember that the sample mean is normally distributed with mean \mu and variance \frac{\sigma^2} n.
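
Explicitly, the same moment identity applied to the sample mean gives

E[{\bar X}^2] = Var(\bar X) + \left( E[\bar X] \right)^2 = \frac{\sigma^2}{n} + \mu^2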
 
Isn't it because \Sigma E[X_i] = E[X_1] + ... + E[X_n] = n \mu, and each E[X_i] = \mu, so there are n of them?
 
I asked because you didn't have the sum. My point is that for any random variable that has a variance,

E[X^2] = \sigma^2_X + \mu^2_X

that is - the expectation of the square equals the variance plus the square of the mean.
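
Putting the three pieces together,

E[X_i^2] = \sigma^2 + \mu^2, \qquad E[X_i \bar X] = \frac 1 n \left( \sigma^2 + \mu^2 + (n-1)\mu^2 \right) = \frac{\sigma^2}{n} + \mu^2, \qquad E[{\bar X}^2] = \frac{\sigma^2}{n} + \mu^2

so

E[(X_i - \bar X)^2] = (\sigma^2 + \mu^2) - 2\left( \frac{\sigma^2}{n} + \mu^2 \right) + \left( \frac{\sigma^2}{n} + \mu^2 \right) = \frac{n-1}{n}\sigma^2

and therefore

E(Y) = \frac 1 n \sum_{i=1}^n E[(X_i - \bar X)^2] = \frac{n-1}{n}\sigma^2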
 