
Central Limit Theorem

  1. Mar 25, 2009 #1
    1. The problem statement, all variables and given/known data

    If all possible samples of size 16 are drawn from a normal population with mean equal to 50 and standard deviation equal to 5, what is the probability that a sample mean [tex]\bar{X}[/tex] will fall in the interval from [tex]\mu_\bar{X} - 1.9 \sigma_\bar{X}[/tex] to [tex]\mu_\bar{X}-0.4\sigma_\bar{X}[/tex]?

    2. The attempt at a solution

    [tex]
    Z=\frac{\bar X - \mu}{\sigma/\sqrt{n}}
    [/tex]
     
  3. Mar 25, 2009 #2

    statdad

    Homework Helper

    You have all the information you need to

    * Calculate the mean and standard deviation for the sampling distribution of [tex] \overline X [/tex]

    * Calculate the two endpoints

    Find all the numbers I mention above, set up the probability statement, and
    finish just as you would for any normal distribution problem.
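
    For concreteness, here is where those steps lead (a worked sketch using only the numbers given in the problem: [tex]n = 16[/tex], [tex]\mu = 50[/tex], [tex]\sigma = 5[/tex]):

    [tex]
    \mu_{\bar X} = \mu = 50, \qquad \sigma_{\bar X} = \frac{\sigma}{\sqrt{n}} = \frac{5}{\sqrt{16}} = 1.25
    [/tex]

    so the requested probability standardizes to

    [tex]
    P\left(\mu_{\bar X} - 1.9\,\sigma_{\bar X} \le \bar X \le \mu_{\bar X} - 0.4\,\sigma_{\bar X}\right) = P(-1.9 \le Z \le -0.4) = \Phi(-0.4) - \Phi(-1.9) \approx 0.316,
    [/tex]

    with the two [tex]\Phi[/tex] values read off a standard normal table.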
     
  4. Mar 26, 2009 #3
    Why is the variance smaller in a sample than in the whole population?
     
  5. Mar 26, 2009 #4

    statdad

    Homework Helper

    It isn't the fact that it is a sample; it's the effect that averaging the observations has on the variance of the sample mean.

    Remember that if [tex] X_1, X_2, \dots, X_n [/tex] form a random sample from any (not just a normal) distribution that has variance [tex] \sigma^2 [/tex], then for any constants [tex] a_1, a_2, \dots, a_n [/tex]

    [tex]
    \text{Var}[\sum a_i X_i] = \sum a_i^2 \sigma^2 = \sigma^2 \sum a_i^2
    [/tex]

    For the sample mean, [tex] a_i = 1/n [/tex], so the variance of the sample mean is

    [tex]
    \sigma^2 \sum \frac 1 {n^2} = \sigma^2 \frac{n}{n^2} = \frac{\sigma^2}{n}
    [/tex]
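
    A quick numerical check of that formula, as a sketch (it assumes numpy is available; the values 50, 5, and 16 are the ones from the original problem, and the number of repetitions is just an illustrative choice):

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n = 50.0, 5.0, 16     # population mean/SD and sample size from the problem
    reps = 100_000                   # number of simulated samples (illustrative)

    # Draw `reps` independent samples of size n and take each sample's mean.
    sample_means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

    print(sample_means.var())        # empirical variance of the sample mean
    print(sigma**2 / n)              # theoretical value sigma^2/n = 25/16 = 1.5625
    [/code]

    The two printed numbers should agree to a couple of decimal places.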
     
  6. Mar 26, 2009 #5
    So if you have a large number of samples, the variance will be zero? Zero variance means that all samples are the same, right?
     
  7. Mar 26, 2009 #6

    statdad

    Homework Helper

    IF you happened to get a sample with all the numbers the same, the sample variance would be zero.

    However the variance of the distribution of [tex] \overline X [/tex] will never be zero, unless
    1) The original population variance is zero - a highly artificial situation
    2) The sample size is infinite - not possible
     
  8. Mar 27, 2009 #7
    But the sample variance will be very small when the sample size is very large...I still don't understand why. Let's say that I sample the heights of newborn babies, assuming that their heights are normally distributed around 50 cm, and with a standard deviation of 3 cm. If I sample 1000 babies, will the sample variance then be infinitesimal? I would rather expect it to be close to the population variance...
     
  9. Mar 27, 2009 #8

    statdad

    Homework Helper

    I'm not sure where the "sample variance will be very small when the sample size is very large" comes from. The sample variance depends only on the numbers in the sample:
    a) if the numbers in the sample are all the same, the sample variance will be zero
    b) if the numbers in the sample are very nearly equal, the sample variance will be small (close to zero)
    c) in general, if the sample size is large, and sampling has been correctly done, we expect the sample variance will be close to the population variance, as in your "baby example"

    But the variance of the sample mean,

    [tex]
    \frac {\sigma^2} n
    [/tex]

    which you calculate when you use the CLT for probability, will be close to zero when
    the sample size is large, simply because [tex] n [/tex] occurs in the denominator.
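
    To make the distinction concrete, here is a small simulation sketch (again assuming numpy; the 50 cm / 3 cm numbers and the sample size of 1000 come from the baby example above):

    [code]
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, n = 50.0, 3.0, 1000   # newborn heights: mean 50 cm, SD 3 cm, sample size 1000

    heights = rng.normal(mu, sigma, size=n)

    # The sample variance estimates the population variance (about 9 cm^2), as in point (c) above...
    print(heights.var(ddof=1))

    # ...while the variance of the sample mean is sigma^2/n, which is tiny for n = 1000.
    print(sigma**2 / n)              # 9/1000 = 0.009
    [/code]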
     