I know that there are many websites that explain Pearson's chi-squared test, but they all leave the same questions unanswered. First, to make sure I have the definitions right:

1) For a population with standard deviation σ and a sample of size n with variance s^{2}, taking df = k = n − 1 degrees of freedom, the chi-squared statistic is k times the ratio of the sample variance to the population variance, i.e. k·(s^{2}/σ^{2}) = (n − 1)s^{2}/σ^{2}. Equivalently, it is the sum of the squared deviations of the observations from the sample mean, measured in units of the population standard deviation: Ʃ (x_{i} − x̄)^{2}/σ^{2}.
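This definition can be checked numerically. A minimal sketch (the variable names, sample size, and σ below are my own, not from the post): draw many normal samples and verify that the statistic (n − 1)s^{2}/σ^{2} averages to n − 1, which is the mean of a chi-squared distribution with n − 1 degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                                # assumed population standard deviation
n = 10                                     # sample size, so df = k = n - 1 = 9
samples = rng.normal(0.0, sigma, size=(100_000, n))
s2 = samples.var(axis=1, ddof=1)           # sample variances s^2
stat = (n - 1) * s2 / sigma**2             # the chi-squared statistic for each sample
print(stat.mean())                         # close to 9, the degrees of freedom
```

The empirical mean of the statistic over many samples comes out near 9, matching the mean of a chi-squared distribution with 9 degrees of freedom.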

2) For this population and this df, the chi-squared distribution is then the curve obtained, over all possible samples, by plotting the chi-squared statistic on the x-axis against its probability density on the y-axis.

3) The chi-squared test uses the statistic

Ʃ (O_{i} − E_{i})^{2}/E_{i}, summed over i, with O_{i} being the observed frequency of the i'th value and E_{i} being its expected frequency.
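For concreteness, here is the statistic computed on a small made-up data set (the die-rolling numbers are my own illustration, not from the post): 120 rolls of a die, with 20 expected per face, compared against `scipy.stats.chisquare`, which computes the same sum.

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical observed counts for the six faces over 120 rolls
observed = np.array([22, 17, 20, 26, 15, 20])
expected = np.full(6, 20.0)                # 120 / 6 = 20 expected per face

# Pearson's statistic: sum over i of (O_i - E_i)^2 / E_i
stat = ((observed - expected) ** 2 / expected).sum()
print(stat)  # 3.7

# scipy computes the same statistic (plus a p-value)
res = chisquare(observed, expected)
print(res.statistic)
```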

OK, so far so good. But now what I do not get is the next comment: that as i goes to infinity, the chi-squared statistic approaches the chi-square distribution. First and foremost, how does a statistic, which is a single number, approach a distribution? Does it mean the cumulative distribution? Second (but not as important), is there a relatively short proof of this fact? Or at least a way to see the connection between the formulas? Thanks in advance.
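One way to see what the claim is asserting, sketched numerically (the cell probabilities and counts below are my own assumptions): repeat the whole experiment many times, compute the Pearson statistic each time, and compare the resulting *sampling distribution* of the statistic with the chi-squared distribution.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
p = np.array([0.2, 0.3, 0.5])      # hypothetical true cell probabilities
n = 1000                           # observations per experiment (large expected counts)
expected = n * p

# 50,000 repeated experiments, each producing one Pearson statistic
counts = rng.multinomial(n, p, size=50_000)
stats = ((counts - expected) ** 2 / expected).sum(axis=1)

# Compare the empirical mean with the chi-squared mean, df = 3 - 1 = 2
print(stats.mean(), chi2.mean(df=2))
```

With large expected counts, the collection of statistics is distributed very nearly as chi-squared with 2 degrees of freedom; here their means agree closely.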

**Physics Forums | Science Articles, Homework Help, Discussion**

# Pearson's chi-square test versus chi-squared distribution