
Niles


In my statistics book there is an example. They say that we observe 3 occurrences of type A while theoretically expecting 7, a difference of 4. Since the standard deviation is 1.65 (they calculate it), that difference amounts to about 2.4 standard deviations. Looking at a table of the Gaussian, this is significant at the 1% level.
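The book's arithmetic can be checked with a short sketch (a minimal illustration, assuming the normal approximation and a one-tailed reading of the Gaussian table; the numbers are the ones from the example above):

```python
import math

observed = 3
expected = 7
sd = 1.65  # as calculated in the book

# Distance from the expected value, in standard deviations
z = abs(observed - expected) / sd  # 4 / 1.65, roughly 2.4

# One-tailed tail probability of the standard normal:
# P(Z > z) = 0.5 * erfc(z / sqrt(2))
p_one_tailed = 0.5 * math.erfc(z / math.sqrt(2))

print(f"z = {z:.2f}")
print(f"one-tailed p = {p_one_tailed:.4f}")
```

With these numbers the tail probability comes out below 0.01, which is what "significant at the 1% level" refers to: the probability of seeing a deviation this large by chance is less than 1%.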

My question is about the significance at the 1% level. What exactly do they mean by that statement?