Hello, PF!
[My question pertains to a non-rigorous, undergraduate introductory Probability and Statistics course. I'm no math major, so please correct me if I've mishandled any terms or concepts as I try to express myself. I'm always eager to learn!]
In a discussion of the standard deviation of a sample in relation to the 68-95-99.7 rule, the following "conceptual" example was given—or rather, made up on the spot—by our professor:
Assume \bar{x}=50 \% and s=20 \% for test scores (in units of percent correct), and assume that the sample represents a normal distribution (symmetric and bell-shaped) of scores on a test where no score can fall below 0 \% and none can exceed 100 \% (sorry, fellas, no extra credit).
It occurred to me that any score more than 2.5 standard deviations from the mean would be greater than 100 \% or less than 0 \%. For a normal distribution, an interval of \pm 2.5 standard deviations encompasses only about 98.8 \% of the scores, meaning that roughly 1.2 \% of the scores would have to fall outside this possible range.
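Here is a quick check I did of that arithmetic (a minimal sketch in Python, assuming scipy is available; norm.cdf is the standard normal CDF), which is where the roughly 1.2 \% figure above comes from:

```python
from scipy.stats import norm

x_bar = 0.50  # sample mean: 50 %
s = 0.20      # sample standard deviation: 20 %

# z-scores of the hard limits 0 % and 100 %
z_low = (0.00 - x_bar) / s    # -2.5
z_high = (1.00 - x_bar) / s   # +2.5

# fraction of a normal curve that falls outside [0 %, 100 %]
outside = norm.cdf(z_low) + (1.0 - norm.cdf(z_high))

print(f"limits in standard deviations: {z_low:+.1f} to {z_high:+.1f}")
print(f"fraction outside the possible range: {outside:.4f}")  # ~0.0124, i.e. about 1.2 %
```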
My question:
Is the above example even possible given the "parameters" (constraints? I can't find the right word) 0 \% \le x_i \le 100 \%?
And
Extrapolating this question to the general concept: can the standard deviation s of a normal distribution ever be large enough that the distribution's tails extend beyond the possible range of the data values?
My guess is that this was a simple oversight and an error on the part of my professor.
Thank you!