Standard Deviation Word Problem

In summary, this homework problem gives the average and standard deviation of time spent online and asks for the value exactly one standard deviation below the mean. Since the standard deviation (17 hours) is larger than the mean (14 hours), that value is negative, which is impossible for time spent online. This highlights the limitations of assuming a normal distribution for every situation.
  • #1
kwikness
17
0

Homework Statement


It has been projected that the average and standard deviation of the amount of time spent online using the Internet are, respectively, 14 and 17 hours per person per year (approximately normally distributed). What value is exactly 1 standard deviation below the mean?

Homework Equations


Empirical Rule
[tex]\mu \pm \sigma[/tex] contains approximately 68% of the measurements.
[tex]\mu \pm 2\sigma[/tex] contains approximately 95% of the measurements.
[tex]\mu \pm 3\sigma[/tex] contains almost all of the measurements.
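Plugging the problem's numbers into these intervals (a quick sketch; μ = 14 and σ = 17 come straight from the problem statement) shows that every lower endpoint is already negative:

```python
# Empirical-rule intervals for mu = 14, sigma = 17 (hours per year).
mu, sigma = 14.0, 17.0

for k in (1, 2, 3):
    low, high = mu - k * sigma, mu + k * sigma
    # The lower endpoint is negative for every k, which is the crux
    # of the question: one sigma below the mean is already -3 hours.
    print(f"mu +/- {k}*sigma: [{low}, {high}]")
```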

The Attempt at a Solution


In similar problems, the mean is the larger of the two numbers, so finding the value one standard deviation below the mean is a simple matter of subtracting the standard deviation from the mean.

In this case, though, the standard deviation (17) is greater than the mean (14). If I solve this the way I normally do, I end up with a negative value for time spent on the Internet.

Is this an error in the textbook or is there something I'm missing here?
 
  • #2
My guess is that the problem is meant to emphasize the shortcomings of assuming a normal distribution, e.g. the normal distribution spans the entire real line while examples may have a restricted domain, as in this case only positive reals.
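To put a number on that point, here is a minimal standard-library sketch of how much probability the assumed Normal(μ = 14, σ = 17) model places on negative times, using the identity Φ(z) = ½(1 + erf(z/√2)) for the standard normal CDF:

```python
import math

# Probability that a Normal(mu=14, sigma=17) variable is negative,
# i.e. how much mass the assumed model puts on impossible times.
# Phi(z) = 0.5 * (1 + erf(z / sqrt(2))) is the standard normal CDF.
mu, sigma = 14.0, 17.0
z = (0.0 - mu) / sigma
p_negative = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"P(X < 0) = {p_negative:.3f}")  # about 0.205, i.e. roughly 20% below zero
```

Roughly a fifth of the assumed distribution sits below zero, which makes the mismatch between the model and the data concrete.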
 
  • #3
Thanks
 
  • #4
It should be used to reinforce the idea that assuming things are normally distributed isn't always justified: the actual question as you describe it shows that the times can't be normal, for the reason you point out.
 

What is standard deviation?

Standard deviation is a measure of how spread out a data set is around its average: it tells you how far, typically, the data points deviate from the mean.

How is standard deviation calculated?

To calculate standard deviation, you first find the mean of the data. Then, for each data point, you subtract the mean and square the result. Next, find the average of these squared differences. Finally, take the square root of this average to get the standard deviation.
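Those four steps translate directly into code. A minimal sketch of the population standard deviation (the data set here is a made-up example, not from the thread):

```python
import math

# Population standard deviation, following the steps above:
# mean -> squared deviations -> average of those -> square root.
data = [2, 4, 4, 4, 5, 5, 7, 9]  # illustrative data, not from the problem

mean = sum(data) / len(data)                      # step 1: the mean
squared_diffs = [(x - mean) ** 2 for x in data]   # step 2: squared deviations
variance = sum(squared_diffs) / len(data)         # step 3: average of them
std_dev = math.sqrt(variance)                     # step 4: square root

print(std_dev)  # 2.0 for this data set
```

For a sample rather than a whole population, the average in step 3 is usually taken over n − 1 instead of n (Python's `statistics.stdev` does this; `statistics.pstdev` matches the version above).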

What does a high standard deviation mean?

A high standard deviation means that the data points are spread out from the mean. This indicates that the values in the data set are more varied and not as close to the average.

What does a low standard deviation mean?

A low standard deviation means that the data points are close to the mean. This suggests that the values in the data set are similar and not as varied.

How is standard deviation used in data analysis?

Standard deviation is used to measure the amount of variation or dispersion in a set of data. It is often used to determine the reliability of data and to compare the spread of data between different groups or samples. It can also be used to identify outliers in a data set.
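One common way to use it for outlier detection is to flag points more than a fixed number of standard deviations from the mean. A sketch of that rule of thumb (the 2σ threshold and the data are illustrative assumptions, not from the thread):

```python
import math

# Flag potential outliers: points more than 2 standard deviations
# from the mean (a common rule of thumb, not a universal test).
data = [14, 15, 13, 16, 14, 15, 60]  # illustrative data; 60 looks suspicious

mean = sum(data) / len(data)
std_dev = math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

outliers = [x for x in data if abs(x - mean) > 2 * std_dev]
print(outliers)  # [60]
```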
