Mean / standard deviation

In summary, for an airline's flights with a mean arrival time of 5.8 min late and a standard deviation of 2.1 min, an appropriate x-scale for a histogram would span about 3-5 standard deviations on each side of the mean. This follows from a rule of thumb for the normal distribution: the probability of finding a value more than two standard deviations from the mean (on either side) is only about 5%, so an axis extending 3-5 standard deviations on each side captures essentially all of the data. This can also be seen in a plot of the normal distribution.
  • #1
whitehorsey
1. An airline's flights have a mean arrival time of 5.8 min late with a standard deviation of 2.1 min. What would be the appropriate x-scale for a histogram?



2. http://www.mathsrevision.net/gcse/sdeviation2.gif [Broken]



3. I tried to do this problem, but I couldn't think of any way to find the x-scale. Can you please explain to me how? Thank you!
 
  • #2
CompuChip
I don't know if this is really what they are asking, but in a normal distribution the probability of finding a value outside two standard deviations (on either side) of the mean is about 5%. So usually, if you fit 3-5 standard deviations on your x-scale, you are good.
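To make that rule of thumb concrete, here is a minimal Python sketch (assuming NumPy and Matplotlib are available; the flight delays are simulated for illustration, not real data) that sets the histogram's x-scale to the mean plus or minus three standard deviations:

```python
import numpy as np
import matplotlib.pyplot as plt

mean, sd = 5.8, 2.1  # minutes late, from the problem statement

# Simulated arrival delays for illustration only (not real flight data).
rng = np.random.default_rng(0)
delays = rng.normal(mean, sd, size=500)

# Rule of thumb: span the x-axis 3 standard deviations each side of the mean.
lo, hi = mean - 3 * sd, mean + 3 * sd  # -0.5 to 12.1 min

plt.hist(delays, bins=20, range=(lo, hi), edgecolor="black")
plt.xlim(lo, hi)
plt.xlabel("Arrival delay (min)")
plt.ylabel("Number of flights")
plt.show()
```

With these numbers the axis runs from about -0.5 to 12.1 minutes, which covers roughly 99.7% of a normal distribution.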
 
  • #3
CompuChip said:
I don't know if this is really what they are asking, but in a normal distribution the probability of finding a value outside two standard deviations (on either side) of the mean is about 5%. So usually, if you fit 3-5 standard deviations on your x-scale, you are good.

They're asking for what you said, but I don't get what you mean about the 5%, and how did you get 3-5 standard deviations?
 
  • #4

What is the mean?

The mean, also known as the average, is a measure of central tendency that represents the sum of all values in a dataset divided by the number of values in the dataset.

How is the mean calculated?

The mean is calculated by adding up all the values in a dataset and then dividing the sum by the number of values in the dataset. This is represented by the formula:
Mean = (x1 + x2 + x3 + ... + xn) / n, where x1, x2, x3, ... xn are the values in the dataset and n is the number of values.
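As a quick illustration of this formula (a minimal sketch; the sample values below are hypothetical, chosen so the mean comes out to 5.8 like the problem's):

```python
# Hypothetical arrival delays in minutes (made-up values).
values = [5.1, 7.3, 4.8, 6.2, 5.6]

# Mean = (x1 + x2 + ... + xn) / n
mean = sum(values) / len(values)
print(mean)  # 5.8
```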

What is standard deviation?

The standard deviation is a measure of variability that quantifies the amount of dispersion of a dataset around the mean. It tells us how much the values in a dataset deviate from the mean.

How is standard deviation calculated?

The standard deviation is calculated by finding the difference between each value in a dataset and the mean, squaring those differences, adding them up, dividing by the number of values, and then taking the square root. This is represented by the formula:
Standard deviation = √{[(x1 - mean)^2 + (x2 - mean)^2 + (x3 - mean)^2 + ... + (xn - mean)^2] / n}, where x1, x2, x3, ... xn are the values in the dataset, mean is the mean of the dataset, and n is the number of values. Note that the division by n happens inside the square root.
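And the same hypothetical sample run through the standard deviation formula (a sketch using the population form of the formula, with divisor n):

```python
import math

# Same hypothetical sample as in the mean example above.
values = [5.1, 7.3, 4.8, 6.2, 5.6]
mean = sum(values) / len(values)

# Standard deviation = sqrt([(x1-mean)^2 + ... + (xn-mean)^2] / n)
variance = sum((x - mean) ** 2 for x in values) / len(values)
sd = math.sqrt(variance)
print(round(sd, 2))  # ≈ 0.89
```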

What does the standard deviation tell us?

The standard deviation tells us how much the values in a dataset vary from the mean. A smaller standard deviation indicates that the values are closer to the mean, while a larger standard deviation indicates that the values are more spread out. It is a useful measure for understanding the distribution of a dataset and identifying outliers.
