Does averaging standard deviations make sense?

In summary, the thread discusses using averages and standard deviations to analyze test results. The example shows the average and standard deviation for tests of different sizes, and the question is whether it makes sense to calculate the mean of those standard deviations. The thread also touches on combining variances in process improvement, where understanding the total variance helps in reducing total variation, and ends by asking whether all readers are familiar with variance and standard deviation.
  • #1
xeon123
I have a set of tests, and within each test the entries have different sizes.

I'll show an example.

Code:
Test1
size - Time(seconds)
100 - 10
100 - 23
100 - 17
200 - 37
200 - 42
200 - 47
300 - 53
300 - 53
300 - 53

For each size, I took the average of its times.
Code:
Average1
size - average
100 - 16
200 - 42
300 - 53

And for each size in Test1, I took the standard deviation. Doesn't it make sense to calculate the mean of the standard deviations?

Can averaging the standard deviations show that all runs took similar times? For example, does a low average of the standard deviations mean that the three results for size 100 were similar among themselves, the three results for 200 were similar among themselves, and the three results for 300 were similar among themselves?
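For concreteness, here is a minimal sketch of that computation in Python (standard library only), using the Test1 values above; the variable names are just illustrative:
Code:
# Group the Test1 times by size, take the per-size mean and sample
# standard deviation, then average those standard deviations.
from statistics import mean, stdev

test1 = {
    100: [10, 23, 17],
    200: [37, 42, 47],
    300: [53, 53, 53],
}

sds = []
for size, times in test1.items():
    sds.append(stdev(times))
    print(size, "- mean", round(mean(times), 2), "- sd", round(stdev(times), 2))

# The quantity the question asks about: the mean of the per-size SDs.
print("mean of standard deviations:", round(mean(sds), 2))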
 
  • #2
There are a few procedures used to test the hypothesis that the variances of two groups are equal, but they tend to be extremely unreliable (especially for your small N).

The standard deviation has a distribution, of course, and the distribution tends to be pretty normal for large N (which you don't have, so your distribution is going to be quite skewed). You're not going to be able to say much with confidence; the best thing to do may be to just point to the standard deviations of the three groups and note that they're pretty close to each other.
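Levene's test is one such procedure; here is a sketch using SciPy's scipy.stats.levene on the three size groups from post #1. Given the tiny samples, the caveat above applies in full:
Code:
# Levene's test for equality of variances across the three size groups.
# With only n = 3 per group the test has very little power.
from scipy.stats import levene

g100 = [10, 23, 17]
g200 = [37, 42, 47]
g300 = [53, 53, 53]

stat, p = levene(g100, g200, g300)
print(f"Levene statistic = {stat:.3f}, p-value = {p:.3f}")
# A large p-value fails to reject equal variances, but with samples
# this small that is only weak evidence that they really are equal.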
 
  • #3
I just showed a small example here. I have a set with N = 100.
 
  • #4
When combining data sets you should work with the variances and not the standard deviations.
 
  • #5
mathman said:
When combining data sets you should work with the variances and not the standard deviations.

Where variance is SD².

So I presume that SD(total) = √(SD₁² + SD₂² + ... + SDₙ²)

And SD(mean) = SD(total)/n

While I can't see a good use for SD(mean), being able to understand SD(total) is a powerful tool in process improvement. Often you can't remove every source of variation in a process, but you can tackle some of them individually and reduce the total that way.
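A sketch of those workings in Python; the SD values are illustrative, and adding variances this way assumes the sources of variation are independent:
Code:
# Combine independent sources of variation via their variances.
import math

sds = [6.5, 5.0, 0.0]                  # per-source standard deviations (illustrative)
variances = [sd ** 2 for sd in sds]    # variance = SD²

sd_total = math.sqrt(sum(variances))   # SD(total) = √(SD₁² + ... + SDₙ²)
sd_mean = sd_total / len(sds)          # SD(mean) = SD(total)/n
print("SD(total) =", round(sd_total, 2), "SD(mean) =", round(sd_mean, 2))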
 
  • #6
Bandit127 said:
Where variance is SD².

So I presume that SD(total) = √(SD₁² + SD₂² + ... + SDₙ²)

And SD(mean) = SD(total)/n

While I can't see a good use for SD(mean), being able to understand SD(total) is a powerful tool in process improvement. Often you can't remove every source of variation in a process, but you can tackle some of them individually and reduce the total that way.
Do you have a further question?
 
  • #7
Since you have not taken issue with it, I presume the workings are correct.

Should I take for granted that all the readers of this thread know what variance is and how to sum and take the average of standard deviations? If so my post is redundant.

If not, you could have prevented me from posting by explaining it yourself.
 

FAQ: Does averaging standard deviations make sense?

What is the average of standard deviations?

The average of standard deviations is the mean of the standard deviations computed for several groups of data. Each group's standard deviation is a measure of spread: take the squared differences between the group's data points and the group mean, average them, and take the square root. Averaging these per-group values gives a single summary of the typical within-group spread.
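A worked example with the numbers from this thread (population form, dividing by n):
Code:
Size 100: mean ≈ 16.67; squared deviations ≈ 44.4, 40.1, 0.1
          SD = √((44.4 + 40.1 + 0.1)/3) ≈ 5.31
Size 200: mean = 42; squared deviations = 25, 0, 25
          SD = √(50/3) ≈ 4.08
Size 300: mean = 53; all values are equal, so SD = 0
Average of standard deviations ≈ (5.31 + 4.08 + 0)/3 ≈ 3.13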

Why is it important to calculate the average of standard deviations?

Calculating the average of standard deviations gives a sense of how much the data points spread around each group's mean. This is useful for judging the consistency or variability of a dataset and for spotting outliers or unusual data points.

How is the average of standard deviations different from the mean?

The mean is a measure of central tendency, the average value of a dataset. The average of standard deviations, on the other hand, measures the spread or variability of the data within each group. Note that standard deviations are themselves sensitive to extreme values, since they square the deviations from the mean, so a single outlier in a group can inflate that group's standard deviation considerably.

Can the average of standard deviations be negative?

No, the average of standard deviations cannot be negative. Each standard deviation is the square root of an average of squared differences, which is always nonnegative; it is zero exactly when all values in a group are identical, as in the size-300 group above.

How can the average of standard deviations be used in statistical analysis?

The average of standard deviations can be used to compare the variability of different datasets, identify patterns or trends, and make predictions about future data points. A standard deviation can also be used to build a confidence interval for a group's mean, a range of values within which the true mean is likely to fall.
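As a sketch of the confidence-interval use mentioned above: a t-based interval for the mean of the size-100 group from this thread, using SciPy's t-distribution quantile function scipy.stats.t.ppf (the 95% level is just for illustration):
Code:
# 95% t-based confidence interval for one group's mean,
# built from the group's sample standard deviation.
import math
from statistics import mean, stdev
from scipy.stats import t

times = [10, 23, 17]                 # the size-100 group
n, m, s = len(times), mean(times), stdev(times)

half_width = t.ppf(0.975, df=n - 1) * s / math.sqrt(n)
print(f"95% CI for the mean: {m - half_width:.1f} to {m + half_width:.1f}")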
