nobahar said:
Hello!
Quick question:
Is this valid:
\(\frac{\text{standard deviation}}{\text{average}} \times 100\)?
The standard deviation is just a measure of the spread of the data from the average, so by doing the above, is it retaining its 'representation' as a measure of spread, just as a percentage of the average form?
Thanks in advance.
This is the definition of the coefficient of variation, a measure of relative dispersion. Technically it gives the standard deviation as a percentage of the mean, but the intuitive use is this: the larger this value is, the greater the spread of the numbers around their mean.
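Written as a formula, with \(\sigma\) for the standard deviation and \(\mu\) for the mean (this is just the standard way of stating the quantity you described):
\[
CV = \frac{\sigma}{\mu} \times 100\%
\]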
It springs from normal distributions. Consider these two situations:
a) A population with incomes that are normally distributed with mean $50,000 and standard deviation $5,000. If I take the ±2 standard deviation spread, it runs from $40,000 to $60,000, a range of twenty thousand dollars. It is a simple matter to find the percentage of incomes in this range (roughly 95%, by the usual normal-distribution rule). Here the coefficient of variation is 10% ($5,000 is 10% of $50,000).
b) Now consider another group of incomes, normally distributed, with mean $2,000,000 (two million dollars) and standard deviation $200,000. The ±2 standard deviation spread runs from $1,600,000 to $2,400,000, a range of eight hundred thousand dollars. On an absolute scale (purely in dollars) income is much more widely spread here than in 'a'; that's the influence of the larger standard deviation. Notice, however, that the percentage of incomes in this range is the same as it was in 'a'. Here also, the coefficient of variation is 10% ($200,000 is 10% of $2,000,000): relatively speaking, the incomes in this group are spread around the mean in the same pattern as those in 'a'.
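If it helps, here is a quick numerical check of the two examples as a minimal Python sketch. The figures are the ones used above; the function name coefficient_of_variation is just a label I chose for illustration.

```python
import math

def coefficient_of_variation(std_dev, mean):
    """Standard deviation expressed as a percentage of the mean."""
    return std_dev / mean * 100

# Group a): mean $50,000, standard deviation $5,000
print(coefficient_of_variation(5_000, 50_000))       # 10.0

# Group b): mean $2,000,000, standard deviation $200,000
print(coefficient_of_variation(200_000, 2_000_000))  # 10.0

# The +-2 standard deviation ranges differ widely in absolute dollars:
print(4 * 5_000)     # 20000   (group a: $40,000 to $60,000)
print(4 * 200_000)   # 800000  (group b: $1,600,000 to $2,400,000)

# ...but the fraction of a normal population lying within +-2 standard
# deviations is the same for both groups, no matter what the mean is:
print(math.erf(2 / math.sqrt(2)))   # about 0.954, i.e. roughly 95%
```

The point the numbers make is the same one made in words above: the absolute spreads are very different, but relative to their means the two groups are spread out in exactly the same way.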
The intuitive use of this quantity is to compare relative variability among several groups of data: the larger the coefficient, the greater the relative variability. The only restriction is that it applies only when the measurements are positive.