Std dev is different depending on the scale?

  • Context: Undergrad
  • Thread starter: Lobotomy
  • Tags: Scale
SUMMARY

The discussion clarifies that standard deviation (std dev) is dependent on the scale of measurement used. When measuring lengths in millimeters, a variance of 100 mm² results in a std dev of 10 mm. Conversely, when the same lengths are measured in meters, the variance converts to 0.0001 m², leading to a std dev of 0.01 m, which is equivalent to 10 mm. This highlights the importance of maintaining consistent units when calculating variance and standard deviation.

PREREQUISITES
  • Understanding of variance and standard deviation concepts
  • Familiarity with units of measurement (millimeters and meters)
  • Basic knowledge of square roots and their applications
  • Concept of unit conversion in statistical calculations
NEXT STEPS
  • Research the implications of unit conversion on statistical measures
  • Learn about variance and standard deviation in different measurement systems
  • Explore the relationship between variance, standard deviation, and data distribution
  • Study the mathematical properties of square roots in statistical contexts
USEFUL FOR

Statisticians, data analysts, students studying statistics, and anyone involved in quantitative research who needs to understand the impact of measurement scales on statistical calculations.

Lobotomy
hello
std dev is the square root of the variance.

assume we measure lengths of something normally distributed, and we use millimeters.
we calculate our variance to be 100mm, and thus the std dev to be sqrt(100) = 10mm.

but if we instead measured the same objects in meters, we'd get the variance to be 0.1m (exactly the same as 100mm), but then the std dev is sqrt(0.1) = 0.3162, which is 316mm!

so has our std dev suddenly increased from 10mm to 316mm just by using a different scale when measuring the same objects?
 
Your variance would be 100 mm², not just mm (you should check that the units of variance are those of the variable squared), so when converting to meters you need to convert mm² to m².

Otherwise, when you took the square root, your standard deviation would have units of the square root of a meter, which is pretty weird.
 
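The point about squared units can be checked numerically. A minimal sketch, using a hypothetical sample of lengths (the values below are illustrative, not from the thread): converting a variance from mm² to m² means dividing by 1000², not by 1000.

```python
import math
import statistics

# Hypothetical sample of lengths in millimeters (illustrative values).
lengths_mm = [95.0, 105.0, 110.0, 90.0, 100.0]
lengths_m = [x / 1000.0 for x in lengths_mm]  # same lengths in meters

var_mm = statistics.pvariance(lengths_mm)  # units: mm^2
var_m = statistics.pvariance(lengths_m)    # units: m^2

# Variance carries squared units, so mm^2 -> m^2 divides by 1000**2.
assert math.isclose(var_m, var_mm / 1000**2)
```

The assertion holds for any sample: rescaling every measurement by a factor c rescales the variance by c².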
Office_Shredder said:
Your variance would be 100 mm², not just mm (you should check that the units of variance are those of the variable squared), so when converting to meters you need to convert mm² to m².

Otherwise, when you took the square root, your standard deviation would have units of the square root of a meter, which is pretty weird.

ok i see. thanks.

so the first time we had a variance of 100mm^2 and a std dev of 10mm.
then we measure in meters,
which means we have a variance of 0.0001 m^2 (0.0001 m^2 = 100 mm^2).
so we calculate the std dev sqrt(0.0001) = 0.01 m, which is equal to 10mm! makes more sense
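The resolution above can be sketched directly with the thread's own numbers: a variance of 100 mm² equals 0.0001 m², and both give the same standard deviation once units are tracked.

```python
import math

var_mm2 = 100.0              # variance in mm^2 (from the thread)
var_m2 = var_mm2 / 1000**2   # = 0.0001 m^2: convert mm^2 -> m^2

std_mm = math.sqrt(var_mm2)  # 10 mm
std_m = math.sqrt(var_m2)    # 0.01 m

# 0.01 m is exactly 10 mm: the same spread on either scale.
assert math.isclose(std_m * 1000.0, std_mm)
```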
 
