Measurement Error

  • Thread starter Watts
Main Question or Discussion Point

Could someone please distinguish between measurement error and measurement uncertainty? I have held conversations at conferences on this issue with statisticians from NIST, and no one has ever given me a consistent answer. It seems to be more philosophical than anything. I was just wondering if anybody could shed some light on this subject.
 

Answers and Replies

EnumaElish
Science Advisor
Homework Helper
See Uncertainty. My "educated guess" is that uncertainty is minimized when, say, all error terms (in the sense of residual = measured value - true value) are either -1 or +1. But error can be further minimized when all error terms are randomly distributed, say, between -0.01 and +0.01, in which case uncertainty may be greater than in the previous (binary) case because the error terms are "all over" (albeit within a tiny interval).
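To make the two regimes concrete, here is a minimal numerical sketch (Python with NumPy; the sample size, seed, and interval widths are my own assumptions, not from the thread). It draws errors that are always exactly -1 or +1, then errors spread uniformly over [-0.01, +0.01], and reports the bias (mean error) and spread (standard deviation) of each:

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed for reproducibility
n = 100_000                      # arbitrary number of simulated measurements

# Regime 1: every error term is exactly -1 or +1 (the "binary" case).
binary = rng.choice([-1.0, 1.0], size=n)

# Regime 2: error terms scattered uniformly over a tiny interval.
uniform = rng.uniform(-0.01, 0.01, size=n)

for name, e in [("binary +/-1", binary), ("uniform +/-0.01", uniform)]:
    print(f"{name:16s} bias = {e.mean():+.5f}  "
          f"spread (std) = {e.std():.5f}  "
          f"|error| range = [{np.abs(e).min():.5f}, {np.abs(e).max():.5f}]")
```

In the binary case the size of each error is known exactly (always 1) even though the errors are large; in the uniform case each error is tiny but unpredictable within its interval. Whether one regime is "more uncertain" depends on whether uncertainty means the spread of the errors or the unpredictability of an individual error, which is exactly the ambiguity the thread is probing.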
 
balakrishnan_v
The former is the mean minus the actual value, and the latter is the variance.
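Read this way (my notation, not the poster's): error/bias = mean of the measurements minus the true value, and uncertainty = variance of the measurements. A minimal sketch with made-up numbers:

```python
import numpy as np

true_value = 10.0                                    # hypothetical known true value
readings = np.array([10.2, 9.9, 10.1, 10.3, 10.0])   # hypothetical repeated measurements

bias = readings.mean() - true_value   # "the former": mean minus actual value
variance = readings.var(ddof=1)       # "the latter": sample variance

print(f"bias     = {bias:+.3f}")      # systematic offset -> measurement error
print(f"variance = {variance:.3f}")   # spread of readings -> measurement uncertainty
```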
 
Thanks

That helps matters. The statement in the linked page ("Because of an unfortunate use of terminology in systems analysis discourse, the word "uncertainty" has both a precise technical meaning and its loose natural meaning of an event or situation that is not certain.") still leaves things open for discussion. I will study the information provided and the linked material further. On a second note, uncertainty is basically another way of measuring dispersion.
 
EnumaElish
Science Advisor
Homework Helper
balakrishnan_v said:
The former is mean-actual value and the latter is the variance
You are saying that uncertainty is identical to ± one standard deviation; is that correct?
 
