Measurement Error vs Uncertainty: Philosophical Differences

AI Thread Summary
Measurement error refers to the difference between a measured value and the true value, while measurement uncertainty describes the range of values within which the true value is expected to lie, reflecting the confidence in the measurement. The discussion notes that, depending on how the terms are defined, uncertainty can behave counterintuitively: tightly clustered but randomly distributed errors may be assigned a larger uncertainty than errors of fixed magnitude. Philosophically, the terms are confusing because each carries both a precise technical meaning and a looser everyday meaning. The conversation emphasizes the need for clear definitions, particularly in statistical discourse, since understanding these concepts is essential for sound data interpretation and analysis.
Watts
Could someone please explain the difference between measurement error and measurement uncertainty? I have held conversations at conferences over this issue with statisticians from NIST, and no one has ever been able to give me a consistent answer. It seems to be more philosophical than anything. I was just wondering if anybody could share some thoughts on this subject.
 
See Uncertainty. My "educated guess" is that uncertainty is minimized when, say, all error terms (in the sense of residual = measured value - true value) are either -1 or +1. But error can be further minimized when all error terms are randomly distributed, say, between -0.01 and +0.01, in which case uncertainty may be greater than in the previous (binary) case because the error terms are "all over" (albeit within a tiny interval).
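To make the two cases concrete, here is a minimal sketch in Python (not from the thread; it assumes "uncertainty" is quantified as the sample standard deviation of the error terms, a definition the thread does not actually fix):

```python
# Compare the spread of the two error scenarios described above, assuming
# "uncertainty" is measured by the sample standard deviation of the errors.
import random
import statistics

random.seed(0)
n = 10_000

# Case 1: every error term is exactly -1 or +1 (the "binary" case).
binary_errors = [random.choice([-1.0, 1.0]) for _ in range(n)]

# Case 2: error terms randomly distributed between -0.01 and +0.01.
uniform_errors = [random.uniform(-0.01, 0.01) for _ in range(n)]

print(f"binary  case: mean error = {statistics.mean(binary_errors):+.4f}, "
      f"std dev = {statistics.stdev(binary_errors):.4f}")
print(f"uniform case: mean error = {statistics.mean(uniform_errors):+.4f}, "
      f"std dev = {statistics.stdev(uniform_errors):.4f}")
```

Under this particular choice of definition, the tightly clustered uniform errors come out with the smaller standard deviation; whether that is the right way to quantify "uncertainty" is precisely the point in dispute.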
 
The former is the mean minus the actual value; the latter is the variance.
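Read literally, that claim would amount to something like the following sketch (hypothetical readings and an assumed known true value, purely for illustration):

```python
# Hypothetical illustration of the claim above: "error" as the mean minus
# the actual value, and "uncertainty" as the variance of repeated readings.
import statistics

true_value = 10.00                                # assumed known, for illustration
readings = [10.02, 9.98, 10.05, 9.97, 10.03]      # hypothetical repeated readings

error = statistics.mean(readings) - true_value    # systematic offset (bias)
uncertainty = statistics.variance(readings)       # sample variance of the readings

print(f"error (mean - actual): {error:+.4f}")
print(f"uncertainty (variance): {uncertainty:.6f}")
```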
 
Thanks

That helps matters. The statement in the linked article ("Because of an unfortunate use of terminology in systems analysis discourse, the word "uncertainty" has both a precise technical meaning and its loose natural meaning of an event or situation that is not certain.") still leaves things open for discussion. I will study the information and the linked material further. On a second note, it is basically another way of measuring dispersion.
 
balakrishnan_v said:
The former is mean-actual value and the latter is the variance
You are saying that uncertainty is identical to ±1 standard deviation; is that correct?
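For context, a hedged aside: in the NIST/GUM convention the standard uncertainty of the mean of ##n## repeated readings is indeed reported in standard-deviation units,

$$u(\bar{x}) = \frac{s}{\sqrt{n}}, \qquad s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},$$

so the variance is simply its square; whether that quantity deserves the name "uncertainty" is exactly what this thread is debating.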
 