Accuracy & Precision: Definition, Understanding & Examples

  • Context: High School
  • Thread starter: Hlud
  • Tags: Accuracy, Definition

Discussion Overview

The discussion centers around the definitions and distinctions between accuracy and precision in measurements, exploring their implications in scientific contexts. Participants share their interpretations, examples, and challenges in defining these concepts, particularly in relation to the existence of 'true' values in science.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants define precision as both the repeatability of measurements and their specificity, while accuracy is often described as how close measurements are to a 'true' value.
  • There is a contention regarding the existence of a 'true' value in science, with some arguing that accepted values are based on rigorous testing and repeated experiments.
  • One participant mentions that while there may not be a definitive 'true value', confidence in accepted values can arise from multiple supporting measurements, citing the charge of an electron as an example.
  • Another participant suggests that international standards maintained by National Measurement Institutes provide a practical reference for 'true values' in measurements.
  • There is a proposal that accuracy relates to systematic errors while precision pertains to random errors, which some participants agree with.
  • The analogy of a dartboard is used to illustrate the relationship between accuracy and precision, where tightly clustered darts indicate precision and darts near the bullseye indicate accuracy.
  • Participants note the difficulty in estimating accuracy without knowing the 'ground truth' to compare against, raising questions about the purpose of measurements in such cases.
  • Examples are shared to illustrate the difference between precision and accuracy, including anecdotes about measurement devices and historical records.

Areas of Agreement / Disagreement

Participants express various interpretations of accuracy and precision, with some agreeing on certain characterizations while others raise questions or provide counterexamples. The discussion remains unresolved regarding the definitive nature of 'true' values and the implications for measurement accuracy.

Contextual Notes

Participants highlight the limitations of defining accuracy and precision without a clear understanding of 'ground truth' and the challenges in measuring certain constants, such as the universal gravitation constant.

Hlud
Hey, I am trying to teach accuracy and precision. I understand precision to mean two things: how repeatable measurements are (variation), and how specific measurements are (sensitivity).

I am having a bigger problem defining accuracy. Most places I look define accuracy as how close measurements are to some true value. However, there is no 'true' value in science, only a value that is accepted as 'true'. How accurate are these accepted values? How do we determine their accuracy?
 
My interpretation is that after experiments are repeatedly conducted and keep yielding the same result, to some order of magnitude, that value becomes the accepted one. Only rigorous testing lets us conclude that the speed of light is roughly 3E8 m/s in any 'true' sense. However, you can also derive the speed of light from other quantities whose values are themselves accepted through experiment or derivation. That's my take on it anyway.

Oh, also on the point that no measurement is absolute/true in experimentation: that is correct, but we generally stop at some order of magnitude, since for our purposes it is unnecessary to know the 90th decimal of pi when we can just use 3.141592 (unless you are a mathematician).
 
Hlud said:
Hey, I am trying to teach accuracy and precision. I understand precision to mean two things: how repeatable measurements are (variation), and how specific measurements are (sensitivity).

That's a good understanding of what precision means.

Hlud said:
there is no 'true' value in science

In a sense there is no way of knowing what the 'true value' of any measurement is, but that doesn't mean true values don't exist. For example, I can imagine that there truly is a single value for the charge of an electron. When Millikan first performed the oil drop experiment to measure this charge, he was able to state the result in terms of the precision he thought he had attained. The experiment, and others like it, were repeated by others, and we now have the current accepted value for the charge on the electron. Is this the true value? No - there is uncertainty (and good texts will include the uncertainty). But because many measurements of the electron's charge support each other, we can be confident that the 'true' value lies somewhere in the accepted value's range.

Hlud said:
How accurate are true values? How do we determine the accuracy of these values?

Interestingly, Millikan's result does not agree with today's accepted value. If memory serves, he used an incorrect value for the viscosity of air, and this systematic uncertainty threw off his results (which was not noticed until some time after the experiment, and others like it, had been performed). So accepted values are determined by comparing careful experiments.

Another example you might be interested in is the universal gravitation constant, G. This constant has been notoriously difficult to pin down. Even recent measurements differ significantly, on the order of tenths of a percent (a much larger spread than for any other professionally measured constant). See the article at http://www.npl.washington.edu/eotwash/bigG
 
From a practical point of view, I believe you can safely assume that the "true value" in this context refers to the international standards maintained by National Measurement Institutes (such as NIST, PTB, NPL, etc.). They regularly publish values for constants via CODATA, and all measurement devices can (or at least should) trace their calibration back to one of those NMIs; instruments are sometimes supplied with a calibration certificate. So when discussing the accuracy of an instrument, you are really referring to the national standards (unless you are measuring, e.g., the speed of light, which is a constant with an exact value by definition).

Note also that this is, for most measurements, of little practical importance: the various standards at the NMIs can in most cases be realized with a precision at least two orders of magnitude (often more) better than what you would find in a "consumer" instrument. Hence, the accuracy of a normal instrument will never be limited by the realization of the standard. The only time this becomes an issue is if you are working with, e.g., optical atomic clocks, which are much more precise (by about 3 orders of magnitude) than the Cs-133 standard that defines the second, but of course cannot really be said to be more accurate.
 
Thanks for the information, guys! Just one more question: would I be correct in stating that accuracy refers to the systematic/deterministic error of an experiment, while precision refers to the random/nondeterministic error?
 
I think that is a good characterization. The typical analogy is that of the dartboard. If your darts (measurements) are tightly clustered they are precise. If they are on or near the bullseye they are accurate.
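The dartboard picture can also be sketched numerically. Here is a minimal simulation (hypothetical numbers, assuming a "true value" of 9.81): a constant bias models systematic error and shifts the mean away from the true value (inaccuracy), while Gaussian noise models random error and widens the spread (imprecision).

```python
import random

random.seed(0)

TRUE_VALUE = 9.81  # the "bullseye": the value we are trying to measure

def simulate(bias, noise, n=1000):
    """Return n simulated measurements with a fixed systematic
    offset (bias) and Gaussian random scatter (noise)."""
    return [TRUE_VALUE + bias + random.gauss(0, noise) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

# Precise but inaccurate: tight cluster, far from the bullseye
precise = simulate(bias=0.50, noise=0.01)
# Accurate but imprecise: centred on the bullseye, widely scattered
accurate = simulate(bias=0.00, noise=0.50)

print(f"precise:  mean={mean(precise):.3f}  spread={stdev(precise):.3f}")
print(f"accurate: mean={mean(accurate):.3f}  spread={stdev(accurate):.3f}")
```

Note that the spread (precision) is visible from the data alone, while the bias (accuracy) only shows up because the simulation knows the true value; in a real experiment it must be hunted down by other means.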
 
Accuracy involves the assessment of all uncertainties, while precision involves the assessment of uncontrollable fluctuations in measurements, so the two are interrelated. It is not rational to quote a measurement to high precision if the result is known to be inaccurate, and it would be similarly absurd to call a measurement accurate if its precision were low. Accuracy must take the precision of the measurement into account.
 
But yes, it is hard to make estimates about the accuracy when you don't know "ground truth" to compare against. You are left with trying to eliminate any possible error, and then looking at the distribution of the measurements.
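In quantitative terms, "looking at the distribution of the measurements" usually means quoting the sample mean together with its standard error. A minimal sketch, using made-up readings:

```python
import statistics

# Hypothetical repeated readings of the same quantity
readings = [9.79, 9.83, 9.80, 9.82, 9.78, 9.81, 9.84, 9.80]

mean = statistics.mean(readings)
# Sample standard deviation characterizes the random scatter (precision)
sd = statistics.stdev(readings)
# Standard error of the mean: how well the mean itself is pinned down
sem = sd / len(readings) ** 0.5

print(f"{mean:.3f} +/- {sem:.3f}")  # prints "9.809 +/- 0.007"
```

This quantifies precision only: a systematic error would shift every reading identically and leave both the spread and the standard error unchanged.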
 
rumborak said:
But yes, it is hard to make estimates about the accuracy when you don't know "ground truth" to compare against. You are left with trying to eliminate any possible error, and then looking at the distribution of the measurements.

But if you knew the "ground truth" what is the purpose of the measurement?
 
Well, calibrating your gear for example.
 
rumborak said:
But yes, it is hard to make estimates about the accuracy when you don't know "ground truth" to compare against. You are left with trying to eliminate any possible error, and then looking at the distribution of the measurements.

To extend the dartboard analogy just a little; simply imagine that the board itself is invisible. Indeed, one would trust tightly clustered darts as a better representation of the true value than ones that were spread out more. However, it would be hard to choose one set of tight measurements over another set of tight measurements if there was a significant discrepancy between the two sets without knowing something about the way the measurements were taken (and even then it might not be clear if one or both of the sets of measurements were in error). A nice illustration of how difficult it can be to detect systematic error.
 
Case in point: I'm sure the team that claimed faster-than-light neutrinos had a neat cluster of measurements above c. Only a systematic error had put that cluster in entirely the wrong ballpark.
 
My favorite example: some years ago I bought a digital thermometer. It presented the temperature to two decimal places (precision), but when compared with a calibrated mercury thermometer it was about 1 degree off (accuracy).

Another example (presented to me in a university lecture on measurement theory): in the early 60s an American set a new world record in the javelin throw. The record was (naturally) measured in feet and inches. When reporting it, a Norwegian newspaper journalist had carefully converted it to metric as 77.35721 m. The last two digits subdivide a grain of sand, and the mark a javelin makes on landing is about 30 cm long!

Another real-world example: I once designed and implemented a system for measuring the depth of several measurement points below the sea. When I asked the customer what precision he needed, the answer was "better than 1 cm". The problem: even on a very calm day, the waves were about 0.5 m peak-to-trough. So, better than 1 cm relative to what?
 
