Is my textbook wrong about absolute uncertainties?

  • Thread starter: heroslayer99
  • Tags: Precision, Textbook
SUMMARY

This discussion centers on the definitions and applications of precision, accuracy, uncertainty, and resolution in measurements, particularly in the context of a telescope's performance. The participant expresses confusion regarding the textbook's use of the term "precision" as a quantifiable value of 9.7 × 10^-4 arcseconds, suggesting it may actually refer to resolution. The participant also questions the relationship between absolute uncertainty and resolution, indicating a potential error in the textbook. The conversation highlights the importance of clarity in scientific definitions and the implications for interpreting measurement data.

PREREQUISITES
  • Understanding of measurement concepts: precision, accuracy, uncertainty, and resolution
  • Familiarity with scientific notation and its application in measurements
  • Basic knowledge of systematic and random errors in data collection
  • Experience with digital measurement devices, such as voltmeters
NEXT STEPS
  • Research the ISO definitions of accuracy and precision in scientific measurements
  • Learn about the relationship between resolution and absolute uncertainty in measurement instruments
  • Explore methods for reducing systematic errors through averaging multiple readings
  • Investigate the implications of measurement uncertainty in astronomical observations
USEFUL FOR

Students and professionals in the fields of physics, astronomy, and engineering who require a clear understanding of measurement accuracy and precision, as well as anyone involved in data analysis and interpretation in scientific research.

heroslayer99
Homework Statement
The Hipparcos space telescope used stellar parallax with a precision of 9.7 × 10^-4 arcseconds to determine the distance to stars. Estimate the maximum stellar distance in parsecs that could be measured using Hipparcos. Calculate the percentage uncertainty in the calculated value of the distance to Polaris A if the parallax angle is 7.5 × 10^-3 arcseconds.
Relevant Equations
d = 1/p
First off, I will set out what I think I know so that any misconceptions of mine can be put right.
Definitions:
Precision: a quality denoting the closeness of agreement (i.e. consistency, low variability) between measured values obtained by repeated measurements
Accuracy: A quality denoting the closeness of agreement between a measured value and the true value
Uncertainty: interval within which the true value can be expected to lie
Resolution: Smallest increment on the instrument
In the case of a single reading, the absolute uncertainty is half the resolution of the instrument. In the case of a measurement (the difference between two readings), the absolute uncertainty is twice that of a single reading, which is clearly just the resolution. For digital devices, like a voltmeter, approximate the absolute uncertainty as half the resolution (the same as for a single reading).
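For concreteness, here is a minimal numeric sketch of those rules as I have stated them, using a made-up 1 mm ruler and a 0.01 V digital voltmeter (the instruments and numbers are purely illustrative):

Python:
# Illustrating the rules above with hypothetical instruments.
ruler_resolution = 1.0  # mm, smallest increment on the ruler

# Single reading (e.g. the position of one end of an object):
single_reading_unc = ruler_resolution / 2   # 0.5 mm

# A length is the difference of two readings, so it carries both uncertainties:
length_unc = 2 * single_reading_unc         # 1.0 mm, i.e. the full resolution

# Digital device: treat like a single reading.
voltmeter_resolution = 0.01                 # V
voltage_unc = voltmeter_resolution / 2      # 0.005 V

print(single_reading_unc, length_unc, voltage_unc)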
What confuses me greatly is that in my problem I am told that the "precision" of the telescope is 9.7 × 10^-4 arcseconds, but from what I already know, precision (at least at the level I am working at, and the level the textbook is written for) cannot be quantified as a single number, so I do not know what that value means; most likely it is the resolution and the author has made a mistake. I am also confused as to whether the absolute uncertainty is the quoted "resolution" or half this value. Finally, if this is the resolution, then the stated value for the parallax angle should be a multiple of it, but it turns out it isn't.
Please help :(
 
Why make things difficult (if with a little bit of effort they can be made bloody impossible :wink:)?

What if the observed parallax is simply ##75 \ \times 10^{-4} \ \ \pm \ 9.7 \ \times 10^{-4}## ? Can you answer the question?

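Taking that reading literally, i.e. treating the quoted 9.7 × 10^-4 arcseconds as the absolute uncertainty on any measured parallax (and as the smallest parallax distinguishable from zero for the "maximum distance" part), the arithmetic is short. A sketch under that assumption, not necessarily the textbook's intended route:

Python:
# Sketch assuming the quoted "precision" is the absolute uncertainty on the
# parallax angle, and the smallest parallax distinguishable from zero.
p_min = 9.7e-4       # arcsec: smallest measurable parallax
p_polaris = 7.5e-3   # arcsec: given parallax of Polaris A
dp = 9.7e-4          # arcsec: absolute uncertainty on the parallax

d_max = 1 / p_min                # parsecs, from d = 1/p
pct_unc = dp / p_polaris * 100   # to first order, %Δd ≈ %Δp since d = 1/p

print(f"maximum distance ≈ {d_max:.0f} pc")         # ≈ 1030 pc
print(f"percentage uncertainty ≈ {pct_unc:.0f} %")  # ≈ 13 %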
 
heroslayer99 said:
Accuracy: A quality denoting the closeness of agreement between a measured value and the true value
According to https://en.wikipedia.org/wiki/Accuracy_and_precision, ISO defines it that way but others limit it to systematic error, making it independent of precision.
Presumably, resolution errors would be categorised as systematic (though by deliberately adding random errors and taking multiple readings they can be converted to random errors and thereby reduced; see the sketch after this post).
heroslayer99 said:
in the case of a measurement (the difference between two readings)
Why would you describe the difference between two readings as a "measurement"?

But, yes, I agree with you, it is sloppy. If precision (random error) is the only issue, then you can get greater overall accuracy by averaging multiple readings. I would interpret "precision" here as meaning uncertainty from whatever cause.
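A minimal simulation of the dithering idea above: an undithered coarse instrument returns the same quantised value on every repeat, so averaging gains nothing, whereas adding random noise of roughly one resolution step before quantising and then averaging many readings recovers the underlying value. The resolution, true value, and noise distribution here are illustrative assumptions:

Python:
import random

resolution = 1.0    # hypothetical instrument resolution
true_value = 3.37   # hypothetical true quantity
n = 100_000         # number of repeated readings

def quantise(x):
    # Reading reported by the instrument: nearest multiple of the resolution.
    return round(x / resolution) * resolution

# Without dither every reading is identical, so the mean stays at 3.0:
no_dither = quantise(true_value)

# With uniform dither of +/- half a resolution step, the mean converges to 3.37:
readings = [quantise(true_value + random.uniform(-0.5, 0.5) * resolution)
            for _ in range(n)]
with_dither = sum(readings) / n

print(no_dither, with_dither)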
 
