How do you calculate measurement uncertainty from precision error and bias error?
1. The problem statement, all variables and given/known data
Find the measurement uncertainty using the precision error and bias error.
The length of the line is 5.625 in, the precision error is 0.13 in, and the bias error is -0.115 in. What is the uncertainty of the measurement?
2. Relevant equations
3. The attempt at a solution
I understand what measurement uncertainty is, but I don't know how to calculate it from the precision error and the bias error. My understanding is that the uncertainty of a standard inch ruler with 1/8-in graduations should be (1/8) / 2 = 1/16 in, but that doesn't seem to apply here... right?
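For what it's worth, a common convention in uncertainty analysis (e.g., the ASME PTC 19.1 style) is to combine the bias (systematic) and precision (random) errors in quadrature, i.e., by root-sum-square. Assuming that convention applies here, a quick sketch with the given numbers:

```python
import math

def total_uncertainty(bias_error: float, precision_error: float) -> float:
    """Combine bias and precision errors in quadrature (root-sum-square),
    a common convention in measurement uncertainty analysis."""
    return math.sqrt(bias_error**2 + precision_error**2)

# Values from the problem statement
length = 5.625       # in
precision = 0.13     # in
bias = -0.115        # in (the sign drops out when squared)

u = total_uncertainty(bias, precision)
print(f"L = {length} +/- {u:.3f} in")  # -> L = 5.625 +/- 0.174 in
```

Note the 1/16-in reading resolution you mention is a separate contribution; here the problem hands you the precision and bias errors directly, so the combination above is all that's asked for.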