I have seen similar threads on here, but none with a detailed answer, so I felt I would ask myself. I took a short undergraduate module in measurement and uncertainty, intended to prepare us for the numerous lab sessions and reports that would follow in subsequent modules. In that module the concept of uncertainty was introduced, along with a basic method of calculating the uncertainty from a set of results. Without going into detail, the method essentially relied on taking repeated measurements and deriving the uncertainty from some statistical analysis of those results.

What never crossed my mind at the time is the question: where do the accuracy (and precision) of the instrument used to record the results factor into this estimate? Suppose I make just one measurement using an instrument with a specified accuracy and then want to find the uncertainty of that single measurement; how is this achieved?

And finally, suppose I have both the manufacturer's stated accuracy for the instrument and its calibration tolerance. What is the relationship between the two, and how does each contribute to the uncertainty?
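For concreteness, the statistical method I was taught was something like the following sketch (my own reconstruction, not from any particular textbook): quote the mean of repeated readings as the value, and the standard error of the mean as the uncertainty.

```python
import math

def mean_and_uncertainty(measurements):
    """Estimate a quantity and its statistical (Type A) uncertainty from
    repeated readings: the mean, plus the standard error of the mean
    (sample standard deviation divided by sqrt(n))."""
    n = len(measurements)
    mean = sum(measurements) / n
    # Sample variance with Bessel's correction (n - 1 in the denominator)
    variance = sum((x - mean) ** 2 for x in measurements) / (n - 1)
    std_error = math.sqrt(variance / n)
    return mean, std_error

# Hypothetical example: five readings of g in m/s^2
readings = [9.78, 9.82, 9.80, 9.79, 9.81]
value, u = mean_and_uncertainty(readings)
print(f"{value:.3f} ± {u:.3f}")  # prints "9.800 ± 0.007"
```

My question is essentially where the instrument's stated accuracy would enter a calculation like this, especially when `n = 1` and no statistical spread is available at all.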