
jenny777


I used the micrometer in my lab, which has a resolution of 100 nm.

So my measurement looks something like:

0.2345 mm, with an uncertainty of 0.00005 mm.

But I don't want to write (0.2345 +/- 0.00005) mm in my data table, because it looks a little awkward to have so many zeros in the table.

Is there a better way of writing the measurement above (with its uncertainty)?

Also, I noticed that there are two types of error: one is the standard error, and the other is the resolution error.

How can I combine the two? Will my resolution error be 50 nm? I'm subtracting two measurements to get delta d, so will my reading error be sqrt(50^2 + 50^2) ≈ 71 nm?
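The quadrature sum in the last question can be checked numerically. This is just a sketch of the arithmetic, assuming the two readings are independent and that half the 100 nm resolution is taken as each reading's uncertainty (conventions vary; some labs use the full resolution):

```python
import math

# Assumption: reading uncertainty of a single measurement is taken
# as half the instrument resolution (100 nm / 2 = 50 nm).
single_reading_err_nm = 100 / 2

# For a difference of two independent readings, d2 - d1, the
# individual uncertainties combine in quadrature.
delta_d_err_nm = math.sqrt(single_reading_err_nm**2 + single_reading_err_nm**2)

print(f"{delta_d_err_nm:.1f} nm")  # 70.7 nm, i.e. about 71 nm
```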

Thank you