I wanted to ask something concerning measurement errors...
Why is the error sometimes taken to be $\sqrt{N}$, where $N$ is your measured value (such as the number of counts from a detector), and why is it sometimes given as half of your device's precision (e.g. a common ruler's error is 0.5 mm)?
In my case I had an electronic timer which measured the time of a wheel going left or right, with a precision of 10 ms. Since it was automatic, I think it gets the measured time from electric signals [as it does for the counted events], so I am not sure whether its error is 5 ms or $\sqrt{t}$ for a measured time $t$.
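To make the distinction concrete, here is how I currently understand the two conventions, written as a small Python sketch (all the numbers are made up for illustration):

```python
import math

# Case 1: counting experiment. Detector counts follow Poisson
# statistics, so the statistical error on N counts is sqrt(N).
N = 3600                        # hypothetical number of counts
sigma_N = math.sqrt(N)
print(f"N = {N} +/- {sigma_N:.0f} counts")

# Case 2: instrument resolution. For a ruler or a digital readout,
# a common convention is half the smallest division / last digit.
resolution = 10.0               # timer resolution in ms
sigma_t = resolution / 2.0      # half-resolution convention -> 5 ms
t = 1234.0                      # hypothetical measured time in ms
print(f"t = {t} +/- {sigma_t} ms")

# If a quantity suffers both kinds of error, independent errors
# combine in quadrature: sigma = sqrt(sigma_stat**2 + sigma_res**2)
```

My uncertainty is essentially which of the two cases above applies to the timer.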