# Reducing uncertainty

1. Feb 12, 2009

### bocobuff

1. The problem statement, all variables and given/known data
I want to lower the uncertainty of an experiment from 0.03 s to 0.012 s by performing some number of additional measurements. I already have the mean, standard deviation, and variance.

I know I need the factor of 0.012/0.03 somehow but I don't know where to apply it.
Any suggestions?

2. Feb 12, 2009

### LowlyPion

Identify the measurement that introduces the most error and explore ways to improve it. It does little good to reduce the uncertainty of quantities you already know with sufficient precision.

3. Feb 12, 2009

### bocobuff

That's not really what I was asking.
So say I already took N = 7 measurements and have the best estimated uncertainty for the mean time, which is 0.03 s.
Now I want to reduce that number to 0.012 by taking some new number of measurements.
Wouldn't I just use the ratio 0.03/0.012 = 2.5 and then multiply that by 7?
So I would need to take 17.5, or 18, measurements to reduce my uncertainty to 0.012...?
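A caution on the scaling here: if the 0.03 s is the standard error of the mean of independent repeated measurements, the usual statistics give $\sigma_{\bar{t}} = \sigma/\sqrt{N}$, so the required N grows with the *square* of the ratio, not linearly. A quick sketch comparing both scalings, using only the numbers from the posts above:

```python
import math

sigma_old = 0.03   # current uncertainty of the mean (s)
sigma_new = 0.012  # target uncertainty (s)
n_old = 7          # measurements already taken

ratio = sigma_old / sigma_new          # 2.5

# Linear scaling, as proposed above: N grows with the ratio itself.
n_linear = math.ceil(n_old * ratio)    # ceil(17.5) = 18

# 1/sqrt(N) scaling: the standard error of the mean is sigma/sqrt(N),
# so N must grow with the square of the ratio.
n_sqrt = math.ceil(n_old * ratio**2)   # ceil(43.75) = 44

print(n_linear, n_sqrt)  # 18 44
```

So under the standard-error model you would need roughly 44 measurements rather than 18; the linear estimate applies to a different situation, as the next reply describes.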

4. Feb 12, 2009

### LowlyPion

If it is something like timing oscillations, then yes, you can scale linearly, as you suggest.

If your original modeling of the error took 200 ms as your stopwatch reaction time and you want to achieve a 12 ms error, then timing 17 oscillations in one run will afford you that precision.
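The oscillation count follows directly: a fixed reaction-time error, spread over n oscillations timed in a single run, contributes only a 1/n share to each oscillation. A minimal check with the 200 ms and 12 ms figures above:

```python
import math

reaction_error_ms = 200.0  # assumed stopwatch reaction time
target_error_ms = 12.0     # desired per-oscillation timing error

# Timing n oscillations in one run divides the fixed reaction-time
# error linearly among them, so we need 200 / n <= 12.
n_oscillations = math.ceil(reaction_error_ms / target_error_ms)
print(n_oscillations)  # 17
```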

You could also consider a laser sensor, which attacks the problem by making the error smaller to begin with.