sloane729
Homework Statement
This has been bothering me for quite a while. I'm trying to work out how many measurements I need to make to get my uncertainty under a predetermined value. Say I want the fractional uncertainty \frac{\delta T}{T} of some timed event to be at or below some target value; how would I calculate the number of trials needed to reach that uncertainty?
Homework Equations
The standard deviation of a measured quantity T is
s = \left( \frac{1}{N-1}\sum_i (T_i - \bar{T})^2 \right)^{1/2}
then
\delta T = u = \frac{s}{\sqrt{N}}
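If the target uncertainty were already fixed, the second equation could be rearranged for N; note this assumes s is known, which is exactly the sticking point described below. With a fractional target \frac{\delta T}{T} \le f,
u = \frac{s}{\sqrt{N}} \implies N = \left( \frac{s}{u} \right)^2 = \left( \frac{s}{f\,\bar{T}} \right)^2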
The Attempt at a Solution
Since to calculate the uncertainty \delta T I would need first find the average of all values of T_i then find the standard deviation divided by the square root of the number of trials which is equal to\delta T. But the number of trials is what I need to find but I can't know it without first finding the average value which is not possible because I need the number of trials etc. It seems like a round about problem if I'm not mistaken