- #1
legends784
Example:
Say I want to calculate the evaporation rate of water, so I record the mass of some amount of water every 30 seconds for 5 minutes. The uncertainty in the scale is inherently 0.0001 g, so that would be the uncertainty in any individual mass measurement. But how would I calculate the uncertainty for what I am actually looking for, which is a rate? That is: finding the difference between each pair of consecutive measurements, dividing it by 30 seconds, then adding up each of the individual rates in g/s and calculating an average rate in mg/min.
I can't wrap my head around how the error transfers between each of these stages, especially because most of them involve dividing/multiplying by exact numbers like 30 seconds or 1000 mg/g.
All help is appreciated,
Alex
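The procedure described above can be sketched numerically. This is a minimal sketch using standard propagation for uncorrelated errors: the difference of two readings has uncertainty √2·σ_m, and dividing by an exact constant (30 s, or a unit-conversion factor) just scales the uncertainty by that constant. The mass values below are made up for illustration. Note also that averaging the per-interval rates telescopes, so only the first and last readings actually contribute to the mean rate and its uncertainty.

```python
import math

# Hypothetical mass readings (g), one every 30 s over 5 min (assumed data).
masses = [10.0000, 9.9984, 9.9969, 9.9953, 9.9938, 9.9921,
          9.9906, 9.9890, 9.9875, 9.9859, 9.9844]
sigma_m = 0.0001   # scale uncertainty of a single reading (g)
dt = 30.0          # interval between readings (s), an exact number

# Each interval rate: (m[i+1] - m[i]) / dt.
# Uncertainty of a difference of two independent readings:
#   sigma_diff = sqrt(sigma_m**2 + sigma_m**2) = sqrt(2) * sigma_m
# Dividing by the exact dt divides the uncertainty by dt as well:
#   sigma_rate = sqrt(2) * sigma_m / dt
rates = [(b - a) / dt for a, b in zip(masses, masses[1:])]

# Averaging the interval rates telescopes: the intermediate readings
# cancel, so the mean rate equals (m_last - m_first) / total_time,
# and only those two readings contribute to its uncertainty.
n = len(rates)
total_time = n * dt
mean_rate = sum(rates) / n                        # g/s (negative: mass loss)
sigma_mean = math.sqrt(2) * sigma_m / total_time  # g/s

# Unit conversion by exact factors (1000 mg/g, 60 s/min) adds no uncertainty.
to_mg_per_min = 1000.0 * 60.0
print(f"mean rate = {mean_rate * to_mg_per_min:.3f} "
      f"± {sigma_mean * to_mg_per_min:.3f} mg/min")
```

The key point the sketch illustrates: exact multipliers like 30 s or 1000 mg/g carry no uncertainty of their own, so they scale the absolute uncertainty without changing the relative (fractional) uncertainty.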