mathnewb99 said:
We have some people trying to report four places to the right of the decimal with inputs that are rounded to 2 and 3 decimal places. Can you please articulate why it is impossible to guarantee four-decimal precision? I have been unsuccessful in my attempts and am looking for help from someone with a formal math background. Many thanks in advance.
Hi mathnewb99, welcome to MHB!
There are different schools of thought on how many decimals to report.
Mathematically it is correct to report as many decimals as we want, since we then assume that the inputs are exact.
In practice we have to take into account that the inputs are not exact but have a measurement and/or rounding error.
It is then common to report as many decimals as is representative of the precision of the final result.
However, we only do that for the final result. Any intermediate result should be kept with a couple of extra digits to ensure we do not introduce undesired rounding errors into the final result.
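To see why that matters, here's a small Python sketch (the numbers are just made up for illustration, not taken from your data):

```python
# Rounding an intermediate result too early can push the final
# answer across a rounding boundary. Illustrative numbers only.

a, b, c = 12.3, 0.456, -0.01

# Round only at the end: 12.746 -> 12.7
final_only = round(a + b + c, 1)

# Round the intermediate sum a + b first: 12.756 -> 12.8,
# then 12.8 - 0.01 = 12.79 -> 12.8
early = round(round(a + b, 1) + c, 1)

print(final_only)  # 12.7
print(early)       # 12.8
```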
So mathematically we can say that 12.3 + 0.456 = 12.756.
In practice it is conventional to assume that the input 12.3 has an error of up to 0.05 (half a unit in its last decimal place).
In this example the result then also has an error of up to about 0.05, so it is common to report the result as 12.8, which is rounded to the same number of decimals as the 'worst' input.
That is, unless it is an intermediate result, in which case we would report it as 12.756.
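And to make it concrete why four decimals cannot be guaranteed with such inputs, here's a small Python sketch that propagates the assumed error bounds (0.05 and 0.0005, per the convention above) through the example:

```python
# Propagate the assumed rounding-error bounds through 12.3 + 0.456
# to see how many decimals of the sum are actually reliable.

a, a_err = 12.3, 0.05      # rounded to 1 decimal -> error up to 0.05
b, b_err = 0.456, 0.0005   # rounded to 3 decimals -> error up to 0.0005

lo = (a - a_err) + (b - b_err)
hi = (a + a_err) + (b + b_err)

print(f"the true sum lies somewhere in [{lo:.4f}, {hi:.4f}]")
# -> [12.7055, 12.8065]: even the first decimal is uncertain,
# so a fourth decimal would suggest a precision we do not have.
```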