ThatDude
Homework Statement
So basically, for a lab, I have to make a calculation abiding by the rules of sig figs, and it involves the subtraction of two numbers in scientific notation:
1.00 x10^-3 - 4.35 x10^-5
= 1.00 x10^-3 - 0.0435 x10^-3, so I figured the final answer should be rounded to two decimal places, which gives 9.57 x10^-4.
But when I do it the long way, 0.00100 - 0.0000435, I get 0.0009565. Since 0.00100 is only precise to 5 digits after the decimal point, the result should be rounded to 5 decimal places, so the answer should be 9.6 x10^-4.
So, both methods give me different values, where am I going wrong?
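For what it's worth, the arithmetic itself can be checked numerically. This is just a sketch using Python's `decimal` module, with the sig-fig rule applied by hand: the least precise operand (1.00 x10^-3) is known to the 10^-5 place, so the difference is rounded to that place:

```python
from decimal import Decimal

a = Decimal("1.00E-3")   # last significant figure sits in the 1e-5 place
b = Decimal("4.35E-5")   # last significant figure sits in the 1e-7 place

exact = a - b
# Round the result to the least precise place among the operands (1e-5):
rounded = exact.quantize(Decimal("1e-5"))

print(exact)    # 0.0009565
print(rounded)  # 0.00096, i.e. 9.6 x10^-4
```

Rounding the difference to the 10^-5 place reproduces 9.6 x10^-4, which matches the "long way" answer.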