1. The problem statement, all variables and given/known data

The distance between the earth and the moon can be determined from the time it takes for a laser beam to travel from the earth to a reflector on the moon and back. If the round-trip time can be measured to an accuracy of 0.17 ns (1 ns = 10^-9 s), what is the corresponding error in the earth-moon distance?

2. Relevant equations

t = d/v

3. The attempt at a solution

I set v = 3x10^8 m/s and d = 405x10^6 m, which gave me t = d/v = 1.35 s, which should be the one-way time for light to travel to the moon. Then I converted 1.35 s to nanoseconds, which is 1.35x10^9 ns. To get the fractional error I did 0.17 ns / 1.35x10^9 ns ≈ 1.26x10^-10 (a fraction, not a percent). Then I multiplied that by 405x10^6 m and got an error in distance of about 0.051 m, but this was not right. What am I doing wrong?
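Here is a quick numeric check of the steps above, as a sketch rather than a worked answer (variable names are my own; it uses the rounded values v = 3x10^8 m/s and d = 405x10^6 m from the attempt):

```python
# Sanity-check the arithmetic in the attempt above.
v = 3.0e8       # speed of light, m/s (value used in the attempt)
d = 405e6       # earth-moon distance, m (value used in the attempt)
dt = 0.17e-9    # timing accuracy, s (0.17 ns)

t_one_way = d / v              # t = d/v, about 1.35 s
frac_err = dt / t_one_way      # fractional error, about 1.26e-10
d_err = frac_err * d           # equals v * dt, about 0.051 m

# Note the quoted 0.17 ns applies to the ROUND trip (duration 2t),
# so it may be worth checking whether the distance error should
# carry a factor of 1/2: v * dt / 2 is about 0.026 m.
d_err_half = v * dt / 2

print(t_one_way, d_err, d_err_half)
```

Multiplying the fractional error by the distance is algebraically the same as computing v * dt directly, which makes the remaining question just whether the round-trip measurement halves the distance error.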