# Speed of light problem

1. Jul 17, 2008

### kdrobey

1. The problem statement, all variables and given/known data
The distance between the earth and the moon can be determined from the time it takes for a laser beam to travel from the earth to a reflector on the moon and back. If the round-trip time can be measured to an accuracy of 0.17 ns (1 ns = 10^-9 s), what is the corresponding error in the earth-moon distance?

2. Relevant equations
t = d/v

3. The attempt at a solution
I set v = 3x10^8 m/s and d = 405x10^6 m, which gave me 1.35 s, which should be the time for light to travel to the moon. Then I converted 1.35 s to ns, which is 1.35x10^9 ns. To get the percent error, I did 0.17 ns / 1.35x10^9 ns = 1.25x10^-10 %. Then I multiplied that by 405x10^6 m and got an error in distance of 0.00506, but this was not right. What am I doing wrong?

2. Jul 17, 2008

### LowlyPion

They are looking for an error in distance. If you can't measure time to better than 0.17 ns, ask yourself how far a beam of light travels in 0.17 ns. Wouldn't that be the uncertainty in whatever distance you measure?

3. Jul 17, 2008

### Dick

The d you are measuring is the round-trip distance: twice the distance from the earth to the moon.

4. Jul 17, 2008

### LowlyPion

Sort of.

Since the distance is measured on a round trip, an error in the round-trip measurement means the one-way distance to the moon is accurate to within half that distance, doesn't it?
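The calculation being hinted at can be sketched like this (a minimal sketch, assuming c = 3.0x10^8 m/s as in the original attempt and the 0.17 ns timing accuracy from the problem):

```python
c = 3.0e8     # speed of light in m/s (rounded value from the attempt above)
dt = 0.17e-9  # round-trip timing accuracy in seconds

# Distance light travels during the timing uncertainty: this is the
# uncertainty in the measured round-trip path length.
round_trip_error = c * dt  # ~0.051 m

# The beam covers the earth-moon distance twice, so the uncertainty
# in the one-way earth-moon distance is half of that.
one_way_error = round_trip_error / 2  # ~0.0255 m, i.e. about 2.5 cm

print(round_trip_error, one_way_error)
```

Note that the round-trip time of 2.7 s never enters the error calculation; only the timing uncertainty and the speed of light matter.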