RaduAndrei
In the early days, the meter was not defined in terms of the speed of light, so with an independently defined meter and an independently defined second, measurements of the speed of light carried an uncertainty of about 1 m/s.
Then the meter was redefined in terms of the speed of light, which had the effect of fixing the speed of light at the exact value 299,792,458 m/s.
SO. In the early days, with a defined meter and a defined second, c = 299,792,458 ± 1 m/s.
Now, with a defined speed of light and a defined second, there must instead be an uncertainty when realizing the meter. Right?
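The transfer of uncertainty described above can be sketched numerically. Assuming the roughly ±1 m/s measurement uncertainty quoted in the question, the same *relative* uncertainty that used to sit on c now sits on any experimental realization of the meter:

```python
# Sketch: how fixing c moves the uncertainty onto the length realization.
# The ~1 m/s figure is taken from the question above, not an official value.

c = 299_792_458      # m/s, exact by definition since 1983
u_c = 1.0            # m/s, approximate pre-1983 measurement uncertainty

rel_uncertainty = u_c / c          # dimensionless relative uncertainty, ~3.3e-9
u_meter = rel_uncertainty * 1.0    # uncertainty in realizing 1 m, in meters

print(f"relative uncertainty: {rel_uncertainty:.2e}")
print(f"uncertainty on 1 m:   {u_meter * 1e9:.1f} nm")
```

So a ±1 m/s uncertainty on c corresponds to only a few nanometers per meter; in practice, modern length realizations are limited by the apparatus (laser frequency, interferometry), not by the definition itself.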