Question: In redefining the meter in terms of the speed of light, why did the delegates to the 1983 General Conference on Weights and Measures not simplify matters by defining the speed of light to be 3x10^8 m/s exactly? For that matter, why did they not define it to be 1 m/s exactly? Were both of these possibilities open to them? If so, why did they reject them?

Solution (so far): In the words of my text: "The meter is the length of the path traveled by light in a vacuum during a time interval of 1/299,792,458 of a second." (Which is the same as saying c = 299,792,458 m/s.)

First, some history. The meter was originally defined to be one ten-millionth of the distance from the north pole to the equator. For practical reasons it was then redefined as the distance between two fine lines engraved near the ends of a platinum-iridium bar. In the 1960s it was redefined again, as 1,650,763.73 wavelengths of an orange-red light emitted by krypton-86 atoms. By 1983, measurements of the speed of light had become so precise that the reproducibility of the krypton-86 meter itself was the limiting factor. At that point it made sense to take the speed of light as a defined quantity and use it to redefine the meter.

The value 299,792,458 m/s was chosen because it matched the best measured value of c, so the new meter agreed with the old one to within the measurement uncertainty. Defining c to be 3x10^8 m/s exactly was open to the delegates, but it would have shortened the meter by about 0.07%, invalidating every existing length standard and calibrated instrument for no real gain.

Defining c to be 1 m/s was equally open to them but even less attractive: it would make the meter the length of the path that light travels in 1 second, i.e. roughly 3x10^8 of the old meters, a needlessly enormous unit unless the second were also redefined to a much shorter interval. Then we'd have a huge unit of distance and a tiny unit of time; not really ideal. The quick numeric check below makes both effects concrete.

Correct?
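(A minimal Python sketch of the arithmetic above; the variable names are mine, not from the text. It computes how long the "new meter" would be, in old meters, under each hypothetical definition.)

```python
c_defined = 299_792_458  # m/s, the value actually adopted in 1983

# Hypothetical: define c = 3x10^8 m/s exactly.
# The new meter would be the distance light travels in 1/(3e8) s,
# which is c_defined / 3e8 old meters.
new_meter_3e8 = c_defined / 3e8
print(f"meter if c := 3e8 m/s: {new_meter_3e8:.6f} old meters "
      f"({(1 - new_meter_3e8) * 100:.3f}% shorter)")

# Hypothetical: define c = 1 m/s exactly.
# The new meter would be one light-second: the full distance light
# travels in 1 s, measured in old meters.
new_meter_1 = c_defined
print(f"meter if c := 1 m/s: {new_meter_1:,} old meters (~3x10^8)")
```

Running this shows the 3x10^8 definition shortens the meter by about 0.069%, while the 1 m/s definition stretches it to roughly 3x10^8 old meters.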