Meter Defined by Speed of Light: Uncertainty of Measurement

In summary, the meter was originally not defined in terms of the speed of light, so with a defined meter and a defined second, measuring the speed of light carried an uncertainty (about 1 m/s for the best measurements before 1983). With the meter now defined in terms of the speed of light, c has the exact value 299,792,458 m/s. The accuracy of the instruments used to measure the speed of light is unrelated to the units used to report the result; what matters is how well the defined standards for the meter and the second can be realized. We do not need to measure the meter itself, but we still need to know how long it is for practical purposes, such as calibrating a meter stick. Defining the meter in terms of the speed of light and the second gives a very precise answer to that question, but because time measurements are not perfect there is still a small uncertainty in how precisely a meter can be realized.
  • #1
RaduAndrei
In the early days, the meter was not defined in terms of the speed of light and thus, having a defined meter and a defined second, measuring the speed of light gave an uncertainty of 1 m/s.

Then the meter was defined in terms of the speed of light and this had the effect of giving the speed of light the exact value of 299792458 m/s.

SO. In the early days, having a defined meter and a defined second, c = 299,792,458 +/- 1 m/s.

Now, having a defined speed of light and a defined second, measuring the meter must give an uncertainty. Right?
 
  • #2
RaduAndrei said:
Now, having a defined speed of light and a defined second, measuring the meter must give an uncertainty. Right?
Yes. Although your stated uncertainty in the speed of light is not correct.
 
  • #3
RaduAndrei said:
In the early days, the meter was not defined in terms of the speed of light and thus, having a defined meter and a defined second, measuring the speed of light gave an uncertainty of 1 m/s.
No. The uncertainty in the measured speed of light depended on the accuracy of the instruments we used to make the measurement. That accuracy is unrelated to the units we use to report the results so there's no reason the uncertainty should come out to 1 m/sec just because we're using seconds and meters.
Now, having a defined speed of light and a defined second, measuring the meter must give an uncertainty. Right?
We didn't have to measure the meter before and we don't now. We did and still do have to know how to answer the question "How long is a meter? I want to make a meter stick. What length should it be and how do I know it's exactly one meter long (plus or minus the errors in my manufacturing process) when I'm done?"

We can measure time very precisely, and the cesium atoms we use to define the second are very predictable, so defining the meter in terms of the second and a constant speed of light gives us a very precise answer to that question. The measurements of time aren't perfect so there's still some ambiguity about how long a meter is, but it's much less than with the older definitions.
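
To put rough numbers on that, here is a minimal Python sketch; the clock accuracy used below (one part in ten billion) is only an illustrative figure, not the spec of any particular clock:

# How the 1983 definition turns clock accuracy into meter-realization accuracy
c = 299_792_458                     # m/s, exact by definition
t_one_meter = 1 / c                 # light travel time for one meter, ~3.34 nanoseconds
clock_fractional_error = 1e-10      # illustrative clock accuracy: one part in ten billion
length_error = 1.0 * clock_fractional_error   # the same fraction of one meter, ~0.1 nanometer
print(t_one_meter, length_error)    # ~3.3356e-09 s and ~1e-10 m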
 
  • #4
Dale said:
Yes. Although your stated uncertainty in the speed of light is not correct.

The stated uncertainty of 1 m/s comes from a physics textbook by Giancoli:

"In 1983 the meter was again redefined, this time in terms of the
speed of light (whose best measured value in terms of the older definition of the
meter was with an uncertainty of 1 m/s). The new definition
reads: “The meter is the length of path traveled by light in vacuum during a
time interval of of a second.
The new definition of the meter has the effect of giving the speed of light the exact value of
299,792,458 ms."
 
  • #5
Nugatory said:
No. The uncertainty in the measured speed of light depended on the accuracy of the instruments we used to make the measurement. That accuracy is unrelated to the units we use to report the results so there's no reason the uncertainty should come out to 1 m/sec just because we're using seconds and meters.

Yes, the uncertainty depends on the accuracy of the instruments. But those instruments rely on some defined standards for what a meter is (two engravings on a rod in the early days) and what a second is. So the uncertainty does depend on the standards, just indirectly. It must depend, come on. Because, after all, you tell the instruments: this is a meter and this is a second. But that meter and that second might not be exactly the same as the defined ones.

So I was not saying that the accuracy is related to the units, but rather to the standard units.

Nugatory said:
We didn't have to measure the meter before and we don't now. We did and still do have to know how to answer the question "How long is a meter? I want to make a meter stick. What length should it be and how do I know it's exactly one meter long (plus or minus the errors in my manufacturing process) when I'm done?"

We can measure time very precisely, and the cesium atoms we use to define the second are very predictable, so defining the meter in terms of the second and a constant speed of light gives us a very precise answer to that question. The measurements of time aren't perfect so there's still some ambiguity about how long a meter is, but it's much less than with the older definitions.

We don't have to measure the meter? Then how do you know how long a meter is? What if it is the distance from here to the moon? Of course we need to know how long a meter is.
But let's suppose we don't need to measure it. What if I still want to measure it? Will I have an uncertainty?
It seems obvious that I will. c = d/t, so three quantities. And I want to measure c. So that leaves me with two quantities (d and t) that have uncertainties associated with them when I use them to measure the speed of light. If however I say that the speed of light is THIS and define the meter based on it and the second, then I must have another two quantities with associated uncertainties, which are the second and the meter. You can't escape this.
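
For the pre-1983 situation, the standard error-propagation rule for c = d/t makes the point concrete: the relative uncertainties of d and t combine (in quadrature, assuming independent errors). A small Python sketch with made-up numbers for the baseline and the timing:

from math import sqrt

# Propagating the uncertainties of a measured distance d and time t into c = d/t
d, sigma_d = 1000.0, 0.001            # hypothetical 1 km baseline known to 1 mm
t, sigma_t = 3.3356e-6, 1e-11         # hypothetical one-way light travel time and its error, in s
c_measured = d / t
rel_c = sqrt((sigma_d / d) ** 2 + (sigma_t / t) ** 2)   # relative errors add in quadrature
sigma_c = c_measured * rel_c
print(c_measured, "+/-", sigma_c)     # ~3.0e8 m/s with a propagated uncertainty of ~950 m/s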
 
  • #6
RaduAndrei said:
We don't have to measure the meter? Then how do you know how long a meter is?
We don't need to, because we already defined the meter to be a specific distance, namely how far light travels in a given time. We do sometimes have to perform measurements on an arbitrary object that claims to be one meter long to see if it really is, but that's measuring the object, and it's irrelevant to that measurement what units I use.

What if it is the distance from here to the moon?
That doesn't require measuring the meter; it requires measuring the time it takes for a radar signal to travel to the moon and back. I divide that time by two, then multiply by the (arbitrary! It's a convention that we chose because it was convenient) constant 299792458. That gives us the distance to the moon in meters because of the way that we defined (arbitrarily!) the meter. The uncertainty in that measurement is determined solely by the uncertainty in my timekeeping device, and it is much less than the uncertainty that obtained when we used the older (but equally arbitrary) definition of the meter as the length between two marks on a metal bar in a vault somewhere.
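
As a concrete version of that radar calculation, here is a minimal Python sketch; the round-trip time below is just a typical figure for the Moon (about 2.56 s), not a real measurement:

# Distance to the Moon from a radar round-trip time, using the defined speed of light
c = 299_792_458            # m/s, exact by definition
round_trip = 2.56          # s, illustrative round-trip time for a lunar radar echo
distance = (round_trip / 2) * c   # one-way time multiplied by c gives meters
print(distance)            # ~3.84e8 m, i.e. roughly 384,000 km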
 
  • #7
Nugatory said:
We don't need to, because we already defined the meter to be a specific distance, namely how far light travels in a given time. We do sometimes have to perform measurements on an arbitrary object that claims to be one meter long to see if it really is, but that's measuring the object, and it's irrelevant to that measurement what units I use.

That doesn't require measuring the meter; it requires measuring the time it takes for a radar signal to travel to the moon and back. I divide that time by two, then multiply by the (arbitrary! It's a convention that we chose because it was convenient) constant 299792458. That gives us the distance to the moon in meters because of the way that we defined (arbitrarily!) the meter. The uncertainty in that measurement is determined solely by the uncertainty in my timekeeping device, and it is much less than the uncertainty that obtained when we used the older (but equally arbitrary) definition of the meter as the length between two marks on a metal bar in a vault somewhere.

But let's suppose I want to measure the meter even though we don't need to. For that I will just need to measure the distance some light covers in 1/299792458 seconds. So I will surely have an uncertainty, given by how I measure the distance and by what I tell the instrument a second actually is.
 
  • #8
In the end what I am trying to say is this.

In the early days, you had a defined meter and a defined second. The meter was THIS long and the second was THIS long. And we wanted to measure the speed of light. But for that we needed to tell the instrument how long a meter and how long a second are. So bam. Uncertainties right there. You could not take the rod with the engravings and show it to the instruments. So measuring the speed of light gave an uncertainty.

Then the meter was redefined in terms of the speed of light. So the speed of light was THIS and the second was THIS. Now we do not need to measure the speed of light because it is defined as THIS. Now we want to measure the meter because its definition is in terms of the speed of light. So the roles have just changed.
 
  • #9
RaduAndrei said:
But let's suppose I want to measure the meter even though we don't need to. For that I will just need to measure the distance some light covers in 1/299792458 seconds. So I will surely have an uncertainty, given by how I measure the distance and by what I tell the instrument a second actually is.
What you are describing isn't "measuring the meter", it is "calibrating your meter stick".

Here's how you'd do it in principle (in principle! In practice the technique that I describe would be lucky to produce results that are repeatable to one part in one hundred, as opposed to the one part in ten billion that we get with good instruments these days): You would set up a light source at one end of your meter stick, set up a photodetector at the other end, and set up a cesium clock somewhere. Push a button and the light flashes and the cesium clock starts counting; when the photodetector triggers, it stops the clock. Does the clock read exactly 1/299792458 of a second? If so, you have a really good meter stick. If not, you have a not-so-good meter stick.

But note that we did not measure the meter. We don't need to measure that because we already know exactly where the two ends of the one-meter interval are: where the light source is, and where the clock reads 1/299792458 of a second when the experiment is done (we can move the detector around until we find that exact point if we're curious, but it's not necessary). We're measuring the distance between the light source and the detector; this happens to be the length of the stick when the source and the detector are at the ends of the stick, and it has nothing in particular to do with the meter.

Now, if you want to make a meter stick, you'd proceed as above, moving the detector around until you found the magic point where the clock reads exactly 1/299792458 of a second every time. Then you'd cut the stick at that point and you'll have a meter stick. But you still haven't measured the meter; you've just adjusted the length of your stick to conform to the definition of the meter.
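
A minimal Python sketch of that calibration arithmetic (the measured travel time and the timing uncertainty below are invented for illustration):

# Checking a candidate meter stick from a measured one-way light travel time
c = 299_792_458                  # m/s, exact by definition
t_expected = 1 / c               # ~3.3356e-9 s for a perfect one-meter stick
t_measured = 3.3360e-9           # s, hypothetical time-of-flight over the stick
t_sigma = 3e-13                  # s, hypothetical timing uncertainty
stick_length = t_measured * c    # inferred stick length in meters
length_sigma = t_sigma * c       # timing uncertainty mapped into meters
print(stick_length, "+/-", length_sigma)   # ~1.0001 m +/- ~9e-5 m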
 
  • #10
Ok. Maybe I used the wrong words. Not measuring the meter, but calibrating some stick. When calibrating this stick, we will have some uncertainty, right?
The true value of the meter will be the defined one, while the stick will have some uncertainty.
 
  • #11
RaduAndrei said:
The stated uncertainty of 1 m/s comes from a physics textbook by Giancoli:

"In 1983 the meter was again redefined, this time in terms of the
speed of light (whose best measured value in terms of the older definition of the
meter was with an uncertainty of 1 m/s). The new definition
reads: “The meter is the length of path traveled by light in vacuum during a
time interval of of a second.
The new definition of the meter has the effect of giving the speed of light the exact value of
299,792,458 ms."
Hmm, I have to go back and look, but I thought that there was a team that had gotten it down to 0.2 m/s in the late '70s.
 
  • #12
The speed of light is exactly 3x10^8 m/s, acceleration due to gravity is 10 m/s^2 which is exactly 30 ft/s^2, pi = 3, and cows are spherical. Didn't you guys learn anything in Physics 101??
 
  • #13
RaduAndrei said:
The true value of the meter will be the defined one, while the stick will have some uncertainty.
Yes, that's right. And the true value of the speed of light is also the defined one: 299792458 meters per second, exactly, with no uncertainty whatsoever. These definitions are such that the two statements are equivalent.
 

FAQ: Meter Defined by Speed of Light: Uncertainty of Measurement

What is the definition of a meter based on the speed of light?

The definition of a meter based on the speed of light is that it is the distance that light travels in a vacuum in 1/299,792,458 of a second. This definition was adopted by the General Conference on Weights and Measures in 1983.

Why is there uncertainty in measuring the speed of light?

Before 1983 there was uncertainty because the measurement was limited by the accuracy of the instruments and by how well the standards for the meter and the second could be realized; the best measurements reached an uncertainty of about 1 m/s. Since 1983 the speed of light has been fixed by definition, so its value carries no uncertainty at all; an experiment that appears to "measure" it is really calibrating the length-measuring apparatus against the definition of the meter.

How is the uncertainty of measurement in the speed of light accounted for in the definition of a meter?

The 1983 definition fixes the speed of light at exactly 299,792,458 m/s, so no uncertainty is attached to the constant itself. The measurement uncertainty shows up instead when the meter is realized in practice, for example when a meter stick is calibrated by timing how long light takes to travel along it; that realization is limited by how precisely time can be measured.

What is the current accepted value for the speed of light and its uncertainty?

The accepted value for the speed of light is exactly 299,792,458 meters per second, with no uncertainty, because since 1983 this value has been fixed by the definition of the meter. Any uncertainty lies in the physical realization of the meter, not in the speed of light itself.

How does the uncertainty in the speed of light affect other measurements that use the meter as a unit?

Since the speed of light is exact by definition, it introduces no uncertainty into measurements that use the meter. The uncertainty in such measurements comes from the instruments, in particular from the clocks used to realize the meter, and with modern cesium clocks that contribution is extremely small. Expressing a result in smaller units such as nanometers or picometers changes the unit, not the precision of the measurement.
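
As a tiny Python sketch of that last point (the measured length and its uncertainty below are invented numbers): rescaling to a smaller unit rescales the value and the uncertainty together, so the relative precision is unchanged.

# The unit does not change the relative precision of a measurement
length_m, sigma_m = 1.0000000002, 1e-10               # hypothetical length and uncertainty in meters
length_nm, sigma_nm = length_m * 1e9, sigma_m * 1e9   # the same quantities in nanometers
print(sigma_m / length_m)        # relative uncertainty ~1e-10
print(sigma_nm / length_nm)      # identical relative uncertainty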
