# The standard of the meter

#### RaduAndrei

In the early days, the meter was not defined in terms of the speed of light, and thus, having a defined meter and a defined second, measuring the speed of light gave an uncertainty of 1 m/s.

Then the meter was defined in terms of the speed of light and this had the effect of giving the speed of light the exact value of 299792458 m/s.

So: in the early days, having a defined meter and a defined second, c = 299,792,458 ± 1 m/s.

Now, having a defined speed of light and a defined second must give an uncertainty when measuring the meter. Right?


#### Dale

Mentor
> Now, having a defined speed of light and a defined second must give an uncertainty when measuring the meter. Right?

Yes. Although your stated uncertainty in the speed of light is not correct.

#### Nugatory

Mentor
> In the early days, the meter was not defined in terms of the speed of light, and thus, having a defined meter and a defined second, measuring the speed of light gave an uncertainty of 1 m/s.

No. The uncertainty in the measured speed of light depended on the accuracy of the instruments we used to make the measurement. That accuracy is unrelated to the units we use to report the results so there's no reason the uncertainty should come out to 1 m/sec just because we're using seconds and meters.
> Now, having a defined speed of light and a defined second must give an uncertainty when measuring the meter. Right?

We didn't have to measure the meter before and we don't now. We did and still do have to know how to answer the question "How long is a meter? I want to make a meter stick. What length should it be and how do I know it's exactly one meter long (plus or minus the errors in my manufacturing process) when I'm done?"

We can measure time very precisely, and the cesium atoms we use to define the second are very predictable, so defining the meter in terms of the second and a constant speed of light gives us a very precise answer to that question. The measurements of time aren't perfect, so there's still some ambiguity about how long a meter is, but it's much less than with the older definitions.


#### RaduAndrei

> Yes. Although your stated uncertainty in the speed of light is not correct.

The stated uncertainty of 1 m/s comes from a physics textbook by Giancoli:

"In 1983 the meter was again redefined, this time in terms of the
speed of light (whose best measured value in terms of the older definition of the
meter was with an uncertainty of 1 m/s). The new definition
reads: “The meter is the length of path traveled by light in vacuum during a
time interval of of a second.
The new definition of the meter has the effect of giving the speed of light the exact value of
299,792,458 ms."

#### RaduAndrei

> No. The uncertainty in the measured speed of light depended on the accuracy of the instruments we used to make the measurement. That accuracy is unrelated to the units we use to report the results so there's no reason the uncertainty should come out to 1 m/sec just because we're using seconds and meters.

Yes, the uncertainty depends on the accuracy of the instruments. But these instruments use some defined standards for what a meter is (two engravings on a rod in the early days) and what a second is. So the uncertainty does depend on the standards, but indirectly. It must depend, come on. Because, after all, you say to the instruments: this is a meter and this is a second. But that meter and second might not be exactly the same as the defined ones.

So I was not saying that the accuracy is not related to the units, but to the standard units.

> We didn't have to measure the meter before and we don't now. We did and still do have to know how to answer the question "How long is a meter? I want to make a meter stick. What length should it be and how do I know it's exactly one meter long (plus or minus the errors in my manufacturing process) when I'm done?"
>
> We can measure time very precisely, and the cesium atoms we use to define the second are very predictable, so defining the meter in terms of the second and a constant speed of light gives us a very precise answer to that question. The measurements of time aren't perfect, so there's still some ambiguity about how long a meter is, but it's much less than with the older definitions.

We don't have to measure the meter? Then how do you know how long a meter is? What if it is the distance from here to the moon? Of course we need to know how long a meter is.
But let's suppose we don't need to measure it. What if I still want to measure it? Will I have an uncertainty?
It seems obvious that I would.

c = d/t. So three quantities. And I want to measure c. That leaves me with two quantities (d and t) that have uncertainties associated with them when I use them to measure the speed of light. If, however, I say that the speed of light is THIS and define the meter based on it and the second, then I must have another two quantities with associated uncertainties, which are the second and the meter. You can't escape this.
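A minimal numerical sketch of this argument (all the instrument uncertainties below are made-up illustrative values, not real metrology data): in the old scheme the uncertainties in d and t propagate into c, while in the new scheme c is exact and a length obtained from a time measurement inherits its uncertainty from the clock alone.

```python
# Standard quadrature propagation for c = d/t, with hypothetical numbers.
C_DEFINED = 299_792_458  # m/s, exact by definition since 1983

# Old scheme: measure a distance and a time, each with (made-up) uncertainty.
d, sigma_d = 1000.0, 1e-4            # m; hypothetical ruler uncertainty
t = d / C_DEFINED                    # s; light flight time over that distance
sigma_t = 1e-13                      # s; hypothetical clock uncertainty

c_measured = d / t
# For c = d/t the relative uncertainties add in quadrature:
rel_c = ((sigma_d / d) ** 2 + (sigma_t / t) ** 2) ** 0.5
sigma_c = c_measured * rel_c

# New scheme: c is exact, so a length obtained as d = c * t inherits
# its uncertainty entirely from the time measurement.
sigma_d_new = C_DEFINED * sigma_t

print(f"old scheme: c = {c_measured:.0f} ± {sigma_c:.0f} m/s")
print(f"new scheme: length uncertainty = {sigma_d_new:.2e} m")
```

Either way, two imperfectly known quantities remain; the definitions only choose which quantity carries zero uncertainty by fiat.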


#### Nugatory

Mentor
> We don't have to measure the meter? Then how do you know how long a meter is?

We don't need to, because we already defined the meter to be a specific distance, namely how far light travels in a given time. We do sometimes have to perform measurements on an arbitrary object that claims to be one meter long to see if it really is, but that's measuring the object, and it's irrelevant to that measurement what units I use.

> What if it is the distance from here to the moon?

That doesn't require measuring the meter; it requires measuring the time it takes for a radar signal to travel to the moon and back. I divide that time by two, then multiply by the (arbitrary! It's a convention that we chose because it was convenient) constant 299792458. That gives us the distance to the moon in meters because of the way that we defined (arbitrarily!) the meter. The uncertainty in that measurement is determined solely by the uncertainty in my timekeeping device, and it is much less than the uncertainty that obtained when we used the older (but equally arbitrary) definition of the meter as the length between two marks on a metal bar in a vault somewhere.
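The radar arithmetic can be sketched in a few lines; the echo time and clock resolution here are made-up but plausible values (the distance is the one-way time multiplied by 299,792,458, straight from the definition of the meter):

```python
C = 299_792_458  # m/s, exact by definition

round_trip = 2.562              # s; hypothetical measured radar echo time
one_way = round_trip / 2
distance = one_way * C          # meters, by the definition of the meter

# The only uncertainty is in the timekeeping:
sigma_t = 1e-9                  # s; hypothetical clock resolution
sigma_d = (sigma_t / 2) * C     # m; about 15 cm for a 1 ns clock

print(f"distance ≈ {distance / 1000:.0f} km ± {sigma_d:.2f} m")
```

Note that no meter stick appears anywhere: the clock and the defined constant are the whole measurement.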

#### RaduAndrei

> We don't need to, because we already defined the meter to be a specific distance, namely how far light travels in a given time. We do sometimes have to perform measurements on an arbitrary object that claims to be one meter long to see if it really is, but that's measuring the object, and it's irrelevant to that measurement what units I use.
>
> That doesn't require measuring the meter; it requires measuring the time it takes for a radar signal to travel to the moon and back. I divide that time by two, then multiply by the (arbitrary! It's a convention that we chose because it was convenient) constant 299792458. That gives us the distance to the moon in meters because of the way that we defined (arbitrarily!) the meter. The uncertainty in that measurement is determined solely by the uncertainty in my timekeeping device, and it is much less than the uncertainty that obtained when we used the older (but equally arbitrary) definition of the meter as the length between two marks on a metal bar in a vault somewhere.

But let's suppose I want to measure the meter even though we don't need to. For that I will just need to measure the distance some light covers in 1/299792458 seconds. So I will surely have an uncertainty given by how I measure the distance and what value I tell the instrument a second actually is.

#### RaduAndrei

In the end what I am trying to say is this.

In the early days, you had a defined meter and a defined second. The meter was THIS long and the second was THIS long. And we wanted to measure the speed of light. But for that we needed to tell the instrument how long a meter and how long a second are. So bam. Uncertainties right there. You could not take the rod with the engravings and show it to the instruments. So measuring the speed of light gave an uncertainty.

Then the meter was redefined in terms of the speed of light. So the speed of light was THIS and the second was THIS. Now we do not need to measure the speed of light because it is defined as THIS. Now we want to measure the meter because its definition is in terms of the speed of light. So the roles have just changed.

#### Nugatory

Mentor
> But let's suppose I want to measure the meter even though we don't need to. For that I will just need to measure the distance some light covers in 1/299792458 seconds. So I will surely have an uncertainty given by how I measure the distance and what value I tell the instrument a second actually is.

What you are describing isn't "measuring the meter", it is "calibrating your meter stick".

Here's how you'd do it in principle (in practice, the technique that I describe would be lucky to produce results that are repeatable to one part in one hundred, as opposed to the one part in ten billion that we get with good instruments these days): You would set up a light source at one end of your meter stick, set up a photodetector at the other end, and set up a cesium clock somewhere. Push a button and the light flashes and the cesium clock starts counting; when the photodetector triggers, it stops the clock. Does the clock read exactly 1/299,792,458 of a second? If so, you have a really good meter stick. If not, you have a not-so-good meter stick.

But note that we did not measure the meter. We don't need to measure that because we already know exactly where the two ends of the one-meter interval are: where the light source is, and where the clock reads 1/299,792,458 of a second when the experiment is done. (We can move the clock around until we find that exact point if we're curious, but it's not necessary.) We're measuring the distance between the light source and the detector; this happens to be the length of the stick when the source and the detector are at the ends of the stick, and has nothing in particular to do with the meter.

Now, if you want to make a meter stick, you'd proceed as above, moving the detector around until you found the magic point where the clock reads 1/299,792,458 of a second every time. Then you'd cut the stick at that point and you'd have a meter stick. But you still haven't measured the meter; you've just adjusted the length of your stick to conform to the definition of the meter.
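A toy version of this calibration check (illustrative only: the helper names and the tolerance are made up, and real bench precision is far coarser than this): convert the measured flight time into an implied length and compare it to the defined meter.

```python
C = 299_792_458  # m/s, exact by definition

def implied_length(flight_time_s):
    """Length implied by a measured one-way light flight time."""
    return C * flight_time_s

def is_good_meter_stick(flight_time_s, tolerance_m=1e-6):
    """Does the stick match the defined meter to within tolerance?"""
    return abs(implied_length(flight_time_s) - 1.0) <= tolerance_m

perfect = 1 / C            # the defining flight time: exactly one meter
too_long = 1.00001 / C     # a stick that is 10 micrometers too long

print(is_good_meter_stick(perfect))   # prints True
print(is_good_meter_stick(too_long))  # prints False
```

The comparison target, 1.0 meter, carries no uncertainty; all of the uncertainty lives in the stick and the clock.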

#### RaduAndrei

Ok. Maybe I used the wrong words. Not measuring the meter, but calibrating some stick. When calibrating this stick, we will have some uncertainty, right?
The true value of the meter will be the defined one, while the stick will have some uncertainty.

#### Dale

Mentor
> The stated uncertainty of 1 m/s comes from a physics textbook by Giancoli:
>
> "In 1983 the meter was again redefined, this time in terms of the speed of light (whose best measured value in terms of the older definition of the meter was 299,792,458 m/s, with an uncertainty of 1 m/s). The new definition reads: 'The meter is the length of path traveled by light in vacuum during a time interval of 1/299,792,458 of a second.' The new definition of the meter has the effect of giving the speed of light the exact value of 299,792,458 m/s."

Hmm, I have to go back and look, but I thought that there was a team that had gotten it down to 0.2 m/s in the late '70s.

#### Cutter Ketch

The speed of light is exactly 3×10^8 m/s, acceleration due to gravity is 10 m/s^2, which is exactly 30 ft/s^2, pi = 3, and cows are spherical. Didn't you guys learn anything in Physics 101?!?!

#### Nugatory

Mentor
> The true value of the meter will be the defined one, while the stick will have some uncertainty.

Yes, that's right. And the true value of the speed of light is also the defined one: 299792458 meters per second, exactly, with no uncertainty whatsoever. These definitions are such that the two statements are equivalent.
