Measuring the Speed of Light: How Did We Do It?

In summary, the meter is defined as the distance light travels in 1/299792458 of a second. Before 1983 the meter had an independent definition, and the speed of light was measured against it with various techniques and equipment; in 1975 the recommended value was 299792458 m/s, with an uncertainty of about 4 parts per billion. As measurements of the speed of light became more precise than the realization of the meter itself, the meter was redefined in 1983 in terms of that fixed value. This is not circular reasoning, but a redefinition that increases the precision of the unit. The meter is now defined by giving the speed of light in a vacuum an exact value, and this is just one part of the overall revision of the International System of Units, in which base units are defined by fixing the numerical values of fundamental constants.
  • #1
h1a8
TL;DR Summary
To avoid a circular definition, what techniques and equipment are used to measure the meter?
I understand that the meter is defined from the speed of light (distance light travels in 1/299792458 of a second). But how did man measure this exact distance to this level of precision? With any apparatus, isn't there an unknown amount of bottleneck somewhere?
 
  • #2
Last edited:
  • Like
  • Informative
Likes Abhishek11235, davenn, vanhees71 and 2 others
  • #3
h1a8 said:
Summary:: To avoid a circular definition, what techniques and equipment are used to measure the meter?

I understand that the meter is defined from the speed of light (distance light travels in 1/299792458 of a second). But how did man measure this exact distance to this level of precision? With any apparatus, isn't there an unknown amount of bottleneck somewhere?
Definitions don't have measurement error.
 
  • Like
Likes vanhees71 and Vanadium 50
  • #4
russ_watters said:
Definitions don't have measurement error.
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?
 
  • #5
BvU said:
Consult https://en.wikipedia.org/wiki/International_System_of_Units for the definitions

For physical constants measurements and fitting procedure there is https://www.nist.gov/pml/fundamental-physical-constants

(full fitting article here)

Google measuring the speed of light -- get this as starting point

And, to come back to the thread title: https://en.wikipedia.org/wiki/History_of_the_metre
Basically the speed of light is defined from measurements. In 1975, the speed of light was measured to be 299792458 m/s with a precision of 4 parts per billion. The meter was already defined before that.
Then you're telling me that the meter was redefined in 1983 from this 299792458 number? That makes no sense whatsoever. That's complete circular reasoning.
 
  • #6
h1a8 said:
Basically the speed of light is defined from measurements. In 1975, the speed of light was measured to be 299792458 m/s with a precision of 4 parts per billion.

You can't have it both defined and measured. You choose one or the other. In the past the speed of light was measured using the definition of the meter. Since 1983, you measure the length of a meter and define the speed of light.

The meter was already defined before that. Then you're telling me that the meter was redefined in 1983 from this 299792458 number? That makes no sense whatsoever. That's complete circular reasoning.

There's nothing circular about it. It's an issue of precision. When measurements of the speed of light became more precise than our ability to measure a length, the definition was changed. What would you have done under these circumstances? Continue to define the meter as the distance between two scratches on a metal bar when it's possible to define it in a more precise manner?
 
  • Like
Likes Pi-is-3, Klystron, Paul Colby and 3 others
  • #7
h1a8 said:
That's complete circular reasoning.
It is not circular reasoning, it is a redefinition.

Suppose an experiment pre-1983 measured the speed of light. Then, post-1983 that same experiment would measure the length of the experimental light path.

No circularity involved, and the advantage of the redefinition is that the accuracy of the resulting new meter was substantially higher than what could be achieved using the previous standard.
 
  • Like
Likes FactChecker and russ_watters
  • #8
h1a8 said:
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?
For a distance like that, I'd probably use the odometer on my car, but I'm not sure I see the relevance of the question. We don't actually need to visually observe measurements. In fact, the human eye is a pretty poor measuring device.
 
  • Like
Likes Dale
  • #9
h1a8 said:
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?
You get a very accurate clock, a laser, and a mirror. Then you start firing laser pulses at the mirror and timing how long they take to return. Move the mirror until the pulse round trip time is 2/1000 s, and then you know that your mirror is one arctec away.

This is the same process modern SI uses. They just picked a rather clunkier number than 1/1000 s, so that the metre defined this way is (to the best accuracy available) the same as the length of the old metal bar. Note that "one metre" therefore refers to two logically distinct concepts (an "old metre" and a "new metre") that happen to be the same length to available precision.
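The idealised procedure above amounts to one line of arithmetic. A minimal Python sketch (the function name and the 2/1000 s example are just illustrations of the hypothetical "arctec"):

```python
# Idealised round-trip ranging, as described above: the one-way
# distance is the round-trip time multiplied by c, halved.
C = 299_792_458  # m/s, exact by definition in the modern SI

def distance_from_round_trip(t_round_trip):
    """One-way distance to the mirror, given the pulse round-trip time in seconds."""
    return C * t_round_trip / 2

# A round trip of 2/1000 s puts the mirror one "arctec" (c/1000 metres) away.
one_arctec = distance_from_round_trip(2 / 1000)  # ~299792.458 m
```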
 
  • Like
Likes russ_watters and DrClaude
  • #10
h1a8 said:
That's complete circular reasoning.
It's not reasoning, but a definition.
 
  • Like
Likes FactChecker
  • #11
Since last year the SI has been completely revised (keeping, of course, as much continuity with the older definitions as possible). Now all base units are implicitly defined by giving exact values to fundamental constants of nature, with the second the only one whose defining constant refers to a specific material, and the definitions depend on each other.

Everything starts with the definition of the second, which is the only one that uses a specific material to define it, i.e., a hyperfine transition of Cs:

The second, symbol s, is the SI unit of time. It is defined by taking the fixed numerical value of the caesium frequency ##\Delta \nu_{\text{Cs}}##, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom, to be 9192631770 when expressed in the unit Hz, which is equal to ##\text{s}^{−1}##.

All the other units are defined by assigning exact values to the fundamental constants of nature. Given the definition of the second, lengths are then defined by giving the speed of light in a vacuum an exact value:

The metre, symbol m, is the SI unit of length. It is defined by taking the fixed numerical value of the speed of light in vacuum c to be 299792458 when expressed in the unit ##\text{m} \text{s}^{-1}##, where the second is defined in terms of the caesium frequency ##\Delta \nu_{\text{Cs}}##.

Then it goes on with the kg:

The kilogram, symbol kg, is the SI unit of mass. It is defined by taking the fixed numerical value of the Planck constant h to be ##6.62607015 \cdot 10^{-34}## when expressed in the unit J s, which is equal to ##\text{kg} \text{m}^2 \text{s}^{-1}##, where the metre and the second are defined in terms of ##c## and ##\Delta \nu_{\text{Cs}}##.

And so it goes for all the rest of the units. There's nothing circular here; all of this builds a consistent web of definitions of the base units of the SI, and with them all the units defined within it.

See
https://en.wikipedia.org/wiki/2019_redefinition_of_the_SI_base_units
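The non-circularity of this chain can be made concrete by writing the definitions down in order: each unit uses only constants and previously defined units. A small Python sketch, using the exact SI defining constants quoted above:

```python
# The SI defining constants quoted above, each exact by definition.
DELTA_NU_CS = 9_192_631_770   # Hz, Cs-133 hyperfine frequency: defines the second
C = 299_792_458               # m/s, speed of light: defines the metre (given the second)
H = 6.626_070_15e-34          # J*s, Planck constant: defines the kg (given metre and second)

# Each definition only reaches *backwards* in the chain:
one_second = 9_192_631_770 / DELTA_NU_CS    # duration of that many Cs periods
one_metre = C * (one_second / 299_792_458)  # distance light travels in 1/C seconds
# (the kilogram similarly follows from H once the metre and second are fixed)
```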
 
  • Like
  • Informative
Likes Pi-is-3, Klystron, hutchphd and 3 others
  • #12
Mister T said:
You can't have it both defined and measured. You choose one or the other. In the past the speed of light was measured using the definition of the meter. Since 1983, you measure the length of a meter and define the speed of light.
There's nothing circular about it. It's an issue of precision. When measurements of the speed of light became more precise than our ability to measure a length, the definition was changed. What would you have done under these circumstances? Continue to define the meter as the distance between two scratches on a metal bar when it's possible to define it in a more precise manner?
Dale said:
It is not circular reasoning, it is a redefinition.

Suppose an experiment pre-1983 measured the speed of light. Then, post-1983 that same experiment would measure the length of the experimental light path.

No circularity involved, and the advantage of the redefinition is that the accuracy of the resulting new meter was substantially higher than what could be achieved using the previous standard.

As an illustration: if I remember correctly, the coulomb and ampere were alternately defined by an electrochemical-gravimetric and an electromagnetic (force) definition, according to what was the most precisely measurable at different times.

Whenever there is a redefinition it is always made so that the new unit is equal to the old one (within the range of uncertainty of the latter), so for many purposes the change is not noticed; otherwise it would make access to all previous knowledge very complicated. E.g. a cubic cm of water still weighs close to 1 g at 4 °C despite redefinitions of length, mass and temperature. But that is why, although the French revolutionaries designed their system to be simple, we now have peculiar numbers, neither self-evident nor easily memorable, in the definitions.
 
Last edited:
  • Like
Likes Dale and vanhees71
  • #13
AFAIK the most severe changes of the redefinition of 2019 are in the electromagnetic units (at the order of ##10^{-9}## relative uncertainty).
 
  • Like
Likes Dale
  • #14
h1a8 said:
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?

That's sort of what actually happened: the meter was originally defined as "one ten-millionth of the distance from the equator to the North Pole along a great circle", and the second was originally defined as "1/86,400 of the mean solar day". Under those (pre-1983) definitions, c_0 was a measurable quantity with measurement uncertainty, most accurately obtained as 299792.4562±0.0011 km/s.

In 1983, the meter was re-defined in terms of c_0 (the second having already been re-defined in 1967 via the caesium transition): sort of an inversion of the prior definition. c_0 was defined to be *exactly* 299792458 m/s, with NO measurement uncertainty, and that specific number was chosen so that pre-1983 physical length standards (rulers) remained in very close agreement with post-1983 standards. A related consequence of tying time to atomic clocks rather than the Earth's rotation is that we now have 'leap seconds' to keep our calendars properly synchronized with the sun.

So, to answer your question, the length of 1 meter can be most accurately measured with a clock. I believe the most accurate clocks have a residual uncertainty of about 1 part in 10^18: https://www.sciencedaily.com/releases/2016/02/160210134952.htm

Similar machinations occurred for the values of the vacuum permittivity and permeability.
 
  • Like
Likes BvU and vanhees71
  • #15
h1a8 said:
Basically the speed of light is defined from measurements. In 1975, the speed of light was measured to be 299792458 m/s with a precision of 4 parts per billion. The meter was already defined before that.
Then you're telling me that the meter was redefined in 1983 from this 299792458 number? That makes no sense whatsoever. That's complete circular reasoning.

What makes no sense is to define something based on a slab of metal.

We now want to define every standard unit based on a physical constant. Even the kg has now been redefined so that it ties into a physical constant rather than some piece of material (which is slowly losing mass) sitting in some vault. This makes the scale more stable and universally defined, rather than based on some object. You can even communicate with another being in a different part of the universe by telling them that we measure time based on the frequency of a Cs transition, and they'll know how long our unit of time is. That makes our measurement scales universally derivable and not specific to our small, insignificant corner of the universe.

Zz.
 
  • Like
Likes Klystron, Ibix, vanhees71 and 1 other person
  • #16
Ibix said:
You get a very accurate clock, a laser, and a mirror. Then you start firing laser pulses at the mirror and timing how long they take to return. Move the mirror until the pulse round trip time is 2/1000 s, and then you know that your mirror is one arctec away.

This is the same process modern SI uses. They just picked a rather clunkier number than 1/1000 s, so that the metre defined this way is (to the best accuracy available) the same as the length of the old metal bar. Note that "one metre" therefore refers to two logically distinct concepts (an "old metre" and a "new metre") that happen to be the same length to available precision.
Ok, I see. But how did man measure the meter to that degree of precision when a laser hitting a mirror could have a bottleneck time before sending the beam back (there is a process before the beam is sent back)? Then you have the bottleneck of the machine itself (the beam hits the machine before it detects that the beam is back).
 
  • #17
Please cite papers that show these "bottlenecks".

Light's time of flight is known so precisely that your GPS, and your life, depend on it, especially when you fly.

Zz.
 
Last edited:
  • Like
Likes Dale
  • #18
h1a8 said:
But how did man measure the meter with that degree of precision when a laser hitting a mirror could have bottleneck time to send the beam back (there is a process before the beam is sent back)?
Are you talking about the reaction times of electronics? I was only describing an idealised process. Actual experiments are typically incredibly complicated bits of engineering to do something that is often, in principle, almost trivial.

Off the top of my head, though, you could use a setup like Fizeau's original experiment to measure the speed of light, which simply compared a spinning cog wheel to its reflection. When the cog and its reflection look identical the round trip travel time is equal to the time to rotate one tooth. Or you could use the trivial stopwatch method, repeating for many different distances. The travel time of light depends on the distance but your electronics' reaction time does not, so you can measure the reaction time and correct for it.

You can track down papers on measuring the speed of light if you want the gory details. Rømer's measurement, Fizeau's experiment, and the experimental basis of SR FAQ pinned in the relativity forum here might be a start.
 
  • #19
U.S. NIST has this article which touches (in no great detail) on how interferometry allows the meter standard to be realized.
 
  • Like
Likes Ibix
  • #20
h1a8 said:
Ok, I see. But how did man measure the meter to that degree of precision when a laser hitting a mirror could have a bottleneck time before sending the beam back (there is a process before the beam is sent back)? Then you have the bottleneck of the machine itself (the beam hits the machine before it detects that the beam is back).
There isn’t a mirror “bottleneck”. Electronics do have some measurable reaction time, but that is known and can usually be accounted for easily.
 
Last edited:
  • #21
ZapperZ said:
Please cite papers that show these "bottlenecks".

Light's time of flight is known so precisely that your GPS, and your life, depend on it, especially when you fly.

Zz.
No process is instant. Light interacting with matter to bounce the light back may take a process of x of a second. To assume there is no bottleneck is asinine.
 
  • #22
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take a process of x of a second. To assume there is no bottleneck is asinine.
Then it should be easy to find a source showing that.
 
  • #23
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take a process of x of a second. To assume there is no bottleneck is asinine.

In that case, do you also consider the gravity from Alpha Centauri when you calculate all your forces? Just because it is there doesn't mean that it has any significance in our measurement.

To include something that doesn't show up in our measurements and claim that it is a bottleneck, that is asinine, especially when you don't have anything to back it up.

Zz.
 
  • Like
Likes davenn
  • #24
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take a process of x of a second. To assume there is no bottleneck is asinine.
Do you understand how interferometry works? It's basically a race between two portions of a light beam that were split, reflect off mirrors, and then return and re-combine. Even if there were a significant delay due to the reflections, both beams would experience the same amount of delay, so it would cancel out.

The people who do these things really do know what they're doing. The reason they have to keep updating the definitions of standards is that science and industry need more precise standards to do the things they're doing. It's a part of modern life and has been for at least a couple of centuries.
 
  • #25
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take a process of x of a second. To assume there is no bottleneck is asinine.
So, out of curiosity I decided to look for myself to see. I couldn’t find any experimental measurement of any possible delay. However, to estimate the size of any possible delay it seemed reasonable to take the skin depth of about 50 nm in the optical range and divide that by c. It works out to less than 200 attoseconds. The world record on shortest measured time interval is 100 attoseconds, so it is not implausible that it could be measured in the near future. However,
since 200 attoseconds is somewhat small compared to the period of optical light (~1600 attoseconds) it may be a while yet before such an effect becomes measurable.

In any case, it seems safe to dismiss the idea of it being a substantial bottleneck. It is too small to have been previously measured, but it might possibly become measurable in the near future.
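The back-of-envelope estimate above can be reproduced in a couple of lines; the 50 nm skin depth is the stated assumption, not a measured value:

```python
# Rough bound on any reflection delay: optical skin depth divided by c.
C = 299_792_458      # m/s, speed of light in vacuum
SKIN_DEPTH = 50e-9   # m, assumed optical skin depth of a metal mirror

delay = SKIN_DEPTH / C     # seconds
delay_as = delay * 1e18    # attoseconds; ~167 as, under the ~200 as quoted above
```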
 
  • #26
Dale said:
So, out of curiosity I decided to look for myself to see. I couldn’t find any experimental measurement of any possible delay. However, to estimate the size of any possible delay it seemed reasonable to take the skin depth of about 50 nm in the optical range and divide that by c. It works out to less than 200 attoseconds. The world record on shortest measured time interval is 100 attoseconds, so it is not implausible that it could be measured in the near future. However,
since 200 attoseconds is somewhat small compared to the period of optical light (~1600 attoseconds) it may be a while yet before such an effect becomes measurable.

In any case, it seems safe to dismiss the idea of it being a substantial bottleneck. It is too small to have been previously measured, but it might possibly become measurable in the near future.
It's impossible to know the bottleneck size unless you can measure the speed of light with great precision without it having to bounce off mirrors.
 
  • #27
ZapperZ said:
in that case, do you also consider the gravity from alpha Centauri when you calculate all your forces? Just because it is there, doesn't mean that it has any significance in our measurement.

To include something that doesn't show up in our measurement and claim that it is a bottleneck, that is asinine, especially when you don't even have anything to back it up.

Zz.
I didn't claim how large the bottleneck is. I just claimed it exists (which it does). It could be insignificant for all we know. Hell, it could be borderline Planck time. The problem is that WE DON'T KNOW without measuring light very precisely without it having to bounce off mirrors.
 
  • #28
h1a8 said:
It's possible
You meant it isn't possible?

All you have to do is vary the position of the reflecting mirror -- with great precision, because you are searching for a few nm (or less) intercept
 
  • #29
BvU said:
You meant it isn't possible ?

All you have to do is vary the position of the reflecting mirror -- with great precision, because you are searching for a few nm (or less) intercept
I don't quite understand this. Let's say the light interacts with the mirror for 1e-10 of a second before it bounces back. How would we detect this by varying the position of the reflecting mirror?
 
  • #30
h1a8 said:
I just claimed it exists (which it does). It could be insignificant for all we know

You have already been told several times that it is insignificant and doesn't affect our use of the current measurements in science and technology.

You are making a mountain out of a molehill by pushing your pet idea of something that just isn't relevant.

You still haven't provided any links to your so-called bottleneck; you have been asked to do that several times as well.
 
  • Like
Likes BvU
  • #31
h1a8 said:
I don't quite understand this. Let's say the light interacts with the mirror for 1e-10 of a second before it bounces back. How would we detect this by varying the position of the reflecting mirror?
Using my simple approach of bouncing a laser off a mirror: The speed of light is ##c##. The distance to the mirror is ##x##. The delay due to interacting with the mirror and or your sensor electronics is ##T##, and does not depend on the location of the mirror. The round trip time is therefore ##t=T+2x/c##. Measure ##t## for a range of ##x## and you will get a straight line with an intercept. The slope of the line is ##2/c## and the intercept is the delay. Job done.

As noted by Dale, ##T## is most likely immeasurably small.
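As a sanity check of this fitting idea, here is a toy Python simulation with made-up numbers: noiseless timings t = T + 2x/c are generated for a few mirror positions, and an ordinary least-squares line recovers both c (from the slope) and the delay T (from the intercept):

```python
# Toy version of the fit described above: t = T + 2x/c.
C_TRUE = 299_792_458   # m/s, the value to be recovered from the slope
T_TRUE = 2e-10         # s, a hypothetical fixed instrument/mirror delay

xs = [10.0, 20.0, 30.0, 40.0, 50.0]         # mirror distances, metres
ts = [T_TRUE + 2 * x / C_TRUE for x in xs]  # simulated round-trip times, seconds

# Ordinary least squares for t = slope*x + intercept (tiny dataset, no numpy):
n = len(xs)
mean_x = sum(xs) / n
mean_t = sum(ts) / n
slope = (sum((x - mean_x) * (t - mean_t) for x, t in zip(xs, ts))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_t - slope * mean_x

c_recovered = 2 / slope      # ~299792458 m/s
delay_recovered = intercept  # ~2e-10 s: the delay drops out of the slope
```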
 
  • Informative
  • Like
Likes Dale and pbuk
  • #32
davenn said:
You have already been told several times that it is insignificant and doesn't affect our use of the current measurements in science and technology.

You are making a mountain out of a molehill by pushing your pet idea of something that just isn't relevant.

You still haven't provided any links to your so-called bottleneck; you have been asked to do that several times as well.
This is clearly wrong. No one here stated that it was insignificant. One person stated it wasn't a bottleneck but later stated he wasn't sure (he believes it to be insignificant). Even if someone did, you are still wrong (you stated I had been told several times, not once). If someone did state this, then where's the proof?
I never claimed the bottleneck was significant or insignificant. I stated that we don't know. If you disagree then kindly provide proof that the bottleneck is insignificant.

Like I said, it's IMPOSSIBLE to know the bottleneck to a decent degree of precision without measuring the speed of light WITHOUT MIRRORS.

Planck time proves a bottleneck (even if insignificant).
 
Last edited:
  • Skeptical
Likes davenn
  • #33
Ibix said:
Using my simple approach of bouncing a laser off a mirror: The speed of light is ##c##. The distance to the mirror is ##x##. The delay due to interacting with the mirror and or your sensor electronics is ##T##, and does not depend on the location of the mirror. The round trip time is therefore ##t=T+2x/c##. Measure ##t## for a range of ##x## and you will get a straight line with an intercept. The slope of the line is ##2/c## and the intercept is the delay. Job done.

As noted by Dale, ##T## is most likely immeasurably small.
The problem is measuring x to a decent degree of precision. We can definitely measure t to an awesome degree of precision, but not distance without using circular reasoning. But for extremely large distances, x may not need to be precise. I have to think about it.
 
  • #34
h1a8 said:
The problem is measuring x to a decent degree of precision.
You don't measure ##x##; that's how you define ##x##. If ##T## is significant you will find that the line's intercept, ##\lim_{x\rightarrow 0} t = T##, is not zero.

Alternatively, set up two mirrors facing each other and inject pulses so that some bounce off the far mirror and return while some bounce off the far mirror, then the near mirror then the far mirror again before being detected. The first lot have one interaction while the second have three, so the flight time for the second lot minus twice the time for the first lot is ##T##.
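The cancellation in this two-mirror scheme is easy to verify numerically. In the sketch below, the per-reflection delay `TAU` is a made-up number, and both the mirror separation and c drop out of the final combination:

```python
# Two-mirror scheme described above: one-bounce pulses see 2L/c + tau,
# three-bounce pulses see 4L/c + 3*tau, so t_three - 2*t_one isolates tau.
C = 299_792_458   # m/s, speed of light
L = 10.0          # m, mirror separation (arbitrary)
TAU = 1e-10       # s, hypothetical delay per reflection

t_one = 2 * L / C + TAU          # far mirror and back: one reflection
t_three = 4 * L / C + 3 * TAU    # far-near-far and back: three reflections

extracted = t_three - 2 * t_one  # the 4L/c path terms cancel, leaving TAU
```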
 
  • #35
h1a8 said:
I don't quite understand this. Let's say the light interacts with the mirror for 1e-10 of a second before it bounces back. How would we detect this by varying the position of the reflecting mirror?

You could try posting that in the Introductory Physics Homework thread.
 
