Measuring the Speed of Light: How Did We Do It?

AI Thread Summary
The speed of light is defined as 299,792,458 m/s, which was established with high precision in 1975, and the meter was redefined in 1983 based on this measurement. This redefinition was necessary because measurements of the speed of light became more accurate than previous methods of measuring length. The discussion highlights the transition from using physical objects, like a metal bar, to defining units based on fundamental constants for greater precision. Critics argue that this creates circular reasoning, but proponents assert it is a logical redefinition to enhance measurement accuracy. Overall, the shift to defining the meter through the speed of light reflects a broader trend in metrology towards stability and universality in measurement standards.
h1a8
TL;DR Summary
To avoid a circular definition, what techniques and equipment are used to measure the meter?
I understand that the meter is defined from the speed of light (distance light travels in 1/299792458 of a second). But how did man measure this exact distance to this level of precision? With any apparatus, isn't there an unknown amount of bottleneck somewhere?
 
h1a8 said:
Summary: To avoid a circular definition, what techniques and equipment are used to measure the meter?

I understand that the meter is defined from the speed of light (distance light travels in 1/299792458 of a second). But how did man measure this exact distance to this level of precision? With any apparatus, isn't there an unknown amount of bottleneck somewhere?
Definitions don't have measurement error.
 
russ_watters said:
Definitions don't have measurement error.
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?
 
BvU said:
Consult https://en.wikipedia.org/wiki/International_System_of_Units for the definitions

For physical constants measurements and fitting procedure there is https://www.nist.gov/pml/fundamental-physical-constants

(full fitting article here)

Google measuring the speed of light -- get this as a starting point

And, to come back to the thread title: https://en.wikipedia.org/wiki/History_of_the_metre
Basically the speed of light is defined off measurements. In 1975, the speed of light was measured to be 299,792,458 m/s with a precision of 4 parts per billion. The meter was already defined before that.
Then you're telling me that the meter was redefined in 1983 from this 299,792,458 number? That makes no sense whatsoever. That's complete circular reasoning.
 
h1a8 said:
Basically the speed of light is defined off measurements. In 1975, the speed of light was measured to be 299,792,458 m/s with a precision of 4 parts per billion.

You can't have it both defined and measured. You choose one or the other. In the past the speed of light was measured using the definition of the meter. Ever since 1983 you now measure the length of a meter and define the speed of light.

The meter was already defined before that. Then you're telling me that the meter was redefined in 1983 from this 299,792,458 number? That makes no sense whatsoever. That's complete circular reasoning.

There's nothing circular about it. It's an issue of precision. When measurements of the speed of light became more precise than our ability to measure a length, the definition was changed. What would you have done under these circumstances? Continue to define the meter as the distance between two scratches on a metal bar when it's possible to define it in a more precise manner?
 
h1a8 said:
That's complete circular reasoning.
It is not circular reasoning, it is a redefinition.

Suppose an experiment pre-1983 measured the speed of light. Then, post-1983 that same experiment would measure the length of the experimental light path.

No circularity involved, and the advantage of the redefinition is that the accuracy of the resulting new meter was substantially higher than what could be achieved using the previous standard.
 
h1a8 said:
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?
For a distance like that, I'd probably use the odometer on my car, but I'm not sure I see the relevance of the question. We don't actually need to visually observe measurements. In fact, the human eye is a pretty poor measuring device.
 
h1a8 said:
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?
You get a very accurate clock, a laser, and a mirror. Then you start firing laser pulses at the mirror and timing how long they take to return. Move the mirror until the pulse round trip time is 2/1000 s, and then you know that your mirror is one arctec away.

This is the same process modern SI uses. They just picked a rather clunkier number than 1/1000 s, so that the metre defined this way is (to the best accuracy available) the same as the length of the old metal bar. Note that "one metre" therefore refers to two logically distinct concepts (an "old metre" and a "new metre") that happen to be the same length to available precision.
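A minimal Python sketch of that procedure (the "arctec" and its 1/1000 s round trip are the thread's invented unit, used only for illustration):

```python
# Sketch of the round-trip timing idea above: once c is fixed by
# definition, a measured round-trip time converts directly to a distance.
# The "arctec" is the thread's invented unit, not anything standard.

C = 299_792_458.0   # speed of light in m/s, exact by definition since 1983

def one_way_distance(round_trip_time_s: float) -> float:
    """Distance to the mirror implied by a round-trip time of flight."""
    return C * round_trip_time_s / 2.0

# One arctec = the distance light travels in 1/1000 s,
# i.e. a round-trip time of 2/1000 s:
one_arctec_m = one_way_distance(2.0 / 1000.0)
print(one_arctec_m)   # 299792.458 m, roughly 300 km
```

With the second fixed by the caesium clock and c fixed by definition, the only quantity actually measured here is the time.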
 
  • #10
h1a8 said:
That's complete circular reasoning.
It's not reasoning, but a definition.
 
  • #11
Since last year the SI has been completely revised (reusing, of course, as many of the older definitions as possible), and now, except for the second, all base units are implicitly defined by giving exact values to the fundamental constants of nature; the definitions depend on each other.

Everything starts with the definition of the second, which is the only one that uses a specific material to define it, i.e., a hyperfine transition of Cs:

The second, symbol s, is the SI unit of time. It is defined by taking the fixed numerical value of the caesium frequency ##\Delta \nu_{\text{Cs}}##, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom, to be 9192631770 when expressed in the unit Hz, which is equal to ##\text{s}^{-1}##.

All the other units are defined by defining precise values to the fundamental constants of nature. Given the definition of the second that's why lengths are defined by giving the speed of light in a vacuum a precise value:

The metre, symbol m, is the SI unit of length. It is defined by taking the fixed numerical value of the speed of light in vacuum c to be 299792458 when expressed in the unit ##\text{m} \text{s}^{-1}##, where the second is defined in terms of the caesium frequency ##\Delta \nu_{\text{Cs}}##.

Then it goes on with the kg:

The kilogram, symbol kg, is the SI unit of mass. It is defined by taking the fixed numerical value of the Planck constant h to be ##6.62607015 \cdot 10^{-34}## when expressed in the unit J s, which is equal to ##\text{kg} \text{m}^2 \text{s}^{-1}##, where the metre and the second are defined in terms of ##c## and ##\Delta \nu_{\text{Cs}}##.

So it goes on for all the rest of the units. There's nothing circular but all this builds a consistent web of definitions of all the base units of the SI and with them all the units defined within it.

See
https://en.wikipedia.org/wiki/2019_redefinition_of_the_SI_base_units
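As a rough illustration of this chain (my own sketch, not part of the SI text): because ##\Delta \nu_{\text{Cs}}## and ##c## are both exact numbers, the metre corresponds to a definite, computable count of caesium hyperfine periods.

```python
# The exact defining constants quoted above (all are definitions, not
# measurements, in the post-2019 SI):
DELTA_NU_CS = 9_192_631_770      # Hz: fixes the second
C = 299_792_458                  # m/s: with the second, fixes the metre
H = 6.62607015e-34               # J s: with the metre and second, fixes the kg

# One metre is the distance light covers in 1/299792458 s. Expressed as a
# count of caesium hyperfine periods, that interval is:
cs_periods_per_metre = DELTA_NU_CS / C
print(cs_periods_per_metre)      # ~30.66 periods of the Cs transition per metre

# Each unit is pinned down by the previously defined ones:
# a chain of definitions, not a circle.
```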
 
  • #12
Mister T said:
You can't have it both defined and measured. You choose one or the other. In the past the speed of light was measured using the definition of the meter. Ever since 1983 you now measure the length of a meter and define the speed of light.
There's nothing circular about it. It's an issue of precision. When measurements of the speed of light became more precise than our ability to measure a length, the definition was changed. What would you have done under these circumstances? Continue to define the meter as the distance between two scratches on a metal bar when it's possible to define it in a more precise manner?
Dale said:
It is not circular reasoning, it is a redefinition.

Suppose an experiment pre-1983 measured the speed of light. Then, post-1983 that same experiment would measure the length of the experimental light path.

No circularity involved, and the advantage of the redefinition is that the accuracy of the resulting new meter was substantially higher than what could be achieved using the previous standard.

As an illustration, if I remember correctly, the coulomb and the ampere were alternately defined by an electrochemical-gravimetric and an electromagnetic (force) definition, according to whichever was the most precisely measurable at the time.

Whenever there is a redefinition it is always made so that the new unit is equal to the old one (within the range of uncertainty of the latter), so for many purposes the change is not noticed; otherwise access to all previous knowledge would become very complicated. E.g. a cubic cm of water still weighs close to 1 g at 4 °C despite redefinitions of length, force and temperature. But that is why, although the French Revolutionaries designed their system to be simple, we now have peculiar numbers in the definitions that are neither self-evident nor easily memorable.
 
  • #13
AFAIK the most severe changes in the 2019 redefinition are in the electromagnetic units (on the order of ##10^{-9}## relative uncertainty).
 
  • #14
h1a8 said:
I can define the speed of light as 1000 arctecs a second. I can also define the arctec as the length light travels in 1/1000 of a second. But how do I visually quantify an arctec of distance? Do I measure the speed of light visually?

That's sort of what actually happened: the meter was originally defined as "one ten-millionth of the distance from the equator to the North Pole along a great circle", and the second was originally defined as "1/86,400 of the mean solar day". Under those (pre-1983) definitions, c_0 was a measurable quantity with measurement uncertainty, most accurately obtained as 299792.4562 ± 0.0011 km/s.

In 1983, the meter was re-defined in terms of c_0: sort of an inversion of the prior definitions. c_0 was defined to be *exactly* 299,792,458 m/s, with NO measurement uncertainty, and that specific number was chosen so that pre-1983 physical length standards (rulers) remained in very close agreement with post-1983 standards. (The second had already been redefined atomically, in 1967, in terms of the caesium transition; one consequence of decoupling timekeeping from Earth's rotation is that we now need 'leap seconds' to keep our calendars properly synchronized with the sun.)

So, to answer your question, the length of 1 meter can be most accurately measured with a clock. I believe the most accurate clocks have a residual uncertainty of about 1 part in 10^18: https://www.sciencedaily.com/releases/2016/02/160210134952.htm

Similar machinations occurred for the values of the vacuum permittivity and permeability.
 
  • #15
h1a8 said:
Basically the speed of light is defined off measurements. In 1975, the speed of light was measured to be 299,792,458 m/s with a precision of 4 parts per billion. The meter was already defined before that.
Then you're telling me that the meter was redefined in 1983 from this 299,792,458 number? That makes no sense whatsoever. That's complete circular reasoning.

What makes no sense is to define something based on a slab of metal.

We now want to define every standard measurement based on a physical constant. Even the kg has now been redefined so that it ties into a physical constant rather than some piece of substance (which was slowly losing mass) sitting in some place. This makes the scale more stable and universally defined, rather than tied to a particular object. You can even communicate with a being in a different part of the universe by telling them that we measure time based on the frequency of a Cs transition, and they'll know how long our unit of time is. That makes our measurement scale universally derivable and not specific to our small, insignificant corner of the world.

Zz.
 
  • #16
Ibix said:
You get a very accurate clock, a laser, and a mirror. Then you start firing laser pulses at the mirror and timing how long they take to return. Move the mirror until the pulse round trip time is 2/1000 s, and then you know that your mirror is one arctec away.

This is the same process modern SI uses. They just picked a rather clunkier number than 1/1000 s, so that the metre defined this way is (to the best accuracy available) the same as the length of the old metal bar. Note that "one metre" therefore refers to two logically distinct concepts (an "old metre" and a "new metre") that happen to be the same length to available precision.
Ok, I see. But how did man measure the meter with that degree of precision when a laser hitting a mirror could have a bottleneck time before sending the beam back (there is a process before the beam is sent back)? Then you have the bottleneck of the machine itself (the beam hits the machine before it detects that the beam is back).
 
  • #17
Please cite papers that show these "bottlenecks".

The precision of light's time of flight is so well known that your GPS and your life depend on it, especially when you fly.

Zz.
 
  • #18
h1a8 said:
But how did man measure the meter with that degree of precision when a laser hitting a mirror could have bottleneck time to send the beam back (there is a process before the beam is sent back)?
Are you talking about the reaction times of electronics? I was only describing an idealised process. Actual experiments are typically incredibly complicated bits of engineering to do something that is often, in principle, almost trivial.

Off the top of my head, though, you could use a setup like Fizeau's original experiment to measure the speed of light, which simply compared a spinning cog wheel to its reflection. When the cog and its reflection look identical the round trip travel time is equal to the time to rotate one tooth. Or you could use the trivial stopwatch method, repeating for many different distances. The travel time of light depends on the distance but your electronics' reaction time does not, so you can measure the reaction time and correct for it.

You can track down papers on measuring the speed of light if you want the gory details. Romer's measurement, Fizeau's experiment, and the experimental basis of SR FAQ pinned in the relativity forum here might be a start.
 
  • #19
U.S. NIST has this article which touches (in no great detail) on how interferometry allows the meter standard to be realized.
 
  • #20
h1a8 said:
Ok, I see. But how did man measure the meter with that degree of precision when a laser hitting a mirror could have a bottleneck time before sending the beam back (there is a process before the beam is sent back)? Then you have the bottleneck of the machine itself (the beam hits the machine before it detects that the beam is back).
There isn’t a mirror “bottleneck”. Electronics do have some measurable reaction time but that is usually known and can usually be accounted for easily.
 
  • #21
ZapperZ said:
Please cite papers that show these "bottlenecks".

The precision of light's time of flight is so well known that your GPS and your life depend on it, especially when you fly.

Zz.
No process is instant. Light interacting with matter to bounce the light back may take some fraction x of a second. To assume there is no bottleneck is asinine.
 
  • #22
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take some fraction x of a second. To assume there is no bottleneck is asinine.
Then it should be easy to find a source showing that.
 
  • #23
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take some fraction x of a second. To assume there is no bottleneck is asinine.

In that case, do you also consider the gravity from Alpha Centauri when you calculate all your forces? Just because it is there doesn't mean that it has any significance in our measurement.

To include something that doesn't show up in our measurement and claim that it is a bottleneck, that is asinine, especially when you don't even have anything to back it up.

Zz.
 
  • #24
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take some fraction x of a second. To assume there is no bottleneck is asinine.
Do you understand how interferometry works? It's basically a race between two portions of a light beam that were split, reflect off mirrors, and then return and re-combine. Even if there were a significant delay due to the reflections, both beams would experience the same amount of delay, so it would cancel out.

The people who do these things really do know what they're doing. The reason they have to keep updating the definitions of standards is that science and industry need more precise standards to do the things they're doing. It's a part of modern life and has been for at least a couple of centuries.
 
  • #25
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take some fraction x of a second. To assume there is no bottleneck is asinine.
So, out of curiosity, I decided to look for myself. I couldn't find any experimental measurement of any possible delay. However, to estimate the size of any possible delay it seemed reasonable to take the skin depth of about 50 nm in the optical range and divide that by c. It works out to less than 200 attoseconds. The world record for the shortest measured time interval is 100 attoseconds, so it is not implausible that it could be measured in the near future. However, since 200 attoseconds is somewhat small compared to the period of optical light (~1600 attoseconds), it may be a while yet before such an effect becomes measurable.

In any case, it seems safe to dismiss the idea of it being a substantial bottleneck. It is too small to have been previously measured, but it might possibly become measurable in the near future.
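The arithmetic behind that estimate is easy to reproduce; the 50 nm skin depth is the figure quoted above, and the ~500 nm wavelength is an assumed typical visible value:

```python
# Reproducing the back-of-envelope numbers above: optical skin depth
# divided by c as a crude bound on any reflection delay. 50 nm is the
# skin-depth figure quoted in the post; 500 nm is an assumed typical
# visible wavelength.

C = 299_792_458.0              # m/s
skin_depth_m = 50e-9           # ~50 nm metallic skin depth at optical frequencies

delay_s = skin_depth_m / C
print(delay_s)                 # ~1.7e-16 s, i.e. under 200 attoseconds

optical_period_s = 500e-9 / C  # one period of ~500 nm light, ~1700 attoseconds
print(delay_s / optical_period_s)  # 0.1: a tenth of one optical cycle
```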
 
  • #26
Dale said:
So, out of curiosity, I decided to look for myself. I couldn't find any experimental measurement of any possible delay. However, to estimate the size of any possible delay it seemed reasonable to take the skin depth of about 50 nm in the optical range and divide that by c. It works out to less than 200 attoseconds. The world record for the shortest measured time interval is 100 attoseconds, so it is not implausible that it could be measured in the near future. However, since 200 attoseconds is somewhat small compared to the period of optical light (~1600 attoseconds), it may be a while yet before such an effect becomes measurable.

In any case, it seems safe to dismiss the idea of it being a substantial bottleneck. It is too small to have been previously measured, but it might possibly become measurable in the near future.
It's impossible to know the bottleneck size unless you can measure the speed of light with great precision without it having to bounce off mirrors.
 
  • #27
ZapperZ said:
in that case, do you also consider the gravity from alpha Centauri when you calculate all your forces? Just because it is there, doesn't mean that it has any significance in our measurement.

To include something that doesn't show up in our measurement and claim that it is a bottleneck, that is asinine, especially when you don't even have anything to back it up.

Zz.
I didn't claim how large the bottleneck is. I just claimed that it exists (which it does). It could be insignificant for all we know. Hell, it could be borderline Planck time. The problem is that WE DON'T KNOW without measuring light very precisely without it having to bounce off mirrors.
 
  • #28
h1a8 said:
It's possible
You meant it isn't possible ?

All you have to do is vary the position of the reflecting mirror -- with great precision, because you are searching for a few nm (or less) intercept
 
  • #29
BvU said:
You meant it isn't possible ?

All you have to do is vary the position of the reflecting mirror -- with great precision, because you are searching for a few nm (or less) intercept
I don't quite understand this. Let's say the light interacts with the mirror for 1e-10 of a second before it bounces back. How would we detect this by varying the position of the reflecting mirror?
 
  • #30
h1a8 said:
I just claimed it exists (which it does). It could be insignificant for all we know

You have already been told several times that it is insignificant and doesn't affect our use of the current measurements in science and technology.

You are making a mountain out of a molehill by pushing your pet idea of something that just isn't relevant.

You still haven't provided any links to your so-called bottleneck, though you have been asked to do that several times as well.
 
  • #31
h1a8 said:
I don't quite understand this. Let's say the light interacts with the mirror for 1e-10 of a second before it bounces back. How would we detect this by varying the position of the reflecting mirror?
Using my simple approach of bouncing a laser off a mirror: The speed of light is ##c##. The distance to the mirror is ##x##. The delay due to interacting with the mirror and/or your sensor electronics is ##T##, and does not depend on the location of the mirror. The round trip time is therefore ##t=T+2x/c##. Measure ##t## for a range of ##x## and you will get a straight line with an intercept. The slope of the line is ##2/c## and the intercept is the delay. Job done.

As noted by Dale, ##T## is most likely immeasurably small.
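As a sketch of how that fit works in practice (with an invented delay T and noiseless data, standard library only):

```python
# A toy version of the fit described above: simulate round-trip times
# t = T + 2x/c for several mirror positions, then recover c from the
# slope and the fixed delay T from the intercept. T's value here is
# invented purely for illustration; the data are noiseless.

C = 299_792_458.0                  # m/s
T_TRUE = 1.0e-10                   # s, hypothetical fixed instrumental delay

xs = [10.0, 20.0, 50.0, 100.0, 200.0]        # mirror distances, m
ts = [T_TRUE + 2.0 * x / C for x in xs]      # simulated round-trip times

# Ordinary least-squares line through (x, t):
n = len(xs)
mx = sum(xs) / n
mt = sum(ts) / n
slope = sum((x - mx) * (t - mt) for x, t in zip(xs, ts)) / sum((x - mx) ** 2 for x in xs)
intercept = mt - slope * mx

c_recovered = 2.0 / slope
print(c_recovered)    # recovers c from the slope
print(intercept)      # recovers T: the delay, independent of mirror position
```

The key point is visible in the algebra: the intercept does not depend on where the mirror sits, so any fixed delay separates cleanly from the distance-dependent term.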
 
  • #32
davenn said:
You have already been told several times that it is insignificant and doesn't affect our use of the current measurements in science and technology.

You are making a mountain out of a molehill by pushing your pet idea of something that just isn't relevant.

You still haven't provided any links to your so-called bottleneck, though you have been asked to do that several times as well.
This is clearly wrong. No one here stated that it was insignificant. One person stated it wasn't a bottleneck but later stated he wasn't sure (he believes it to be insignificant). Even if someone had, you would still be wrong (you stated I had been told several times, not once). If someone did state this, then where's the proof?
I never claimed the bottleneck was significant or insignificant. I stated that we don't know. If you disagree then kindly provide proof that the bottleneck is insignificant.

Like I said, it's IMPOSSIBLE to know the bottleneck to a decent degree of precision without measuring the speed of light WITHOUT MIRRORS.

Planck time proves a bottleneck exists (even if insignificant).
 
  • #33
Ibix said:
Using my simple approach of bouncing a laser off a mirror: The speed of light is ##c##. The distance to the mirror is ##x##. The delay due to interacting with the mirror and/or your sensor electronics is ##T##, and does not depend on the location of the mirror. The round trip time is therefore ##t=T+2x/c##. Measure ##t## for a range of ##x## and you will get a straight line with an intercept. The slope of the line is ##2/c## and the intercept is the delay. Job done.

As noted by Dale, ##T## is most likely immeasurably small.
The problem is measuring x to a decent degree of precision. We can definitely measure t to an awesome degree of precision, but not distance without using circular reasoning. But for extremely large distances, x may not need to be precise. I have to think about it.
 
  • #34
h1a8 said:
The problem is measuring x to a decent degree of precision.
You don't measure ##x##. That's how you define ##x##. If ##T## is significant you will find that your definition of ##\lim_{t\rightarrow 0}x## is not zero.

Alternatively, set up two mirrors facing each other and inject pulses so that some bounce off the far mirror and return while some bounce off the far mirror, then the near mirror then the far mirror again before being detected. The first lot have one interaction while the second have three, so the flight time for the second lot minus twice the time for the first lot is ##T##.
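A quick numerical check of that bookkeeping, with arbitrary made-up values for the mirror separation and the per-reflection delay:

```python
# A numerical check of the two-mirror bookkeeping: single-bounce pulses
# cover 2d with one reflection, triple-bounce pulses cover 4d with three.
# The separation d and per-reflection delay T below are arbitrary
# made-up values; the point is that d cancels out of t2 - 2*t1.

C = 299_792_458.0
d = 123.456          # mirror separation, m (arbitrary; cancels)
T = 7.0e-11          # hypothetical per-reflection delay, s

t1 = 2.0 * d / C + 1 * T     # out to the far mirror and back
t2 = 4.0 * d / C + 3 * T     # far, near, far, then back to the detector

print(t2 - 2.0 * t1)         # equals T, with no distance measurement needed
```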
 
  • #35
h1a8 said:
I don't quite understand this. Let's say the light interacts with the mirror for 1e-10 of a second before it bounces back. How would we detect this by varying the position of the reflecting mirror?

You could try posting that in the Introductory Physics Homework thread.
 
  • #36
h1a8 said:
It's impossible to know the bottleneck size unless you can measure the speed of light with great precision without it having to bounce off mirrors.
It is not true that that is the only way to estimate it. However, we can measure c without mirrors, we have done so, and we have found no significant difference.

h1a8 said:
I never claimed the bottleneck was significant or insignificant.
The term “bottleneck” refers to the rate limiting step of a process. So if there even is a delay at all then the fact that it is so small as to be immeasurable indicates that it is not the rate limiting step and therefore not a bottleneck. You cannot have an insignificant bottleneck.
 
  • #37
h1a8 said:
No process is instant. Light interacting with matter to bounce the light back may take some fraction x of a second. To assume there is no bottleneck is asinine.
What if the electronics are only recording the measurement and not creating the measurement? You should read the linked article on interferometry, because the best measurements are not like measuring distance with a stopwatch. Interferometry is closer to measuring it with a ruler. Signal and processing delay are not a factor.
 
  • #38
h1a8 said:
We can definitely measure t to an awesome degree of precision, but not distance without using circular reasoning.

It appears you don't want to listen to what we're telling you. When we measure distances all we are doing is comparing the lengths of different objects. We know a skyscraper is taller than a person, there is nothing circular about that.

Do you really think that you're right and the tens of thousands of metrologists spread all across the globe are wrong?

When you need a new roof on your house and measurements of lengths are used to determine the cost do you reject the contractor's estimates because he's using circular reasoning to bill you?
 
  • #39
russ_watters said:
What if the electronics are only recording the measurement and not creating the measurement? You should read the linked article on interferometry, because the best measurements are not like measuring distance with a stopwatch. Interferometry is closer to measuring it with a ruler. Signal and processing delay are not a factor.
I was referring to the delay of the beam interacting with the mirror. An interferometer already has the distances set. But what if the distances are off? (Using bounced light to measure distances accurately is circular reasoning.)

Mister T said:
It appears you don't want to listen to what we're telling you. When we measure distances all we are doing is comparing the lengths of different objects. We know a skyscraper is taller than a person, there is nothing circular about that.

Do you really think that you're right and the tens of thousands of metrologists spread all across the globe are wrong?

When you need a new roof on your house and measurements of lengths are used to determine the cost do you reject the contractor's estimates because he's using circular reasoning to bill you?

I don't understand. The theory was to calculate the delay by varying the distance. But the distance can be off by a significant amount. ##t = T + (2/c)x##

Dale said:
It is not true that that is the only way to estimate it. However, we can measure c without mirrors and have done and have found no significant difference.

The term “bottleneck” refers to the rate limiting step of a process. So if there even is a delay at all then the fact that it is so small as to be immeasurable indicates that it is not the rate limiting step and therefore not a bottleneck. You cannot have an insignificant bottleneck.

I would like to know what other ways man measures light with the same degree of accuracy without using mirrors. I still don't see a way man determines the delay is insignificant without proof. The linear-fit method, where the intercept is the delay, is faulty if x, the distance, is not measured to a sufficient degree of accuracy. And man usually measures distance with great accuracy by bouncing light (which becomes circular).
 
  • #40
h1a8 said:
An interferometer already has the distances set. But what if the distances are off (using bouncing light to measure distances accurately is using circular reasoning).
That isn't true. Did you not read the article? An interferometer isn't measuring against its own length; that would be as pointless as using a hunk of metal alone!

The interferometer has one stationary and one movable mirror, and the signal is tuned to create the interference pattern, enabling precise measurement of the wavelength. The locations of the mirrors at the start aren't important; what matters is how far you move the movable mirror, locating a starting and ending point. Here's another article with more detail:
https://www.renishaw.com/en/interferometry-explained--7854
h1a8 said:
I still don't see a way man determines the delay is insignificant without proof.
As they say, the proof is in the pudding. If the error was actually unknown and actually much larger than believed, then the things people do that require accurate measurements would not work.
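A bare-bones sketch of the fringe-counting idea (the helium-neon wavelength is a typical, assumed choice, not something specified in the linked article):

```python
# Fringe counting: each half-wavelength of mirror travel shifts the
# interference pattern by one full fringe, so a displacement is measured
# by counting fringes rather than by timing a pulse. The HeNe laser
# wavelength here is a typical, assumed choice.

WAVELENGTH_M = 632.8e-9      # helium-neon laser line, m

def displacement_from_fringes(fringe_count: int) -> float:
    """Mirror displacement implied by a count of observed fringes."""
    return fringe_count * WAVELENGTH_M / 2.0

# About 3.16 million fringes corresponds to moving the mirror about 1 m:
d = displacement_from_fringes(3_161_000)
print(d)    # ~1.0 m: length from counting wavelengths, not electronics timing
```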
 
  • #41
Suppose there is a significant delay at a mirror. It can be determined. It would be a constant delay even if the distances were doubled, tripled, or even increased by orders of magnitude. That constant term could be easily calculated and adjusted for. It would be the constant term in a linear fit of results from different distances to the mirror.
 
  • #42
h1a8 said:
The linear line method where the intercept is the delay is faulty if x, the distance, is not measured to a sufficient degree of accuracy. And man usually measures distance with great accuracy by bouncing light (which becomes circular).
We have been over this already. There is no circularity involved.

Pre-1983 the length of a meter was measured by counting wavelengths of the light from a particular light source, not “bouncing light”. Thus the measurements of the speed of light were non-circular, but the single greatest source of error was the extant standard for length. I.e. the bottleneck was the length standard and all other errors combined (including any mirrors) were about 1/6 of the error of the length standard.

As a result we changed the standard of length to the speed of light. This resulted in an immediate improvement in the precision of length measurements by a factor of 6. It also meant that the speed of light was no longer measured at all but was defined exactly.

So, once again, there is no circularity in the measurement of the speed of light. If you are “bouncing light” to measure distance then the speed of light is known exactly. If you are measuring the speed of light then you are using pre-1983 meters and so you are not “bouncing light“ to measure distance.
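Since 1983, "bouncing light" to measure distance is just multiplication by an exactly defined constant. A minimal sketch of a time-of-flight range measurement under the current definition:

```python
# Post-1983: the speed of light is exact by definition, so a measured
# round-trip time converts to distance with no uncertainty from c itself.

C = 299_792_458  # m/s, exact by definition of the metre

def one_way_distance(round_trip_time):
    """Distance to a reflector, given the round-trip light travel time."""
    return C * round_trip_time / 2

# A 2-microsecond round trip corresponds to about 299.79 m.
print(one_way_distance(2e-6))
```

All the measurement uncertainty now lives in the clock, which is exactly why the definition was changed.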
 
Last edited:
  • Like
Likes sophiecentaur and russ_watters
  • #43
h1a8 said:
I don't understand.

When the length of an object is measured it's always a comparison to another object's length. There are only three possibilities: Suppose you have two objects, the lengths of which are ##A## and ##B##. Either ##A>B, A=B,## or ##A<B##. Since there is no circularity in that process, your claim that measuring a length is circular is false.
 
  • #44
The whole point about choice of a definition is repeatability. People on the planet Zog should be in a position to build up a measurement system, identical to what's used on Earth without needing to have the King's big toe available everywhere in order to start off.

For a satisfactory system of units, everything has to pivot about quantities that can be reproduced in any Lab, anywhere. So atoms of a well behaved element are great for defining time and that gives you distance. Numbers of similar atomic nuclei (again, something that is chemically stable etc etc) can define your Mass unit. The electronic charge is a great way to start on Electrical units.

We suffer from a history in which none of the above were available when units were first defined, but that is no longer the case.
 
  • Like
Likes Dale and BvU
  • #45
russ_watters said:
As they say, the proof is in the pudding. If the error was actually unknown and actually much larger than believed, then the things people do that require accurate measurements would not work.
The questioner may not be aware of the very common experimental method of finding and eliminating 'offsets' by repeating an experiment with different known values and subtracting the results. Unknown errors become known errors.
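The offset-elimination trick can be shown with two hypothetical timings: measure at two different distances and subtract, and any constant, unknown delay cancels out (the 5 ns offset below is invented for illustration):

```python
# Differential measurement: an unknown constant offset cancels when
# two measurements at known, different distances are subtracted.

C = 299_792_458.0   # m/s, speed of light
OFFSET = 5e-9       # s, unknown constant delay hidden in the apparatus

d1, d2 = 300.0, 900.0   # m, two known path lengths
t1 = d1 / C + OFFSET    # what the instrument actually records
t2 = d2 / C + OFFSET

# The offset drops out of the difference, leaving pure travel time.
speed = (d2 - d1) / (t2 - t1)
print(speed)  # ~2.998e8 m/s, unaffected by OFFSET
```

The unknown error never needs to be measured directly; it only needs to be constant.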
 
  • Like
Likes FactChecker and russ_watters
  • #46
sophiecentaur said:
So atoms of a well behaved element are great for defining time and that gives you distance.
It does give you distance because, of course, there is a reproducible way of measuring the speed of a light beam in a vacuum.
 
  • Like
Likes sophiecentaur
  • #47
If and when we ever get to chat with the Scientists on Planet Zog, it would be a serious ego trip to find they arrived at the same philosophy as us. Otoh, it could be a gobsmacking surprise when we find that it's all based on King Zog's toe!
 
  • #48
There seems to be a bottleneck of understanding here, combined with a fear of mirrors.

A mirror for an EM wave can be made from a flat conductive sheet. The reflection is from the surface layer of conductive atoms. If the wave penetrated more than one atom deep it would suffer multiple internal reflections which would increase energy losses within the mirror.

We can know that the zone of reflection must be spread over a depth of less than λ/4, or we would see destructive interference of reflected light.

An incident magnetic field causes a perpendicular current to flow on a conductive surface. That re-generates a perpendicular magnetic field, now opposite to the incident field. Turning left twice is the same as going back the way you came, i² = –1, reflect on that. The incident and reversed fields cancel into the mirror, so the incident energy must be carried away from the mirror in a reflected wave.

Since the incident and induced fields cancel into the mirror, the time needed to reverse must be very close to zero, or the phases into the mirror would not cancel, and the mirror would be lossy, making an inefficient reflector of energy.

This all suggests that the time delay of a mirror is less than the time needed to travel the ionic radius of a conductive atom. We do not know where the effective reflective surface of a mirror is, until we look near the face of the mirror, at standing waves formed between the incident and reflected rays. But it is the reflective surface we are interested in, so there is no problem, and no delay.
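To put a number on that bound: the time for light to cross one ionic radius is astonishingly small. A back-of-the-envelope sketch (the ~100 pm radius is a typical order of magnitude, not a specific material):

```python
# Upper bound on mirror delay: the time for light to cross one ionic radius.

C = 299_792_458.0    # m/s, speed of light
ION_RADIUS = 1e-10   # m, typical order of magnitude for an ionic radius

delay_bound = ION_RADIUS / C
print(delay_bound)   # ~3.3e-19 s

# Compare with the round-trip time over a 1 m baseline:
round_trip = 2.0 / C
print(delay_bound / round_trip)  # ~5e-11, utterly negligible
```

Even over a 1 m baseline the bound is eleven orders of magnitude below the travel time being measured.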
 
  • Like
Likes Klystron, FactChecker, Dale and 1 other person
  • #49
Baluncore said:
combined with a fear of mirrors

Catoptrophobia ?
 
  • Like
Likes Klystron, Baluncore, sophiecentaur and 1 other person