How do we know the speed of light is constant?

In summary: We measure time with clocks and distance with rulers, using the second and the meter as our standard units. The second was originally defined as a fraction of the mean solar day and is now defined by atomic clocks; the meter was once defined by a platinum standard bar and is now defined as the distance light travels in 1/299,792,458 of a second.
  • #1
Tapsnap
How do we know the speed of light is constant?
Is it possible that we only perceive it as constant?
 
  • #2
It's not just the speed at which light travels; it's the speed of any massless object through space. More fundamentally, it's the relation between space and time.
 
  • #3
You need to study Maxwell's equations and the Einstein field equations to understand the need for a constant speed of light.
 
  • #4
Because we have at least half a century of precise measurements that show that it is.
 
  • #5
Tapsnap said:
How do we know the speed of light is constant?
Is it possible that we only perceive it as constant?

Strictly speaking, it's not something that we "know" - it's a postulate, and is explicitly described as such in Einstein's 1905 paper introducing special relativity.

However, it is a very plausible postulate.

First, we have over a century of increasingly accurate results (Mad Scientist said a "half century"; he's understating the case) some of which are described in the sticky thread at the top of this forum, suggesting that nature really does behave that way.

Second, the speed of light in a vacuum can be calculated from the laws of electricity and magnetism (Maxwell, 1861) so any non-constancy in that speed would imply a corresponding non-constancy in the laws of E&M (for example, electromagnetism on the surface of the Earth would behave differently at noon and at midnight, and in June and December, as the Earth is moving in different directions at different speeds).
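
As a quick numerical check of that relation: the vacuum speed follows from the vacuum permeability and permittivity as c = 1/√(μ₀ε₀). Here is a minimal sketch using the standard SI values of those constants:

```python
import math

# Vacuum permeability and permittivity (approximate SI values)
mu_0 = 4 * math.pi * 1e-7       # H/m
epsilon_0 = 8.8541878128e-12    # F/m

# Maxwell's relation: c = 1 / sqrt(mu_0 * epsilon_0)
c = 1 / math.sqrt(mu_0 * epsilon_0)
print(f"c = {c:,.0f} m/s")      # ~299,792,458 m/s
```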
 
  • #6
In any case, I suggest that you review some of the other threads on this topic. You'll likely find that your followup questions and objections have already been discussed. If after that you still have questions... ask away.
 
  • #7
Nugatory said:
Mad Scientist said a "half century"; he's understating the case
I did say 'at least'!
 
  • #8
Mad Scientist said:
I did say 'at least'!
:smile: :smile:
 
  • #9
Nugatory said:
Strictly speaking, it's not something that we "know" - it's a postulate, and is explicitly described as such in Einstein's 1905 paper introducing special relativity.

However, it is a very plausible postulate.

First, we have over a century of increasingly accurate results (Mad Scientist said a "half century"; he's understating the case) some of which are described in the sticky thread at the top of this forum, suggesting that nature really does behave that way.

Second, the speed of light in a vacuum can be calculated from the laws of electricity and magnetism (Maxwell, 1861) so any non-constancy in that speed would imply a corresponding non-constancy in the laws of E&M (for example, electromagnetism on the surface of the Earth would behave differently at noon and at midnight, and in June and December, as the Earth is moving in different directions at different speeds).
Just a quick note: nowadays we know the laws of electricity and magnetism are not exactly Maxwell's; they are a good approximation to a quantum field theory (QED). But of course Einstein had no way to foresee this in 1905.
 
  • #10
Tapsnap said:
How do we know the speed of light is constant?
Is it possible that we only perceive it as constant?

Being scientists, and not philosophers, we don't worry so much about issues of perception, but we obviously do need standards that define how we measure distance and time and that serve as operational definitions.

If your primary interest is philosophy, you'll probably need to find another forum, because we no longer discuss philosophy on PF. The reasons for this were varied; the ultimate decision was made by the PF owner, though he was certainly guided by comments from the PF staff, other PF posters, and experience. Among the issues noted with philosophical discussions on PF were endless arguments and a lack of academic rigor.

But we can talk about standards and operational meanings / definitions that we use for scientific measurements, and an overview of their history. Basically, we use rulers to measure distance, and clocks to measure time, so the question becomes how to create standard rulers and standard clocks, since speed is just distance divided by time.

The story of the standards behind rulers starts with the idea of the meter, initially conceived as a suitable fraction of the Earth's meridian: one ten-millionth of the distance from the equator to the North Pole. It was realized this was not the most precise possible definition, so the standard for distance was instead formalized and realized through the creation of the metric system. The meter was based on a carefully created platinum standard bar, which served as the primary reference, plus a number of working copies. The primary reference bar was carefully preserved, so in practice the copies were used to calibrate other rulers, which were then in turn used for routine measurements, with the primary standard bar jealously guarded. The goal, though, was that all distance measurements should be traceable back to the primary standard, the carefully maintained platinum bar. For more detail, you might start with the Wikipedia article https://en.wikipedia.org/wiki/History_of_the_metre, and/or a more reliable textbook or history book if you can find one.
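
As a rough sanity check of that original definition (the meridian figure below is an approximate modern value, not the 18th-century survey result):

```python
# Distance from the equator to the North Pole along a meridian (approximate modern value)
quarter_meridian_m = 10_002_000            # about 10,002 km

# The original metre: one ten-millionth of that distance
original_metre = quarter_meridian_m / 10_000_000
print(f"one ten-millionth of the quadrant ≈ {original_metre:.4f} m")  # ≈ 1.0002 m
```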

On to the story of time standards, the story of the second. The second was first conceived as a fraction of a day. Because the length of the apparent solar day varies with the season (the Earth's orbit is not circular), the formal definition made the second a fraction of the "mean solar day". As timekeeping became more accurate, it was realized that the Earth's rotation is slowly changing, so a second defined this way was itself changing with time. With the advent of accurate and reliable atomic clocks, better alternatives became available, and the standard for the second was changed to one based on atomic clocks rather than astronomy.
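
To put some numbers on the two definitions (the day arithmetic is just 24 × 60 × 60; the caesium figure is the current SI defining value):

```python
# Old astronomical second: a fixed fraction of the mean solar day
seconds_per_day = 24 * 60 * 60                       # 86,400
print(f"old second = 1/{seconds_per_day} of a mean solar day")

# Current SI second: 9,192,631,770 periods of the caesium-133 hyperfine transition
cs_frequency_hz = 9_192_631_770
print(f"one caesium period ≈ {1 / cs_frequency_hz:.3e} s")   # ~1.088e-10 s
```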

Given the long history of the speed of light being measured as constant, and the difficulties of maintaining the standard platinum bar (just imagine someone dropping it), the definition of the meter also evolved. There was already great confidence in the stability of the speed of light, so the meter was redefined as the distance light travels in a certain amount of time (1/299,792,458 of a second).
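
With that definition the arithmetic is circular by construction: fix c at exactly 299,792,458 m/s, let light travel for 1/299,792,458 of a second, and you get exactly one metre. A tiny sketch:

```python
from fractions import Fraction

c = 299_792_458                  # m/s, exact by definition since 1983
t = Fraction(1, 299_792_458)     # light travel time in the definition of the metre, in seconds

print(c * t)                     # 1 -- exactly one metre, by construction
```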

You can verify this for yourself if you look up the definitions of the SI second and the SI meter, for instance at the National Institute of Standards and Technology website, http://physics.nist.gov/cuu/Units/current.html.

The part that may interest you the most is how these decisions were historically made: who decides on our standards of measurement, and how and where those decisions are made. The organization responsible is called the General Conference on Weights and Measures, abbreviated CGPM after its French name (its historical roots go back to the French Revolution and the creation of the metric system).

A point I'd like to emphasize: with the current operational definitions of time and distance, the speed of light is DEFINED as a constant and no longer needs to be "measured", because the meter is defined as the distance light travels in a certain amount of time rather than by a platinum standard bar. The speed of light used to be measured, back when distance was defined differently than it is now, so talk of "measuring the speed of light" is a bit of an anachronism. I usually approach such questions by assuming that the poster asking them has in mind the "old" notion of distance based on standardized platinum rulers, rather than getting into all the historical details I did in this post.

The lesson I hope comes across is that the scientific community is quite confident that the speed of light is constant: confident enough that the formal committees created to ponder such issues have decided that the most expedient and precise currently known way to measure a distance is to measure the time it takes light to travel it.
 
  • #11
loislane said:
nowadays we know the laws of electricity and magnetism are not exactly Maxwell's; they are a good approximation to a quantum field theory (QED).

This is true, but since QED still predicts that the speed of light is the same in all reference frames, it doesn't change anything with regard to the topic of this thread.
 
  • #12
We have a FAQ that addresses this, at least partially: https://www.physicsforums.com/threads/why-is-the-speed-of-light-the-same-in-all-frames-of-reference.534862/
 

Related to How do we know the speed of light is constant?

1. How was the speed of light first measured?

The speed of light was first measured in 1676 by the Danish astronomer Ole Rømer. He observed eclipses of Jupiter's moon Io and noticed that the interval between eclipses appeared longer when the Earth was moving away from Jupiter and shorter when the Earth was moving towards Jupiter. The effect arises because light needs extra time to cross the changing distance between Jupiter and the Earth, which allowed Rømer to conclude that light travels at a finite speed and to estimate that speed.
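
A rough back-of-the-envelope version of that reasoning (the figures below are illustrative, not Rømer's original data): the accumulated delay corresponds to light crossing the diameter of the Earth's orbit.

```python
# Light crossing the diameter of Earth's orbit accounts for the observed eclipse delay
orbit_diameter_m = 2 * 1.496e11      # ~2 AU in metres
delay_s = 22 * 60                    # ~22 minutes, roughly the delay estimated in Roemer's era

c_estimate = orbit_diameter_m / delay_s
print(f"c ≈ {c_estimate:.2e} m/s")   # ~2.3e8 m/s -- the right order of magnitude
```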

2. How do we know the speed of light is constant?

The speed of light is considered to be constant because it has been measured and confirmed by numerous experiments and observations. These experiments include the Michelson-Morley experiment, which showed that the speed of light is the same in all directions, and the Fizeau experiment, which measured the speed of light in moving water. Additionally, the principles of special relativity, proposed by Albert Einstein, rely on the constant speed of light.

3. How do we measure the speed of light?

Historically, the speed of light was measured with time-of-flight methods: Fizeau's toothed wheel and the rotating mirrors of Foucault and Michelson timed light over a known distance. The most precise later measurements instead determined both the frequency and the wavelength of a stabilized laser and computed the speed as their product, c = f × λ. Other approaches used the resonant properties of microwave cavities to relate the frequency and wavelength of electromagnetic waves.
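
For the frequency-times-wavelength approach, here is a minimal sketch using approximate published figures for a helium-neon laser line (the exact values depend on the particular laser and its stabilization):

```python
# c = f * lambda, using approximate figures for a He-Ne laser line
frequency_hz = 4.7361e14      # ~473.61 THz
wavelength_m = 632.99e-9      # ~632.99 nm

c_measured = frequency_hz * wavelength_m
print(f"c ≈ {c_measured:.4e} m/s")    # ~2.998e8 m/s
```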

4. Has the speed of light always been constant?

The speed of light has been constant throughout human history, as far as we know. However, some theories propose that the speed of light may have been different in the early universe, during the Big Bang. This is still a topic of debate and further research is needed to fully understand the speed of light in the early universe.

5. What would happen if the speed of light were not constant?

If the speed of light were not constant, it would have significant impacts on our understanding of the universe. The principles of special relativity would no longer hold true, and the laws of physics would need to be rewritten. It would also have practical implications, such as affecting our ability to accurately measure distances and times. However, based on current evidence and experiments, it is widely accepted that the speed of light is indeed constant.
