Tapsnap said:
How do we know the speed of light is constant?
Is it possible that we only perceive it as constant?
Being scientists and not philosophers, we don't worry so much about issues of perception, but we obviously do need standards that define how we measure distance and time; these serve as our operational definitions.
If your primary interest is philosophy, you'll probably need to find another forum, because we no longer discuss philosophy on PF. The reasons for this were varied; the ultimate decision was made by the PF owner, though he was certainly guided by comments from the PF staff, other PF posters, and experience. Among the issues noted with philosophical discussions on PF were endless arguments and a lack of academic rigor.
But we can talk about standards and operational meanings / definitions that we use for scientific measurements, and an overview of their history. Basically, we use rulers to measure distance, and clocks to measure time, so the question becomes how to create standard rulers and standard clocks, since speed is just distance divided by time.
The story of the standards behind rulers starts with the idea of the meter, initially conceived as one ten-millionth of the distance from the equator to the North Pole along a meridian. It was realized that this was not the most precise possible definition, so the standard for distance was instead formalized and realized through the creation of the metric system. The meter was based on a carefully made platinum standard bar, which served as the primary reference, plus a number of working copies. The primary reference bar was carefully preserved and jealously guarded, so in practice the copies were used to calibrate other rulers, which were in turn used for routine measurements. The goal, though, was that all distance measurements should be traceable back to the primary standard, the carefully maintained platinum bar. For more detail, you might start with wiki,
https://en.wikipedia.org/wiki/History_of_the_metre, and/or a more reliable textbook or history book if you can find one.
On to the story of time standards, the story of the second. The second was first conceived as a fraction of a day. Since the length of the apparent solar day varies through the year (both because the Earth's orbit is not circular and because its axis is tilted), the formal definition used a fraction of the "mean solar day" instead: 1/86,400 of it. As timekeeping became more accurate, it was realized that the Earth's rotation is slowly changing, so a second defined this way was itself changing with time. With the advent of accurate and reliable atomic clocks, better alternatives became available, and the standard for the second was changed to one based on atomic clocks rather than astronomy.
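To make the arithmetic behind the old and new definitions of the second concrete, here is a small sketch in Python. The caesium frequency is the value fixed in the current SI definition; the variable names are mine, not from any standard:

```python
from fractions import Fraction

# Old astronomical definition: the second as 1/86,400 of the mean
# solar day (24 h x 60 min x 60 s).
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)  # 86400

# Current SI definition: one second is 9,192,631,770 periods of the
# radiation from the caesium-133 hyperfine transition; this frequency
# is exact by definition, not a measured quantity.
CS133_TRANSITION_HZ = 9_192_631_770
cycle_period = Fraction(1, CS133_TRANSITION_HZ)  # seconds per cycle
print(CS133_TRANSITION_HZ * cycle_period)        # exactly 1 (second)
```

Using exact rational arithmetic rather than floats emphasizes the point: the modern second is a defined count of cycles, so the result is exactly 1 by construction.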
Given the long history of the speed of light being measured as constant, and the difficulties of maintaining the standard platinum bar (just imagine someone dropping it), the definition of the meter also evolved. There was already great confidence in the stability of the speed of light, so the meter was redefined as the distance light travels in a certain amount of time (1/299,792,458 of a second).
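To see why this definition makes the speed of light exact rather than measured, here is a small sketch in Python using exact rational arithmetic (the variable names are mine, not from any standard):

```python
from fractions import Fraction

C_EXACT = 299_792_458  # speed of light in m/s, exact by definition

# The meter is the distance light travels in 1/299,792,458 of a second,
# so the travel time across one meter is fixed by definition:
travel_time = Fraction(1, C_EXACT)  # seconds
one_meter = C_EXACT * travel_time   # exactly 1 (meter)

# "Measuring" c in SI units therefore just returns the defined value:
measured_c = one_meter / travel_time
print(measured_c == C_EXACT)  # True
```

The circularity is the point: once the meter is defined in terms of light travel time, any measurement of c in SI units must come out to 299,792,458 m/s.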
You can verify this for yourself by looking up the definitions of the SI second and the SI meter, for instance at the National Institute of Standards and Technology (NIST) website,
http://physics.nist.gov/cuu/Units/current.html.
The part that may interest you most is how these decisions were historically made: who makes the decisions about our standards of measurement, how, and where. The organization responsible is called the "General Conference on Weights and Measures", with the initials CGPM (from the French Conférence générale des poids et mesures; the historical roots go back to the French Revolution and the creation of the metric system).
A point I'd like to emphasize: with the current operational definitions of time and distance, the speed of light is DEFINED as a constant and no longer needs to be "measured". Because the meter is defined as the distance light travels in a certain amount of time, it is no longer defined by a platinum standard bar. The speed of light used to be measured, back when distance was defined differently than it is now, so talk of "measuring the speed of light" is a bit of an anachronism. I usually approach such questions by assuming that the poster asking them is using the "old" notion of distance based on standardized platinum rulers, rather than getting into all the historical details I did in this post.
The lesson I hope comes across is that the scientific community is quite confident that the speed of light is constant: through the formal committees created for pondering exactly such issues, it has decided that the most expedient and precise currently known way to measure distances is to measure the time it takes light to travel.