# Doubts in special theory of relativity

You keep saying this. It is not true. (And the reference you claimed supported this said nothing of the sort, which is rather annoying).

The speed of light is assumed (actually, defined) to be the same in all directions, not measured to be the same. Indeed, in a world with length contraction and time dilation, it is impossible to measure the one-way speed of light.
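
To make that concrete, here is a minimal sketch (standard textbook material, with my own notation, not anything specific to this thread): write the assumed one-way speeds in terms of a synchronization parameter $\varepsilon$ with $0 < \varepsilon < 1$,

$$c_{+} = \frac{c}{2\varepsilon}, \qquad c_{-} = \frac{c}{2(1-\varepsilon)}.$$

A single clock and a mirror at distance $L$ can only measure the round-trip time,

$$t = \frac{L}{c_{+}} + \frac{L}{c_{-}} = \frac{2\varepsilon L}{c} + \frac{2(1-\varepsilon)L}{c} = \frac{2L}{c},$$

which is independent of $\varepsilon$. Experiment therefore fixes only the two-way speed $2L/t = c$; splitting it into two one-way speeds requires a convention for synchronizing distant clocks, and Einstein's choice $\varepsilon = 1/2$ is precisely the convention that makes both one-way speeds equal to $c$.
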
With all due respect, light measures c in the stationary frame, so length contraction and time dilation make absolutely no difference to this calculation.

GPS shows that light measures c, or it would not work. That is why the link is valid.
But,

Allan et al., IEEE Trans. Inst. and Meas., IM-32 no. 2 (1985), pg 118.

They discuss in detail how time and frequency comparisons among the various standards organizations of the world can be performed with an accuracy of about 1 part in 10^14, using GPS satellites.
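
For a sense of scale (my arithmetic, not a figure quoted from the paper): a fractional frequency accuracy of $10^{-14}$ corresponds to an accumulated timing error of roughly

$$\Delta t \approx 10^{-14} \times 86\,400\ \mathrm{s} \approx 0.9\ \mathrm{ns\ per\ day},$$

or about $c\,\Delta t \approx 0.26$ m of light travel distance per day.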

http://math.ucr.edu/home/baez/physics/Relativity/SR/experiments.html#GPS

Obviously, as with the MMX (Michelson-Morley experiment), frequency deviations would indicate a measured speed of light that is not c.

pervect
Staff Emeritus
While the speed of light _is_ currently defined to be equal to "c", it wasn't ALWAYS this way.

I'll trim the history a bit, since we have people who are so impatient on this thread that they can't even write out "too long" longhand....

http://physics.nist.gov/cuu/Units/meter.html

> In 1889, a new international prototype was made of an alloy of platinum with 10 percent iridium, to within 0.0001, that was to be measured at the melting point of ice. In 1927, the meter was more precisely defined as the distance, at 0°, between the axes of the two central lines marked on the bar of platinum-iridium kept at the BIPM, and declared Prototype of the meter by the 1st CGPM, this bar being subject to standard atmospheric pressure and supported on two cylinders of at least one centimeter diameter, symmetrically placed in the same horizontal plane at a distance of 571 mm from each other.
>
> The 1889 definition of the meter, based upon the artifact international prototype of platinum-iridium, was replaced by the CGPM in 1960 using a definition based upon a wavelength of krypton-86 radiation. This definition was adopted in order to reduce the uncertainty with which the meter may be realized. In turn, to further reduce the uncertainty, in 1983 the CGPM replaced this latter definition by the following definition:
>
> The meter is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second.
>
> Note that the effect of this definition is to fix the speed of light in vacuum at exactly 299 792 458 m·s⁻¹. The original international prototype of the meter, which was sanctioned by the 1st CGPM in 1889, is still kept at the BIPM under the conditions specified in 1889.
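
Just to spell out the arithmetic behind that "exactly": the definition says light travels 1 m in $1/299\,792\,458$ s, so

$$c = \frac{1\ \mathrm{m}}{\tfrac{1}{299\,792\,458}\ \mathrm{s}} = 299\,792\,458\ \mathrm{m/s}$$

with no measurement uncertainty attached; any improvement in technique now sharpens the realization of the meter rather than the measured value of $c$.
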
Historically, people have measured the speed of light. It would be confusing to argue that they didn't do this - history records that they did. But when they did, they were using a different definition of the meter, one based on a "meter prototype", of which there were a couple of different variants; the particular variant used can be inferred from the time at which the measurement was made and a detailed reading of the experimental protocol. Well-run experiments are expected to have calibrations traceable back to the bureau of standards.

On one hand, the fact that we base our modern definition of the meter on the speed of light inspires, I hope, some confidence in the average reader that it is, in fact, constant.

On the other hand, I think that because people have historically measured the speed of light, it is a sensible concept to talk about. When I hear someone talk about "measuring the speed of light", I simply assume that a non-modern definition of the meter is being used, one based on a prototype meter.

It's not really all that important whether one is using the 1889 prototype or the 1927 prototype, for most purposes, I think.

ghwellsjr