Killtech
I am trying to get a better understanding of the constancy of the speed of light, which is a well-established axiom of current-day physics. For a start I want to understand how it is experimentally established and how these results are interpreted. My difficulty is that this seems far less trivial than for other quantities, because the definitions of the units/metrics the speed is measured in are themselves strongly dependent on that very quantity, which makes for a weird interdependence that creates circles in my mind.
My understanding is that one must at least hypothetically allow the possibility of ##c## not being constant in order to verify or falsify it with an experiment. But doing so implies that the current definition of the SI meter cannot be assumed to be an 'absolute' measure of length, since it directly depends on the speed of light: any change in ##c## directly changes the length of the meter. Of course this definition was chosen after the constancy was well established, so in any case one needs to use a different one instead. But here comes my problem: every other definition also has a strong interdependence with the speed of light, although the type of these dependencies varies and is not clear to me. The bigger problem is that I cannot find the right wording to google an answer for myself.
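For reference, the current SI definition can be written as a formula: the meter is the distance light travels in vacuum in a fixed fraction of a second,
$$1\,\mathrm{m} := c \cdot \frac{1}{299\,792\,458}\,\mathrm{s},$$
so by construction any measurement of ##c## in SI units returns exactly ##299\,792\,458\,\mathrm{m/s}##, and a question about its constancy has to be posed relative to some independent length standard.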
The reason why I don't think this is a triviality is the following: take for example the old metric definition via the pre-SI prototype meter bar. It is a solid and thus, in rough terms, a finite grid of a constant number of atoms whose total size defined the meter. Properties of the grid, like the distances between its vertices, are themselves mainly determined by the size of the atoms it is composed of, which in turn depends on the EM interaction between the electron shell and the nucleus. And since ##c## is the propagation speed of that very force, it is natural to assume that all atomic properties and states will be gravely affected in one way or another, and with them most definitions of the two fundamental SI units (since the definitions of the second are also based on atomic states). Apart from the microscopic effects, anything changing the Bohr radius should also shrink or stretch the entire atomic grid proportionally, and therefore the meter bar as a whole (or actually anything made out of atoms).
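To make that dependence explicit, the Bohr radius can be written in terms of the fine-structure constant (standard textbook relations; which constants one holds fixed while ##c## varies is of course already a modelling choice):
$$a_0 = \frac{4\pi\varepsilon_0\hbar^2}{m_e e^2} = \frac{\hbar}{\alpha\, m_e c}, \qquad \alpha = \frac{e^2}{4\pi\varepsilon_0\hbar c}.$$
If one holds ##\hbar##, ##m_e## and ##\alpha## fixed while ##c## varies, then ##a_0 \propto 1/c##, and the bar shrinks or stretches accordingly.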
My first naïve approach to understanding the impact of an altered ##c## on atoms was that the quantum mechanical solution for hydrogen is easily reapplied to c-modified Maxwell equations and results in the Bohr radius scaling inversely with the speed of light, i.e. a lower ##c## weakens the electric field/energy of photons and thus atom sizes grow. However, this approach brings a lot of other constants into play that complicate things. For example, given the relation of Planck's constant to the photon energy, I can't find a reason that would guarantee it remaining constant in this hypothetical circumstance. One could argue that a photon's energy should decrease with ##c## under the constraint of constant frequency, and if for example it scaled proportionally, then atoms would shrink in just such a way that both definitions of the meter (current SI and prototype bar) would remain equivalent. In a similar way, definitions via wavelengths of specific atomic transitions might be compromised in the same fashion, leading to the possibility of all metric definitions (known to me) remaining equivalent in all circumstances and scaling with ##c##.
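This scaling argument can be pushed a little further. Under the same (admittedly arbitrary) assumption that ##\hbar##, ##\alpha## and the particle masses stay fixed while ##c## varies, atomic transition energies scale like the Rydberg energy, and with them the frequencies and wavelengths of spectral lines:
$$E \sim \tfrac{1}{2}\alpha^2 m_e c^2 \propto c^2, \qquad \nu = \frac{E}{h} \propto c^2, \qquad \lambda = \frac{c}{\nu} \propto \frac{1}{c}.$$
A wavelength-based meter (like the old krypton-86 definition) would then scale as ##1/c##, exactly like the bar meter ##\propto a_0 \propto 1/c##, so the two standards would indeed remain equivalent.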
In that case no direct measurement of the speed of light would ever be able to find a different value, regardless. Thus if this scenario cannot be ruled out, it would require a very different approach to prove or disprove the constancy of ##c##. Or putting it the other way around: if ##c## actually varied locally, how would we experimentally detect it, assuming that matter were affected in the same way as described in the scenario above?
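To spell out the first claim under the same assumptions as above: the caesium second scales as the inverse of an atomic frequency, so the number obtained by any time-of-flight measurement of ##c## is
$$c \cdot \frac{[\mathrm{s}]}{[\mathrm{m}]} \propto c \cdot \frac{c^{-2}}{c^{-1}} = \mathrm{const}, \qquad \text{since } [\mathrm{s}] \propto \frac{1}{\nu_{\mathrm{Cs}}} \propto \frac{1}{c^2}, \quad [\mathrm{m}] \propto a_0 \propto \frac{1}{c},$$
i.e. it never changes, no matter what ##c## does.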