pervect said:
In order to even measure the speed of light, you need to take along a physical measuring bar, like the one in Paris. The [current] SI definition of the meter defines the speed of light to be a constant, in fact, so you have to use the old, outdated, previous definition of the meter as a replica of the standard meter bar to be able to measure the speed of light at all.
yup. we would have to revert back to the pre-1960 definition of the meter to even meaningfully measure c to be anything other than 299792458 m/s, and only then would this physical hypothesis, that c is the same for everyone, be falsifiable. so then what would it mean if someone measured it at 299792457 m/s? assuming we don't attribute it to some atoms getting shaved off the meter stick (between the scratch marks) or some other experimental error, what would it mean if, using the pre-1960 meter, people measured c to be something else? in my opinion, the salient meaning is that one or both of two dimensionless parameters had changed: either the number of Planck lengths in the meter (as defined) changed (and if the meter stick is a "good" meter stick and lost or gained no Pt or Ir atoms, then the number of Planck lengths in the size of a Pt or Ir atom, or maybe the number of Planck lengths per Bohr radius, changed), or the number of Planck times per second changed. those would be the meaningful changed parameters.
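just to put a number on that hypothetical (this is only my illustration of the point, not anything measured): 299792457 m/s instead of 299792458 m/s is a fractional change of

(299792458 - 299792457) / 299792458 \approx 3.3 \times 10^{-9}

and since c expressed in m/s is just the ratio (number of Planck times per second) / (number of Planck lengths per meter), that ratio would have had to drift by about 3 parts per billion between the two measurements. that is the only dimensionless content in the statement "c changed by 1 m/s".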
i'll go out on a little limb here and speculate: in my opinion, there is no physics as to why c = 299792458 m/s. it is not a physically meaningful question to ask "why is c = 299792458 m/s?" the meaningful questions to ask are "why are there 6.187154 x 10^34 Planck lengths in a meter?" and "why are there 1.854861 x 10^43 Planck times in a second?", and the answers would be sort of historical or anthropocentric ones. we chose the meter to be a length about as big as we are, and we chose the second to be a duration about as short as the time between different events we commonly experience. for the meter, the question breaks down to why there are about 10^25 Planck lengths in the size of an atom (a question for physicists), why there are about 10^5 atoms in the length of a biological cell (a question for microbiologists), and why there are about 10^5 cells in the length of a sentient being like us (a question for some other biologists). if we answer those questions, we have some idea of why the meter is as big (relative to the Planck length) as it is. doing the same for the second, then from those two answers (and from the defined fact that c is always 1 Planck length per Planck time), we have the answer to why c is about 3 x 10^8 m/s, which is really just an anthropocentric concern that Nature doesn't give a rat's ass about.
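for anyone wanting to see where those two big numbers come from, here's the arithmetic (using the CODATA values of \hbar and G; the trailing digits wobble a bit as G gets re-measured):

\ell_P = \sqrt{\hbar G / c^3} \approx 1.616 x 10^-35 m, so there are about 6.187 x 10^34 Planck lengths in a meter
t_P = \ell_P / c = \sqrt{\hbar G / c^5} \approx 5.391 x 10^-44 s, so there are about 1.855 x 10^43 Planck times in a second

and because t_P is defined as \ell_P / c, the ratio of those two counts is exactly 299792458: c comes out to 1 Planck length per Planck time by construction, not by measurement.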
i know that this is more philosophical than physics, but i think the core physical reality from Nature is this: all of these fundamental interactions (whether I'm waving some electrically charged object around that is perturbing your charged object, or waving some massive object around that is perturbing your massive object, or some other interaction) exist and have effect even across a vacuum (a true vacuum that has nothing in it as a medium to mediate any force), and reality conceivably has a choice of whether the speed of propagation of these interactions (across space, from one location to another) is finite or infinite. the physics is that this speed is finite, not infinite. it doesn't matter what the finite speed is. it is not meaningful that there be different speeds, from the POV of Nature, who doesn't give a rat's ass what units we or the aliens on the planet Zog choose to use. whatever that finite speed is, it defines a constraint on the scaling of length and time, so that all other speeds are meaningfully measured against this finite speed of propagation of these ostensibly "instantaneous" actions. logically, it makes no sense for c to vary. and it's not just light (E&M); it's the finite speed of any fundamental interaction.
as long as c is finite, it simply is what it is: a primary quantity to measure (or simply perceive) everything else against. same thing for G and \hbar and 4 \pi \epsilon_0. Nature has not set those to any particular quantity except "finite", and then from those finite and singular values (that we may as well call unity) everything else in the universe is built and perceived. (again, i personally think it's more natural to normalize 4 \pi G and \epsilon_0, rather than what Planck units and electrostatic CGS units do, so that Gauss's law would have no funky scaling coefficient in it, the quantities of "flux density" and "field strength" would be the same, and maybe we could equate the concepts.)
anyway pervect, i know PF isn't the forum to spout personal theories, but I'm taking a stab at it. see what comes out.