I'm coming at this from the perspective of something that might be used in a general discussion piece on SR, or perhaps the introductory part of a (low-level) course.
One concept that's often difficult for newcomers to grasp is the lack of a 'God Frame'. It comes up in several ways: time dilation, simultaneity, the twin paradox, and so on.
I'm wondering if there might be a way to illustrate this with a variation on observations of muon decays.
For example, suppose there is some absolute frame, in the sense that 'time dilation' varies according to your speed with respect to this frame (it's not a requirement that this idea be consistent, or even make much sense when examined too closely; it's just a 'common sense' foil).
Suppose further that this frame is not 'fixed' to the Earth.
A consequence of such a frame would be that the observed half-life of muons would vary, with a ~24-hour period (as well as, perhaps, ~1-month and ~1-year periods). Or it might vary depending on whether the muons were moving transversely to the (local, instantaneous) direction in which the test equipment was travelling, wrt the absolute frame (again, it doesn't really matter how any such variation might arise, simply that it would, and in a consistent way).
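To make the foil concrete, here's a minimal sketch of what I mean (entirely my own illustration: the 10 ppm amplitude and the pure 24-hour sinusoid are assumed for the sake of the example, not taken from any experiment; the gamma value is the one from the CERN muon storage ring):

```python
# Sketch of the 'absolute frame' foil vs. standard SR.
# Standard SR: the lab-frame muon lifetime is gamma * tau_0, with no
# dependence on sidereal time. The hypothetical 'God frame' would add a
# small periodic modulation as the lab rotates through it; a 10 ppm
# amplitude with a ~24 h period is assumed here purely for illustration.
import math

TAU_0 = 2.197e-6   # muon proper lifetime, seconds
GAMMA = 29.3       # e.g. the CERN muon storage ring ran near gamma ~ 29.3

def sr_lifetime():
    """Standard SR: dilated lifetime, independent of when you measure."""
    return GAMMA * TAU_0

def absolute_frame_lifetime(t_hours, amp_ppm=10.0, period_hours=24.0):
    """Hypothetical foil: lifetime modulated with the assumed period."""
    modulation = 1.0 + amp_ppm * 1e-6 * math.sin(2 * math.pi * t_hours / period_hours)
    return GAMMA * TAU_0 * modulation

# The fractional deviation from the SR value stays within the assumed 10 ppm:
diffs = [abs(absolute_frame_lifetime(t) / sr_lifetime() - 1.0) for t in range(25)]
print(f"max fractional deviation over a day: {max(diffs):.1e}")  # ~1e-05, i.e. 10 ppm
```

The point of the sketch is just that the question "to what ppm has such a modulation been excluded?" amounts to putting an upper bound on `amp_ppm`.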
Does anyone know where such 'muon half-life' experimental results have been written up? I'm curious about the size of any 'time dilation absolute frame' effect that has been ruled out by such experiments - again, not from the POV of any such idea, just expressed as "no such variation, to x ppm, was detected" (or similar).
One 'experiment' that I thought might be good to use for this purpose is the Large Electron-Positron collider (http://en.wikipedia.org/wiki/Large_Electron_Positron), which ran in the 1990s - its beam-energy data showed some odd correlations, which were eventually tracked down to Earth tides, the water level of Lake Geneva, and stray currents from the operation of the TGV! Trouble is, I'm not quite sure I can draw a line between the sensitivity of the experiment and the detection of any 'time dilation absolute frame' effect. (Pity - it's a nice story, and so would likely be easily remembered by students.)
Maybe it would work better if I recast it as a 'relativistic mass absolute frame' effect?
Another angle: relativistic electrons are to be found in all sorts of modern instruments and devices (aside: do ballistic electrons ever get relativistic in solid-state devices?). Any 'time dilation absolute frame' effect would show up as secular variations in the performance of these devices. Are there any precision tools or instruments, such as you might find in a commercial setting, which would be sensitive enough to detect this sort of effect, say to 1 ppm? Again, I'm looking for examples 'from everyday life', rather than 'in a research lab'.
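On the relativistic-electrons point: the everyday cases are vacuum-accelerated electrons (CRTs, electron microscopes), and a quick back-of-envelope check using the standard kinematic relation gamma = 1 + eV / (m_e c^2), with m_e c^2 ≈ 511 keV, shows how relativistic they get. The voltages below are just typical round numbers, not any particular device's spec:

```python
# How relativistic are electrons at common accelerating voltages?
# Kinetic energy of an electron through potential V is eV, so
# gamma = 1 + eV / (m_e c^2), with the electron rest energy ~511 keV.
M_E_C2_KEV = 511.0  # electron rest energy in keV

def gamma_from_kv(kilovolts):
    """Lorentz factor for an electron accelerated through the given voltage."""
    return 1.0 + kilovolts / M_E_C2_KEV

def beta(gamma):
    """Speed as a fraction of c, from the Lorentz factor."""
    return (1.0 - 1.0 / gamma**2) ** 0.5

for kv in (25.0, 100.0, 300.0):  # roughly: CRT, SEM-class, TEM-class voltages
    g = gamma_from_kv(kv)
    print(f"{kv:5.0f} kV: gamma = {g:.4f}, v/c = {beta(g):.3f}")
```

Even an old 25 kV CRT gives v/c ≈ 0.3, so 'relativistic electrons in everyday devices' is not far-fetched. By contrast, ballistic electrons in solids move at Fermi velocities of order 10^6 m/s, i.e. v/c ~ 0.003, so they aren't relativistic in this kinematic sense.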