- #1

Chenkel

I've been thinking about the 2nd postulate of relativity; the Michelson-Morley experiment seems to imply that there is no ether, but I was thinking about a special situation that doesn't seem to go against that postulate.

I think my question is basic so hopefully it makes sense and I don't make any major blunders.

Suppose there is a light source at a distance D from an eye, the light turns on at the start of the problem, and the eye moves toward the light source with a relative velocity v. How long does it take the light to reach the eye?

I was thinking about it, and the gap seems to be closing at a rate of ##c + v##, so I would expect the light to reach the eye after a time ##\frac{D}{c + v}##.

If instead the eye is moving away from the light source when it turns on at a distance D, I would imagine the gap closing at a rate of ##c - v##, so the time to cover the distance D should be ##\frac{D}{c - v}##.
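To make the arithmetic concrete, here is a minimal sketch of the two cases above. The numbers for D and v are illustrative assumptions, not from the question; the closing-speed rates c + v and c - v are as measured in the source's rest frame (the light itself still travels at c in that frame).

```python
# Sketch of the closing-speed arithmetic from the question.
# D and v below are assumed, illustrative values.
c = 299_792_458.0   # speed of light in m/s
v = 0.5 * c         # assumed relative speed of the eye
D = 1.0e9           # assumed initial separation in meters

# Eye moving toward the source: gap closes at c + v
t_towards = D / (c + v)

# Eye moving away from the source: gap closes at c - v
t_away = D / (c - v)

print(f"toward: {t_towards:.4f} s, away: {t_away:.4f} s")
```

As expected, t_towards is shorter than D/c (the time with a stationary eye), and t_away is longer.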

Hopefully I'm not making any major mistakes in my analysis, let me know what you think.

Thanks.