This is #94 from the 2008 Physics GRE:

"An observer O at rest midway between two sources of light at x = 0 and x = 10 m observes the two sources to flash simultaneously. According to a second observer O′, moving at a constant speed parallel to the x-axis, one source of light flashes 13 ns before the other. Which of the following gives the speed of O′ relative to O?" (Answer: 0.36c)

Using a Lorentz transformation, I find that the event at x = 10 m happens earlier in the O′ frame (taking O′ to move in the +x direction) by Δt′ = γv(10 m)/c². However, I also find that the actual amount of time between when O′ *receives* the two signals depends on where O′ is located, within the spatial interval between the two sources, at the moment the signals are emitted. Can anyone corroborate or refute this claim?

I recover the correct answer when I assume O′ is halfway between the two sources when the signals are emitted in O. The question's construction seems to imply that the answer is independent of where O′ is. Appreciate the help!
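For reference, here is the algebra behind the coordinate-time reading, writing β = v/c (this is my own working, so corrections are welcome):

$$
\Delta t' = \gamma\!\left(\Delta t - \frac{v\,\Delta x}{c^{2}}\right)
\;\xrightarrow{\;\Delta t \,=\, 0\;}\;
|\Delta t'| = \frac{\gamma v\,(10\ \text{m})}{c^{2}} = 13\ \text{ns}
$$

$$
\gamma\beta = \frac{c\,(13\ \text{ns})}{10\ \text{m}} \approx 0.39,
\qquad
\beta = \frac{\gamma\beta}{\sqrt{1 + (\gamma\beta)^{2}}} \approx 0.36
$$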
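To make the reception claim concrete, here is a quick numerical check I put together (Python; my own sketch, not part of the exam problem, and the function name reception_gap is just mine). It computes the proper-time gap between the two receptions for a few starting positions x0 of O′ at the emission moment t = 0 in O, and compares them to the coordinate-time gap from the Lorentz transformation:

import numpy as np

c = 299792458.0          # speed of light, m/s
beta = 0.36              # rounded answer choice (exact value ~0.363)
v = beta * c
gamma = 1.0 / np.sqrt(1.0 - beta**2)

# Coordinate-time difference between the two emission events in O',
# from the Lorentz transformation with dt = 0, dx = 10 m:
dt_coord = gamma * v * 10.0 / c**2
print(f"coordinate-time gap in O': {dt_coord * 1e9:.2f} ns")

def reception_gap(x0):
    """Proper-time gap between O' receiving the two flashes,
    if O' is at x0 (O frame) when both sources flash at t = 0."""
    # Light from x = 0 chases O':       c*t = x0 + v*t
    t1 = x0 / (c - v)
    # Light from x = 10 m meets O' head-on:  10 - c*t = x0 + v*t
    t2 = (10.0 - x0) / (c + v)
    # Convert the O-frame gap to O' proper time (time dilation)
    return abs(t1 - t2) / gamma

for x0 in [2.0, 5.0, 8.0]:
    print(f"x0 = {x0} m: reception gap = {reception_gap(x0) * 1e9:.2f} ns")

As far as I can tell, the reception gap varies with x0 and matches the coordinate-time gap only at x0 = 5 m, the midpoint, which is consistent with what I described above.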