1. The problem statement, all variables and given/known data

In a frame of reference A, lights are on the x axis at x = D and x = -D, where D = 0.6 x 10^9 m. They flash simultaneously at t = 0. There is also a frame of reference A' moving at v = 0.8c along the x axis.

i) Where and when do the flashes happen in A'?
ii) Hence, when would observers at the origins of A and A' see the light?

2. Relevant equations

x' = γ(x - vt)
t' = γ(t - vx/c^2)
γ = 1/√(1 - v^2/c^2)

3. The attempt at a solution

i) Well, I have to add D to the location in x to find the location in x'. Then I have to add something else, but I'm pretty confused about what to add. As far as an observer in A' is concerned, I think the light has to travel x' = D + vt'? But I can't use that because I don't know t' or x', so that's probably wrong...

In A, both flashes reach the origin at t = 0.6 x 10^9 / (3 x 10^8) = 2 s. So maybe I have t' as well: t' = γt, where γ is the Lorentz factor, which would give t' = 10/3 s.

Essentially, my question for the first part is whether x' = D + vt' is right. And then for the second part: haven't I already worked out when the observer in A sees the light? It's at t = 2 s. And for the observer at the origin of A', is it at t' = 10/3 s?
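In case it helps to sanity-check the numbers, here is a quick Python sketch applying the full Lorentz transformation x' = γ(x - vt), t' = γ(t - vx/c^2) to the two flash events. Note that the shortcut t' = γt only holds for events at x = 0, so the flashes at x = ±D pick up different t' values:

```python
import math

# Numbers taken from the problem statement (SI units assumed):
c = 3.0e8      # speed of light, m/s
v = 0.8 * c    # speed of A' relative to A
D = 0.6e9      # flashes occur at x = +D and x = -D, metres

# Lorentz factor: gamma = 1/sqrt(1 - v^2/c^2) = 5/3 for v = 0.8c
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

def lorentz(x, t):
    """Transform an event (x, t) in frame A to (x', t') in frame A'."""
    xp = gamma * (x - v * t)
    tp = gamma * (t - v * x / c ** 2)
    return xp, tp

# Both flash events happen at t = 0 in A:
for x in (+D, -D):
    xp, tp = lorentz(x, 0.0)
    print(f"x = {x:+.1e} m  ->  x' = {xp:+.3e} m, t' = {tp:+.3f} s")
```

With these numbers it gives γ = 5/3, x' = ±10^9 m and t' = ∓8/3 s, i.e. the flashes are not simultaneous in A' even though they are simultaneous in A.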