See, I've just started taking this class, and even though I've already read books by Martin Gardner and Brian Greene, I don't understand any of this stuff... well, not *any* of it, but my math seems to be wrong. I have two frames, S and S', synchronized so that t = t' = 0 when x = x' = 0. Event A occurs in frame S at tA = 0.3 microseconds, xA = 150 m. Frame S' moves at a velocity of +0.65c (where c = 3x10^8 m/s, by our convention). I'm not sure what to do, but when I do the full Lorentz transformation calculation, I end up with a negative time for t'A. Am I misinterpreting the question, or the answer? Why?
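Here's the calculation as I did it, written out in Python so you can check my arithmetic (I'm assuming the standard Lorentz transformation t' = gamma*(t - v*x/c^2) is the right formula here):

```python
import math

c = 3e8          # m/s, by our convention
v = 0.65 * c     # speed of S' relative to S
t_A = 0.3e-6     # s, time of event A in frame S
x_A = 150.0      # m, position of event A in frame S

# Lorentz factor
gamma = 1 / math.sqrt(1 - (v / c) ** 2)

# Lorentz transformation of event A into frame S'
t_A_prime = gamma * (t_A - v * x_A / c**2)
x_A_prime = gamma * (x_A - v * t_A)

print(gamma, t_A_prime, x_A_prime)  # t_A_prime comes out negative
```

The term v*x_A/c^2 works out larger than t_A itself, which is why t'A goes negative for me.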