Hi, I'm having a hard time getting my head around the time-dilation part of the theory, and I could do with someone explaining it to me. From my understanding, the faster you travel the slower time passes, right? But also any frame of reference is valid, so speed is itself relative, and the direction of travel has nothing to do with the effect, right? In that case I can't figure out what happens in the following situation.

We have two probes, A and B, each carrying an atomic clock, and the clocks are synchronized. Probe A accelerates away from Probe B to 0.5c (relative to Probe B) and travels for 1 year, then returns, again at 0.5c, arriving exactly 2 years later as measured by its internal clock.

Now here is my confusion, since any frame of reference is valid:

- From Probe A's frame, Probe B is the one moving, so B's clock should run slower.
- From Probe B's frame, Probe A is the one moving, so A's clock should run slower.

Which one is right? If they both started at a time of 0, how many days/seconds would have passed on each, and how can I work it out? This has always baffled me; it seems like a paradox, since both can't be right. Thanks in advance for explaining!
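For what it's worth, here's the calculation I tried (assuming the standard dilation factor gamma = 1 / sqrt(1 - v^2/c^2) is the right one to use), but I don't know which probe's clock the factor should actually be applied to:

```python
import math

def gamma(v_over_c: float) -> float:
    """Standard Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

g = gamma(0.5)                 # ~1.1547 for v = 0.5c
proper_time_A = 2.0            # years elapsed on Probe A's clock (round trip)

# My guess: from B's frame, A's clock runs slow by gamma, so B's own
# clock would show more elapsed time than A's -- about 2.31 years?
elapsed_B = g * proper_time_A

print(f"gamma = {g:.4f}, Probe B elapsed = {elapsed_B:.4f} years")
```

But by the same formula applied from A's frame, shouldn't B's clock show *less* time instead? That's exactly the symmetry I can't resolve.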