I know that you are all sick of these threads, but.... Suppose you have objects A and B close to each other, A-B. Then B accelerates to some high velocity, say 3/4 the speed of light, in a matter of a few seconds, so extremely hard acceleration. At this speed it cruises along at 3/4 c (relative to object A) for a long trip, until it is 3/4 of a light year away from object A. Then object B turns around with an equally quick acceleration, cruises back at high speed, and stops when it reaches object A.

If clocks were placed on both objects, why would object B's clock show that less time has gone by than object A's? I understand why time changes, but why does A seem to have gone through more time, if the ONLY difference between the two objects is that object B accelerated a few times? Other than that, B could just as well have thought it was object A that was moving away from it.

That leads to my next question: if something is traveling at a constant velocity, does the time difference keep growing as the object covers a greater distance? I'm asking my first question because while they both have constant velocities, not accelerating whatsoever, both feel stationary.
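Just to put numbers on my setup, here's a quick Python sketch using the standard time-dilation formula γ = 1/√(1 − v²/c²). This is only an illustration with my example figures (0.75 c cruise speed, 0.75 light-year one-way distance), and it assumes the acceleration phases are brief enough to ignore:

```python
import math

def gamma(v_frac_c):
    """Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_frac_c**2)

v = 0.75  # cruise speed as a fraction of c
d = 0.75  # one-way distance in light-years

# Round-trip time as measured by A's clock (in years).
t_A = 2 * d / v

# Time elapsed on B's clock, ignoring the brief acceleration phases.
t_B = t_A / gamma(v)

print(f"A's clock: {t_A:.3f} yr")
print(f"B's clock: {t_B:.3f} yr")
print(f"gamma: {gamma(v):.4f}")
```

So with these numbers A ages about 2 years while B ages about 1.32 years, which is the asymmetry I'm asking about.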