NOTE: This is NOT a homework problem. I created this one myself based on some problems I have seen, with specific numbers chosen to make the calculation clean and easy.

Tl;dr version: in a round trip to a star, will the "moving clock" appear to run slower on the way there but faster on the way back, due to the finite time required for a light signal to pass between observers? Please read the example, and if the conclusion is faulty, please point out where and why. Thanks in advance for reading through this.

Okay, I was trying to work through the details of a twin paradox example, and I ran into strange stuff that I haven't seen in a physics class. I was hoping for some clarification, whether it's because I'm doing these calculations wrong, or because of some misunderstanding on my part, or, if I'm right, some verification and exposition. So, long story short, I've made up an example and done the calculations at various points. Here we go. Most problems use Bob and Alice as the protagonists, so this one will as well.

--------

Bob and Alice start on Earth. Bob leaves for a star that is 6 light years away, then returns. He travels at 0.6c the entire trip (numbers chosen because the calculations come out very neat). γ is 1.25 in this case. With γ = 1.25, the end result is that Alice sees 20 years pass and Bob sees 16 years pass, but it's the way we get there that seems strange to me.

*First I should point out that the acceleration shouldn't be an issue: v = at if the initial velocity is zero, so a = v/t. You can get to 0.6c in only 200 days with an acceleration approximately equal to Earth's surface gravity: 10.4 m/s² = 0.6c/(200 days). Half that time if you accelerate at 2 g. So I'll neglect the acceleration, because it will only shift things by a year or two (the 200 days it takes to reach 0.6c at each end of each leg), and the end results will still be counterintuitive. And if we accelerate at 1 g, there isn't going to be any significant time dilation effect from the acceleration itself, as far as I can tell.
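(A quick sanity check of that 200-day figure in Python. This is the same simple non-relativistic estimate a = v/t used above; the variable names are my own, and a fully relativistic treatment with constant proper acceleration would give slightly different numbers, but this is just order-of-magnitude.)

```python
# Non-relativistic estimate: constant acceleration needed to reach 0.6c
# from rest in 200 days, via a = v/t.
c = 3.0e8            # speed of light, m/s
v = 0.6 * c          # target speed, m/s
t = 200 * 86400      # 200 days, in seconds

a = v / t            # required acceleration
print(a)             # ~10.4 m/s^2, i.e. roughly 1 g
```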
Using a formula from hyperphysics.com for the time dilation due to acceleration (plugging in one g and a radius of R ~ 10^6), the factor comes out to about 0.999999, i.e. basically a factor of 1.

So, neglecting the acceleration, Bob leaves Earth for the star. In Bob's frame, the Earth-star distance is contracted by the Lorentz factor: 6 cy/1.25 = 4.8 cy. That means the time the trip takes Bob, by his own clock, is t = d/v = 4.8 cy/0.6c = 8 years. Alice, on the other hand, measures a one-way time of ten years (the distance isn't length contracted in her frame, so the time is 6 cy/0.6c = 10 years).

HOWEVER, Alice doesn't see Bob arrive when her clock reads 10 years, because it takes the light signal six more years to reach her from the star. So by the time Alice sees Bob reach the star, 16 years have passed for her. This has to mean that, as Alice watches him, Bob's clock appears to run at half the speed of hers (Bob's clock advances 8 years while hers advances 16). So far this isn't anything unexpected.

Bob, on the other hand (who is justified in claiming to be at rest), has 8 years on his clock when he arrives, but when he looks back, the light he sees from Earth is six years old (because, again, the distance from Earth to the star is 6 light years). So he sees Alice's clock reading only 4 years (10 years for the trip in Alice's frame, minus the six years for the light to reach him, = 4 years). This also isn't unexpected, as both observers should be able to claim to see the other's clock moving slower, since both can claim the other twin is the one "really" moving.
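The outbound-leg bookkeeping above can be checked numerically. Here is a minimal Python sketch (variable names are my own, just restating the arithmetic from the example):

```python
import math

beta = 0.6                             # v/c
gamma = 1 / math.sqrt(1 - beta**2)     # Lorentz factor, ~1.25
d = 6.0                                # Earth-star distance in light years (Alice's frame)

t_alice = d / beta                     # one-way time on Alice's clock: 10 years
t_bob = (d / gamma) / beta             # one-way time on Bob's clock: 8 years (contracted distance)

alice_sees_arrival = t_alice + d       # 10 + 6 = 16 years: when Alice SEES Bob arrive
bob_sees_on_earth_clock = t_alice - d  # 10 - 6 = 4 years: Alice's clock as Bob SEES it at the star

print(gamma, t_alice, t_bob, alice_sees_arrival, bob_sees_on_earth_clock)
```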
So Bob sees only 4 years pass for Alice, which is half the time that passed for him. That's reasonable, because he views himself as at rest and her as moving, so he should see her clock run slowly.

This is where it gets weird to me, and, assuming the above is legitimate, here is where I'm confused.

On the return trip, according to Alice: remember that 16 years have already passed on her clock when she sees Bob reach the star (10 years for him to get there, plus six for the light from the star to reach her). But the trip takes 10 years out and 10 years back by her clock (6 light years each way at 0.6c), so that leaves only 4 more years for Bob to make it back, assuming he left as soon as he reached the star (again, neglecting the time it takes to accelerate). That means Alice MUST see Bob's clock advance 8 years in only 4 years of her own time (the return leg takes 8 years on Bob's clock, just like the outbound leg). But this means Alice is suddenly seeing Bob's clock run twice as fast as hers! Nothing in my textbooks says anything about moving clocks running fast.

Likewise, the same issue happens with Bob. When Bob reached the star, he had seen Alice's clock read only 4 years. Bob has already been in space for 8 years, so he knows it's going to take him 8 more years to reach home, for a total of 16 years on his clock. But we know that from Earth's perspective the trip must take 20 years. That means Bob must see Alice's clock advance 16 years (20 years minus the 4 he already saw on her clock) in the time it takes his own clock to advance 8 on the way home. But this means Bob must see Alice's clock run twice as fast as his! Again, I've never heard of moving clocks running fast.

By the time Bob gets back to Earth, both Alice and Bob agree that Alice saw 20 years pass and Bob saw 16 years pass, but the WAY in which this happened seems not to be in accordance with what I've seen in explanations of the twin paradox.
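Here is the return-leg arithmetic as a short Python sketch (again my own variable names, just putting numbers on the apparent clock rates described above):

```python
# Return leg, as each twin SEES it (light-travel delay included).

# Alice: she sees Bob turn around at year 16 and return at year 20,
# so she watches 8 years of Bob's clock tick by in 4 of her years.
alice_watch_time = 20 - 16                            # 4 years on Alice's clock
bob_elapsed = 8                                       # years on Bob's clock, return leg
rate_seen_by_alice = bob_elapsed / alice_watch_time   # 2.0: twice as fast

# Bob: at turnaround he has seen only 4 years on Alice's clock, but it must
# read 20 when he lands, so he watches 16 of her years go by in his 8.
bob_watch_time = 8                                    # years on Bob's clock, return leg
alice_elapsed = 20 - 4                                # 16 years of Alice's clock
rate_seen_by_bob = alice_elapsed / bob_watch_time     # 2.0: twice as fast

print(rate_seen_by_alice, rate_seen_by_bob)
```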
I've always heard the phrase "moving clocks run slowly," but clearly in this case the return journey has to show the "moving" clock running fast. Any insight into this?

One reason I think this could be right: it seems analogous to redshift and blueshift. If a light source is moving away from you, its waves are shifted toward the red end of the spectrum, while if the source is moving toward you, they are shifted toward the blue end. If you used the crests and troughs of the waves passing a given point in space as a clock, then a "redshift clock" would tick fewer times, relative to a reference clock, than a "blueshift clock" would, if I understand my own made-up example properly. Which means that, just like Bob and Alice above, "time" as measured by a blueshift clock runs faster than time as measured by a redshift clock.

Am I totally off base here? If so, does anyone care to explain to me what's wrong with the Alice/Bob example above? I can't see where I made a mistake, but I know next to nothing about this topic. Thanks!
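For what it's worth, the factor of two in both directions matches the standard relativistic Doppler factor k = sqrt((1+β)/(1−β)), which is exactly 2 for β = 0.6; I'm only plugging the example's numbers into that known formula here:

```python
import math

beta = 0.6
k = math.sqrt((1 + beta) / (1 - beta))  # relativistic Doppler factor

print(k)        # 2.0: an approaching clock APPEARS to run twice as fast
print(1 / k)    # 0.5: a receding clock APPEARS to run at half speed
```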