KipIngram
TL;DR Summary: I describe a scenario that seems to present a paradox. I understand what the "correct" answer is supposed to be, but the analysis from the other observer's frame seems to contradict it, and I can't see where my error is. Please help!
Ok, I hope someone can help me see how to sort this out.
Alice has a full-frame (no rolling shutter) video camera that records exactly 30 frames per second. It's mounted to a telescope looking far out into space.
Bob is out there in space with a digital clock that reads out to the millisecond. He is moving at half the speed of light, and his path is arranged so that at closest approach he passes through Alice's telescope's field of view, taking one second (in Alice's time) to cross it. His closest approach is far enough away that the line from Bob to Alice stays very nearly perpendicular to his direction of motion the entire time he's in her field of view.
Ok, so Alice is going to grab 30 images of Bob's clock as he moves through her field of view. The question is simple. What will be the time delta shown on Bob's clock in consecutive images?
If we regard Alice as stationary, then Bob is in motion, and time dilation means his "time rate" runs slower than Alice's. Specifically, his clock runs at about 86.6% the rate of Alice's. Alice's frames are 33.3 ms apart by her clock, so during that interval Bob's clock should advance 33.3 × 0.866 ≈ 28.9 ms. She should therefore get 30 frames whose readings are 28.9 ms apart, frame to frame.
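Writing that arithmetic out explicitly, with v = c/2:
$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = \frac{1}{\sqrt{0.75}} \approx 1.155, \qquad \frac{1}{\gamma} \approx 0.866,$$
$$\Delta(\text{Bob's clock per frame}) = \frac{33.3\ \text{ms}}{\gamma} \approx 28.9\ \text{ms}.$$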
Ok, good. I believe this to be the correct answer, and in fact if Bob were also equipped with a camera and scope and Alice also with a clock, Bob would get the same result from imaging Alice's clock.
Here is the problem. Now I want to be Bob, and I want to predict what Alice is going to see. I see Alice in motion at half the speed of light. I know that means her clock runs at 86.6% the rate of mine, so it takes longer than 33.3 ms of my time for her camera to advance from frame to frame. In fact, it seems entirely obvious to me that she will capture images of my clock that are 33.3 / 0.866 ≈ 38.5 ms apart.
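Spelled out, the Bob-side version of the arithmetic is just the inverse factor:
$$\Delta t = \gamma \times 33.3\ \text{ms} \approx 1.155 \times 33.3\ \text{ms} \approx 38.5\ \text{ms}.$$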
So that is my problem. I believe Alice will get images that show a 28.9 ms delta from frame to frame. What is wrong with my Bob-side analysis that predicts 38.5 ms?
I think I could have framed this problem with Bob and Alice moving directly toward or away from one another, but I wanted to avoid dealing with the ordinary longitudinal Doppler shift. My understanding is that the piece of the puzzle I'm focusing on is the "relativistic transverse Doppler effect."
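If I've got the standard formulas right, the transverse Doppler effect gives
$$f_{\rm obs} = \frac{f_{\rm src}}{\gamma} \quad (\text{light emitted at } 90^\circ \text{ to the motion in the observer's frame}),$$
$$f_{\rm obs} = \gamma\, f_{\rm src} \quad (\text{light emitted at } 90^\circ \text{ to the motion in the source's frame}),$$
which are the same two factors (1/γ and γ) that show up in my 28.9 ms and 38.5 ms numbers.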
I've been pulling my hair out over this for about two days now - if someone can straighten it out for me I'll be much obliged.