Not Understanding Time Dilation - mk 2.

cmb
Further to the other current thread, I have a different 'don't understand' issue for time dilation. I've thought it over and I might have a slight grasp of 'the answer', but it is still a bit murky for me:

Two space ships of the future pass each other in opposite directions, each traveling at 0.5c relative to some arbitrary point. Each one observes the other to be a spaceship going 0.7c away from them [if my presumption that relativistic speeds add up in quadrature?]. Both therefore see the clock of the other ship running slower than their own.

How can both on-board clocks be running slower than each other?
 
I'll take a crazy stab at this, as it sounds a bit like 'what happens if I go back in time and kill grandpa'. As I see it, time dilation is only relevant to the speed at which you are passing through space-time, not relevant to any other body. My crazy theory: as you move through space-time, space-time applies a force on you; the faster you go, the greater the force, and the greater this force, the faster you proceed through time. So on your two ships the clocks will be ticking according to their speed through space-time, and not according to the speed difference between them. I wish NASA would put a clock into space, give it zero velocity and see what the clock did, but this would be pretty hard to do considering we are moving at 580 km/s.
 
As far as I have read, time dilation is an actual issue for ultra-accurate GPS [I think] satellite clocks, and ps corrections need to be fed in.

This is what actually prompted me to think of this in the first place; why does the GPS clock run slower than ground-based ones, when as far as the GPS clock is concerned it is the ground-based clock that's moving!? In this case, I suspect any such effects have more to do with gravity dilation than motional dilation, but I'd value any knowledgeable comments on this.
 
Well, if you could find out what the clocks on the Voyager spacecraft say, relative to speed and gravity on the spacecraft and clocks on Earth, I bet you would have your answer. Anyone got a friend at NASA?
 
cmb said:
Further to the other current thread, I have a different 'don't understand' issue for time dilation. I've thought it over and I might have a slight grasp of 'the answer', but it is still a bit murky for me:

Two space ships of the future pass each other in opposite directions, each traveling at 0.5c relative to some arbitrary point. Each one observes the other to be a spaceship going 0.7c away from them [if my presumption that relativistic speeds add up in quadrature?].
Well, actually at (0.5c + 0.5c)/(1 + (0.5)^2) = 0.8c

Both therefore see the clock of the other ship running slower than their own.

How can both on-board clocks be running slower than each other?
They don't- none of this is "absolute". Each person sees the other person's clock running slower. A third person, at rest relative to your "arbitrary point", would see the two clocks running at the same speed.
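
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the velocity-addition formula quoted above (speeds are in units of c; the function name is just for illustration):

    def add_velocities(u, v):
        # Relativistic velocity addition w = (u + v) / (1 + u*v/c^2), in units where c = 1.
        return (u + v) / (1.0 + u * v)

    print(add_velocities(0.5, 0.5))   # 0.8 - not 1.0, and not the ~0.707 you get "in quadrature"
    print(add_velocities(0.9, 0.9))   # ~0.994, still below c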
 
CDCraig123 said:
Well, if you could find out what the clocks on the Voyager spacecraft say, relative to speed and gravity on the spacecraft and clocks on Earth, I bet you would have your answer. Anyone got a friend at NASA?

Clocks on satellites and even fast moving jet planes do run slower than on the earth. That has already been verified.
 
HallsofIvy said:
Well, actually at (0.5c + 0.5c)/(1 + (0.5)^2) = 0.8c
OK, so that's how they add up. Thanks.


HallsofIvy said:
They don't- none of this is "absolute". Each person sees the other person's clock running slower. A third person, at rest relative to your "arbitrary point", would see the two clocks running at the same speed.
So, er, do the GPS clocks on the satellites see the clocks running slower on earth, or faster? Maybe I read this wrong, but why this, then:

For GPS satellites, GR predicts that the atomic clocks at GPS orbital altitudes will tick faster by about 45,900 ns/day because they are in a weaker gravitational field than atomic clocks on Earth's surface. Special Relativity (SR) predicts that atomic clocks moving at GPS orbital speeds will tick slower by about 7,200 ns/day than stationary ground clocks. Rather than have clocks with such large rate differences, the satellite clocks are reset in rate before launch to compensate for these predicted effects.

http://www.metaresearch.org/cosmology/gps-relativity.asp

The gravity bit I can see; gravity is what it is at any one point. But velocity is relative, and if the GPS clock is running 7,200 ns/day slower relative to a ground-based clock watching it, because it is moving relative to it, then does the GPS clock 'see' the ground-based clock running faster or slower?
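
As a rough cross-check of those two quoted numbers, here is a back-of-envelope Python sketch using weak-field approximations; the orbital radius of about 26,560 km is an assumed nominal GPS value, not taken from the article, and Earth's rotation and orbital eccentricity are ignored:

    import math

    # Rough check of the quoted GPS clock-rate numbers (weak-field approximations).
    GM = 3.986004e14        # Earth's gravitational parameter, m^3/s^2
    c  = 2.99792458e8       # speed of light, m/s
    R_earth = 6.371e6       # mean Earth radius, m
    r_gps   = 2.656e7       # assumed GPS orbital radius, m (~26,560 km)
    day = 86400.0           # seconds per day

    v_orbit = math.sqrt(GM / r_gps)               # circular orbital speed, ~3.9 km/s

    grav  = (GM / c**2) * (1/R_earth - 1/r_gps)   # higher clock runs fast (gravitational)
    kinem = v_orbit**2 / (2 * c**2)               # moving clock runs slow (kinematic)

    print("gravitational: +%.0f ns/day" % (grav  * day * 1e9))           # ~45,700 ns/day
    print("kinematic:     -%.0f ns/day" % (kinem * day * 1e9))           # ~7,200 ns/day
    print("net:           +%.0f ns/day" % ((grav - kinem) * day * 1e9))  # ~38,500 ns/day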
 
HallsofIvy said:
Clocks on satellites and even fast moving jet planes do run slower than on the earth. That has already been verified.
Still struggling here. The clock comes back to Earth and we find it has lost a few seconds. But if you traveled with the clock, wouldn't the clock on Earth (that has been moving relative to you) be the one that's lost the time?
 
Are you considering gravitational effects, or just kinematic effects? Both play a role, and near the Earth the gravitational effects often dominate and they are often in opposite directions. However, I don't want to go into GR if you are struggling to understand SR.
 
  • #10
DaleSpam said:
Are you considering gravitational effects, or just kinematic effects? Both play a role, and near the Earth the gravitational effects often dominate and they are often in opposite directions. However, I don't want to go into GR if you are struggling to understand SR.

Please see my previous comment. I'm happy with the gravitational issue - according to the article I link to this constitutes 45,900 ns/day. Disregard that, I see that bit.

So now I am talking about the 7,200 ns/day kinematic effect. Which clock runs slower by 7,200 ns/day: the GPS clock as observed by a guy sitting next to a ground-based one, or the ground-based clock as observed by an astronaut floating along with the GPS?
 
  • #11
cmb said:
So now I am talking about the 7,200 ns/day kinematic effect. Which clock runs slower by 7,200 ns/day: the GPS clock as observed by a guy sitting next to a ground-based one, or the ground-based clock as observed by an astronaut floating along with the GPS?
In which frame? Neglecting the gravitational effects (which can only be done for a very short time), in the ground frame the satellite clock is running slow and in the satellite frame the ground clock is running slow. The satellite frame is not used for GPS navigation in any way, but you certainly could set it up and determine the speed of the ground clocks in that frame.
 
  • #12
DaleSpam said:
In which frame?
I understand the question, but then again there is something about that question that makes me hesitate. One clock must be sped up relative to the other so that they stay in sync. I'm finding it tough to fathom that you can slow one OR the other, whichever, and still end up with a 'consistent' time frame where both match up.
 
  • #13
cmb said:
I understand the question, but then again there is something about that question that makes me hesitate. One clock must be sped up relative to the other so that they stay in sync. I'm finding it tough to fathom that you can slow one OR the other, whichever, and still end up with a 'consistent' time frame where both match up.
Since we are not interested in the gravitational aspect let's not talk about ground and satellite clocks, but just clocks A and B moving inertially in space.

Suppose that A had not just one clock but a whole system of clocks, all synchronized with each other in A's frame. And suppose that B had not just one clock but a whole system of clocks, all synchronized with each other in B's frame.

Now each one says: "My clocks are properly synchronized and running normally, but his clocks are not synchronized and they are running slowly".

Each one also says: "If he wants to make his clock always match the clock that is momentarily next to his then he will have to run his clock faster to counteract the fact that it is running slowly".

Finally, each one says: "If I want to make my clock always match the clock that is momentarily next to mine then I will have to run my clock faster to counteract the fact that his clocks are not synchronized".
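
To put a number on "not synchronized": if two clocks are a distance L apart in their own rest frame, then in a frame where they move at speed v the leading clock reads behind the trailing one by vL/c^2. A tiny illustrative sketch (my own numbers, in units where c = 1):

    def sync_offset(v, L):
        # Relativity of simultaneity: clocks synchronized in their own frame, a distance L
        # apart, are offset by v*L/c^2 in a frame where they move at speed v (c = 1 here).
        return v * L

    print(sync_offset(0.6, 2.0))   # 1.2 - the leading clock reads 1.2 behind the trailing one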
 
  • #14
Taking into consideration the directions of the ships relative to each other, we can say that each will be traveling towards the past of the other. This means that:

1. As the ships approach each other, time relative to each other will dilate
2. At the point of passing, time relative to each other will be equal and
3. After passing, time relative to each other will contract

These are always true for relativity. The effect will be the same for two ships with different speeds traveling in the same direction, only this time the faster ship will be traveling towards the future of the slower ship and the slower one lagging in the past of the faster ship.
 
  • #15
Makep, almost none of that is correct.
 
  • #16
I was with you up to the line;

DaleSpam said:
each one says: "If I want to make my clock always match the clock that is momentarily next to mine then I will have to run my clock faster to counteract the fact that his clocks are not synchronized".

Sorry, I've read it 5 times slowly, and I still don't understand what you are saying here.
 
  • #17
Have you ever heard of space-time diagrams, cmb? I think they will answer your question.

Here is a space-time diagram for the case where one observer is stationary. The stationary observer concludes that the moving observer is slow, because he uses the green lines to compare his clock to the moving observer.

The moving observer concludes the stationary observer's clock is slow, because she uses the red lines to compare clocks, not the green ones.

[Attached space-time diagram: worldlines of the stationary and moving observers, with green and red lines used to compare clocks]


This is strange, but not paradoxical. We haven't yet gotten into why the moving observer uses the red lines, but once you accept that she does, I hope you can see how it resolves the apparent paradox.

We can get into the why of it in another post, or perhaps you have some other questions or issues...
 
  • #18
HallsofIvy said:
Clocks on satellites and even fast moving jet planes do run slower than on the earth. That has already been verified.

Considering I don't know what role gravity plays in time dilation, I thought of a man-made object with a clock that would be under the smallest amount of gravitational influence. Hence the Voyager spacecraft.
 
  • #19
pervect said:
Have you ever heard of space-time diagrams, cmb? I think they will answer your question.
Sorry, I don't understand that diagram. (What are the dots meant to represent, a clock tick? Why are they at different gap lengths on the lines, what is the significance of the angle chosen for the red line? I'd have thought the way to draw it would be dots of equal separation along each path, then the 'observation' line is one that intersects the other path at 90deg, no?)

1) OK, let's deal with the 'real world' example above. So a satellite is launched with clock correction factors of 45,900 ns/day retardation, to counter the gravitational field effect, and 7,200 ns/day advance to account for the kinematic effects. It is launched into space and after 10 years an observer on the ground is still receiving time-stamped messages (inclusive of time-of-flight correction) from it that match the ground-based clocks, because the correct correction factor was fed in at launch. This appears to be what actually happens today, in real life. So, firstly, is my understanding of any of that incorrect?

2) But now for the thought experiment; a satellite is launched tomorrow with an astronaut on board with a life-support capsule sufficient for a 10 year mission, and he stays there for 10 years. He has no audio connection with the ground, he is a space-hermit! Each and every day, as far as he is concerned, he checks the signal from the ground based clock (inclusive of tof correction). He notes that it is losing 14,400ns/day relative to his clock, because the ground based one is running 7,200 ns/day slower, plus his clock has already been set to run 7,200 ns/day faster as well.

3) At the end of his 10 year mission he comes back to Earth and says to his flight director "over the 10 years, I observed the ground based clock gradually fall behind mine by 14,400ns/day, and the signal I got from it before I left orbit was 52 milliseconds behind mine" and the flight director says "that's funny, your clock matched ours for all of the 10 years". They compare clocks and find ...? What do they find? Did the astronaut's clock correct itself during the descent back to earth, or are the clocks at different times?

I presume paragraph 2 is where there must be an error, but I cannot see it? You might argue paragraph 3 but, obviously, we could do the thought experiment for a million years and end up with an hour's difference before he came back down to ground. The act of coming back down surely can't have a 'variable' effect on the astronaut's clock, according to how long he's been up, can it!?

So, my question boils down to: When the astronaut and flight director compared clocks, what did they find?
 
  • #20
cmb said:
I was with you up to the line;

DaleSpam said:
each one says: "If I want to make my clock always match the clock that is momentarily next to mine then I will have to run my clock faster to counteract the fact that his clocks are not synchronized".

Sorry, I've read it 5 times slowly, and I still don't understand what you are saying here.
Here is a diagram that may help:

[Attached space-time diagram: a clock at rest at x = 2 with a row of moving clocks passing it]


Suppose that I am the clock at rest at x=2 (black vertical line). There are a bunch of clocks (white nearly vertical lines) passing by me at .6 c. These clocks are synchronized in their rest frame, but in my frame they are not correctly synchronized. In fact, at t=0 I look and I see that the t'=2 clock reads -1.5, the t'=0 clock reads 0, and the t'=-2 clock reads 1.5. So the closest clock is set behind, the next clock is set OK, and the clock after that is set ahead.

If I want to adjust my clock so that it always reads the same as the clock passing me, then when the x'=2 clock passes me, my clock reads .66 and his reads -.66, so I set mine back to match his. But then, because I had to set mine back to match the first one, by the time the next clock (x'=0) reaches me my clock reads 2 and this next one reads 2.66. So I have to run my clock faster in order to catch up. The clock after that was set ahead, so I also have to run my clock faster in order to catch up with that one. I have to run my clock fast, not because his clocks are fast, but because they are not synchronized.
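
In case it helps, those readings can be reproduced with the Lorentz transformation. A small sketch of that check, assuming v = 0.6c and an observer fixed at x = 2 (units where c = 1; values are rounded in the comments):

    v = 0.6
    gamma = 1.0 / (1.0 - v**2) ** 0.5     # 1.25
    x_obs = 2.0                           # the stationary clock's position

    def meeting(x_prime):
        # Return (stationary clock's time t, moving clock's reading t') at the event where
        # the moving clock labelled x_prime passes the stationary clock at x = x_obs.
        t_prime = (x_obs / gamma - x_prime) / v          # from x = gamma*(x_prime + v*t')
        t = gamma * (t_prime + v * x_prime)
        return t, t_prime

    print(meeting(2.0))   # (~0.67, ~-0.67): my clock reads .66 when the passing clock reads -.66
    print(meeting(0.0))   # (~3.33,  ~2.67): having set my clock back by 1.33, it reads 2
                          # when this clock arrives reading 2.66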
 
  • #21
cmb said:
2) But now for the thought experiment; a satellite is launched tomorrow with an astronaut on board with a life-support capsule sufficient for a 10 year mission, and he stays there for 10 years. He has no audio connection with the ground, he is a space-hermit! Each and every day, as far as he is concerned, he checks the signal from the ground based clock (inclusive of tof correction). He notes that it is losing 14,400ns/day relative to his clock, because the ground based one is running 7,200 ns/day slower, plus his clock has already been set to run 7,200 ns/day faster as well.
You cannot ignore GR over the course of 10 years. You can only pretend that the astronaut and the ground observer are equivalent inertial frames in flat spacetime for a very short time. The main problem is that in flat spacetime they would continue to get further away from each other, whereas on average the distance is not changing in this scenario.

For a beginner I would not recommend this example or this thought experiment. There are too many GR pitfalls. Stick with a zero gravity example instead.
 
  • #22
DaleSpam said:
You cannot ignore GR over the course of 10 years.
I didn't.

cmb said:
1) ... a satellite is launched with clock correction factors of 45,900 ns/day retardation, to counter the gravitational field effect, and 7,200 ns/day advance to account for the kinematic effects. ...
...This appears to be what actually happens today, in real life.
 
  • #23
Yes, I saw that, but I was talking about this part:
cmb said:
2) But now for the thought experiment; a satellite is launched tomorrow with an astronaut on board with a life-support capsule sufficient for a 10 year mission, and he stays there for 10 years. He has no audio connection with the ground, he is a space-hermit! Each and every day, as far as he is concerned, he checks the signal from the ground based clock (inclusive of tof correction). He notes that it is losing 14,400ns/day relative to his clock, because the ground based one is running 7,200 ns/day slower, plus his clock has already been set to run 7,200 ns/day faster as well.
That part only happens if we have two inertial frames in flat spacetime, which is not the case here. We would have to use some GR math to answer this question.
 
  • #24
DaleSpam said:
Yes, I saw that, but I was talking about this part:That part only happens if we have two inertial frames in flat spacetime, which is not the case here. We would have to use some GR math to answer this question.

So, are you thinking this is likely to all sum up to zero net effect on the synchronisation because of these effects, between the astronaut observing his fast-running clock compared to the go-slow clock on the ground?
 
  • #25
cmb said:
Sorry, I don't understand that diagram. (What are the dots meant to represent, a clock tick? Why are they at different gap lengths on the lines, what is the significance of the angle chosen for the red line? I'd have thought the way to draw it would be dots of equal separation along each path, then the 'observation' line is one that intersects the other path at 90deg, no?)

Yes, the dots represent a clock tick. The detailed reasoning for the spacing of the dots is not necessarily required to understand why the twin paradox isn't a paradox, but it's interesting, I think.

Relativity tells us that \Delta t^2 - \Delta x^2 is constant for all observers (using units where c=1, and light moves at a 45 degree angle on the graph).

The spacing of the dots is 4 graph units for the stationary observer, making \Delta t = 4. Since \Delta x is zero, the interval is \Delta t^2 = 16 in this case. For the moving observer, the spacing is \Delta x = 3 and \Delta t = 5, and 5^2 - 3^2 = 25 - 9 = 16. So the dots are spaced at equal 4-unit intervals of proper time: the dots with \Delta t = 4 and the dots with \Delta x = 3, \Delta t = 5 have identical spacing in "proper time", computed by the formula above, and that formula gives the results which an actual clock would measure.
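
The two dot spacings can be checked in one line each; both give the same invariant interval (units where c = 1):

    print(4**2 - 0**2)   # stationary observer's tick spacing: dt = 4, dx = 0 -> 16
    print(5**2 - 3**2)   # moving observer's tick spacing:     dt = 5, dx = 3 -> 16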

The angle of the red line can be derived with another space-time diagram; I'll do that in another post since you're interested. Again, while it is interesting to know why the lines are drawn the particular way they are, the exact reasoning isn't needed to understand why the twin paradox isn't an actual paradox.

1) OK, let's deal with the 'real world' example above. So a satellite is launched with clock correction factors of 45,900 ns/day retardation, to counter the gravitational field effect
and 7,200 ns/day advance to account for the kinematic effects. It is launched into space and after 10 years an observer on the ground is still receiving time-stamped messages (inclusive of time-of-flight correction) from it that match the ground-based clocks, because the correct correction factor was fed in at launch. This appears to be what actually happens today, in real life. So, firstly, is my understanding of any of that incorrect?

I haven't gone through the numbers you quote in detail, but it sounds basically correct. However, you are moving into the territory of general relativity when you talk about gravitational effects; it would be best to postpone that until you understand special relativity fully.

2) But now for the thought experiment; a satellite is launched tomorrow with an astronaut on board with a life-support capsule sufficient for a 10 year mission, and he stays there for 10 years. He has no audio connection with the ground, he is a space-hermit! Each and every day, as far as he is concerned, he checks the signal from the ground based clock (inclusive of tof correction). He notes that it is losing 14,400ns/day relative to his clock, because the ground based one is running 7,200 ns/day slower, plus his clock has already been set to run 7,200 ns/day faster as well.

I'm getting confused by the space-traveler using a tweaked clock. If he's a hermit, and not communicating with the ground, I'd give him a standard clock. I'm not positive I understand how you intended his clock to be tweaked (or why you'd even want to tweak it in the first place, unless he communicates with the ground constantly).
3) At the end of his 10 year mission he comes back to Earth and says to his flight director "over the 10 years, I observed the ground based clock gradually fall behind mine by 14,400ns/day, and the signal I got from it before I left orbit was 52 milliseconds behind mine" and the flight director says "that's funny, your clock matched ours for all of the 10 years". They compare clocks and find ...? What do they find? Did the astronaut's clock correct itself during the descent back to earth, or are the clocks at different times?

I presume paragraph 2 is where there must be an error, but I cannot see it? You might argue paragraph 3 but, obviously, we could do the thought experiment for a million years and end up with an hour's difference before he came back down to ground. The act of coming back down surely can't have a 'variable' effect on the astronaut's clock, according to how long he's been up, can it!?

So, my question boils down to: When the astronaut and flight director compared clocks, what did they find?

I can answer that for a non-tweaked clock, which represents proper time, easily enough, and hopefully if you know how you intended your clocks to be tweaked (though I still don't see why you'd want to), you can perhaps figure out the answer to your question.

Perhaps you are assigning some sort of special philosophical significance to the clock-tweaking? As far as I'm concerned what is of interest is what an untweaked clock would measure, as this would represent the actual passage of time for an astronaut. If I'm reading Neil Ashby right, tweaking the clocks isn't even done directly nowadays; they let the clock keep its own proper time, send a polynomial back, and let the calculator at the receiver do the necessary corrections.

If we use http://relativity.livingreviews.org/Articles/lrr-2003-1/ as a source, and we assume a geostationary orbit, then the orbiting clock is too fast, by a factor of 4.4647*10^-10. A year is approximately 3.1557*10^7 seconds, so if we take a 10-year period, the astronaut's clock will gain .14 seconds over those ten years. Nothing special happens when the astronaut lands, he's just .14 seconds older than his twin on Earth - I'm not sure why this is confusing you, or what's confusing about it.
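
For anyone checking the last step, it is just the quoted fractional rate multiplied by ten years of seconds (the 4.4647*10^-10 figure is taken from the cited source, not rederived here):

    rate = 4.4647e-10            # fractional rate difference, from the cited source
    ten_years = 10 * 3.1557e7    # ten years in seconds
    print(rate * ten_years)      # ~0.14 s gained by the orbiting clock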
 
  • #26
OK, onto the part where we show why the lines have the slope they do. First we have to talk about clock synchronization. There is a specified way of synchronizing clocks, called the Einstein convention, that's fairly easy to understand and that we need to follow to define the notion of "simultaneity" in relativity.

You take the worldlines of three equally spaced observers. The middle observer is moving along with the other two, and has an equal distance from both. If the middle observer emits a light pulse, the arrival time where it intersects the worldlines of the other two observers is the same, by definition, because it's traveling equal distances to both observers, and because light has a constant velocity relative to all observers.

But first, a quick review. Before we draw a complex diagram with the worldlines of three observers, let's review the diagrams for one observer.

In space-time diagrams, time points upward on the diagram, and the horizontal axis represents position. The diagrams I'll draw will all be scaled so that light is represented by a 45 degree line on the diagram, i.e. so that c=1.

Quick quiz questions for understanding:

A) Is the worldline below that of a stationary or moving observer?
[Attached diagram: worldline for quiz question A]


B) The same as above: is the worldline below that of a stationary or moving observer?

[Attached diagram: worldline for quiz question B]


Now, we get to the more complex diagrams. They'll give away the answers to the above questions, but hopefully you already knew the answer anyway.

We need to draw the worldlines of three equally spaced stationary observers, and we want the middle one to emit two light rays. We draw it out, and we get this:

[Attached diagram: three stationary worldlines with light pulses emitted from the middle one]

The red line connects two points that occur "at the same time" in the stationary frame. It's what we expect in the stationary frame: events that occur at the same time have the same t coordinate.

Now we do the same thing in the moving frame. We get this:

[Attached diagram: the same construction for the moving observers]


We can see that the concept of "at the same time" is _different_ for the moving observer than it is for the stationary one.

What is the exact angle of the red line? Well, it turns out that the angle the red line makes with the horizontal x-axis is the same as the angle that the worldline of the moving observer makes with the vertical t-axis. This can be demonstrated formally with similar triangles.

The important conceptual point, though, even before we worry about the value of the angle, is that the notion of simultaneity is different, that the lines connecting simultaneous events in the moving frame are different (slanted) relative to the lines connecting simultaneous events in the stationary frame.
 

  • #27
I think you are saying paragraph 2 is incorrect because the ground-based clock would not be going 45,900 ns/day slower from gravity effect and 7,200 ns/day slower from kinematic effects relative to the satellite's time.

OK, what is it? How do I work out each element of the timing difference?
 
  • #28
DaleSpam said:
Makep, almost none of that is correct.

Why not?
 
  • #29
For one, the time dilation formula depends only on speed, not direction. It doesn't matter if they are going towards each other, away, or in the same direction, or tangentially.
 
  • #30
cmb said:
So, are you thinking this is likely to all sum up to zero net effect on the synchronisation because of these effects, between the astronaut observing his fast-running clock compared to the go-slow clock on the ground?
cmb said:
OK, what is it? How do I work out each element of the timing difference?
The elapsed time on any clock in GR is given by:
\tau=\int \sqrt{g_{\mu\nu}\frac{dx^{\mu}}{d\lambda}\frac{dx^{\nu}}{d\lambda}}d\lambda
Where x is the path taken by the clock, g is the metric, and lambda is a parameter along the path. In GR this is done for the astronaut's path, and the ground clock's path. This quantity is invariant, meaning that it doesn't matter what strange coordinate system you express it in. So the astronaut and the ground clock will both agree on how much time will have elapsed for themselves and for the other.
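
As a crude illustration of what that integral reduces to in a weak, static field, here is a Python sketch that evaluates the approximation dtau/dt ~ 1 - GM/(r c^2) - v^2/(2 c^2) for a ground clock and a circular-orbit clock; this is my own simplification for a GPS-like orbit, not a substitute for the full GR calculation:

    import math

    # Crude weak-field evaluation of the proper-time rate dtau/dt ~ 1 - GM/(r c^2) - v^2/(2 c^2)
    # for a clock on a (non-rotating) ground and a clock in a circular orbit.
    GM = 3.986004e14      # Earth's gravitational parameter, m^3/s^2
    c  = 2.99792458e8     # speed of light, m/s

    def rate(r, v):
        # Approximate dtau/dt for a clock at radius r moving at speed v.
        return 1.0 - GM / (r * c**2) - v**2 / (2 * c**2)

    R_earth = 6.371e6
    r_orbit = 2.656e7                      # assumed GPS-like orbital radius
    v_orbit = math.sqrt(GM / r_orbit)      # circular orbital speed

    ground = rate(R_earth, 0.0)
    orbit  = rate(r_orbit, v_orbit)
    print((orbit - ground) * 86400 * 1e9)  # ~ +38,500 ns/day: the orbiting clock runs fast overall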
 
  • #31
OK, so what is it?

I've provided figures from someone else. I'll stick with them 'til there is a wiser head that uses that equation properly and modifies the +45,900 ns/day gravity effect/-7,200 ns/day kinematic effect.

I regret I cannot follow this because I'm asking straight questions to aid my understanding... and then I get an answer to a different question.
 
  • #32
cmb said:
I regret I cannot follow this
I know. That is why I suggested multiple times that you avoid this question until you have the right background. You need to learn SR before jumping into GR. Avoid scenarios involving gravity for now.
 
  • #33
I second Dale's motion. If you don't understand SR first, you won't get anywhere with GR at all.

One way to avoid GR with something close to your original question is to place the experiment on a small asteroid, rather than Earth.

This avoids gravity, at the minor expense of the hovering spaceship having to do a powered orbit.

However, this is still not the simplest case to understand; the simplest and standard textbook case does not involve any accelerating clocks.
 
  • #34
DaleSpam said:
I know. That is why I suggested multiple times that you avoid this question until you have the right background. You need to learn SR before jumping into GR. Avoid scenarios involving gravity for now.
Sorry, but I am not following it because I am not getting any direct answers. I'm sure it would help if I knew more, but I want to work with some numbers first, because for me it is easier to make 'real sense' of this specific paradox with numbers than with these space-time plots.

I do appreciate you trying, but the way I see it is that when we look up at a GPS satellite it is running X ns/day fast due to a lower gravity field, and Y ns/day slow due to kinematic effects. If we were to look down from the satellite, we would see clocks running X ns/day slow due to a higher gravity field, and Y ns/day slow due to kinematic effects.

I don't see why that is a wrong understanding, nor anything in your equation that says it is wrong, so I'm unable to progress this dialogue where there is no direct response to that essential question. The answer is yes or no, and maybe we can take it from there once there's a definite answer to whether that last paragraph is correct.
 
  • #35
cmb said:
If we were to look down from the satellite, we would see clocks running X ns/day slow due to a higher gravity field, and Y ns/day slow due to kinematic effects.
The problem is that this separation into gravitational and kinematical effects can only be done where the gravitational field is weak and not changing over time. In the astronaut's inertial frame the gravitational field changes over time, so you cannot decompose it like that. All you can do is evaluate the integral that I provided, which is guaranteed to agree with the previous overall calculation regardless of what coordinate system you use.

We are not trying to be evasive. I have, in fact, provided the answer, even though I knew you would not understand it. You simply need to learn SR first, you are going to be unsuccessful with the GR-first approach.
 
  • #36
pervect said:
One way to avoid GR with something close to your original question is to place the experiment on a small asteroid, rather than Earth.

This avoids gravity, at the minor expense of the hovering spaceship having to do a powered orbit.
This is a good suggestion.

cmb, would you be interested in this approach?
 
  • #37
cmb said:
I understand the question, but then again there is something about that question that makes me hesitate. One clock must be sped up relative to the other so that they stay in sync. I'm finding it tough to fathom that you can slow one OR the other, whichever, and still end up with a 'consistent' time frame where both match up.

This gets at a key misunderstanding. I think both DaleSpam and pervect have answered it, but I'll try a more direct answer.

IF you want all clocks to be synchronized for a ground observer (obviously what you want for a GPS system for use by people on the ground), you speed up the orbital clocks. However, from the point of view of any orbital clock, it is not synchronized relative to a ground clock. In fact, the adjustment of the orbital clock increases de-synchronization from the point of view of the orbital clock.

If you want all clocks synchronized from the point of view of a given orbital clock, you have to speed up the ground clock. You would also have to make complex, time-varying adjustments to the other orbital clocks for them all to be synchronized from the point of view of a given orbital clock.

I am guessing you are thinking, on some level, that synchronization is objective. Instead, it is observer dependent.
 
  • #38
PAllen said:
This gets at a key misunderstanding. I think both DaleSpam and pervect have answered it, but I'll try a more direct answer.

IF you want all clocks to be synchronized for a ground observer (obviously what you want for a GPS system for use by people on the ground), you speed up the orbital clocks. However, from the point of view of any orbital clock, it is not synchronized relative to a ground clock. In fact, the adjustment of the orbital clock increases de-synchronization from the point of view of the orbital clock.

If you want all clocks synchronized from the point of view of a given orbital clock, you have to speed up the ground clock. You would also have to make complex, time-varying adjustments to the other orbital clocks for them all to be synchronized from the point of view of a given orbital clock.

I am guessing you are thinking, on some level, that synchronization is objective. Instead, it is observer dependent.

I don't think there is any misunderstanding. I have not said anything against this at all, and recognise it as the 'simple' description you'd give a first-timer at SR.

But I am taking it one step further than this. I am asking what happens when you bring those mismatched clocks back into the same gravitational and inertial frame, put them on the same table together, and look again. Now it most definitely IS objective. There is no longer any observer-dependence. So if you agree (as per my previous comments) that indeed the two clocks de-synchronise differently, my question remains - does the act of bringing them back together somehow re-synchronise them?
 
  • #39
cmb said:
I don't think there is any misunderstanding. I have not said anything against this at all, and recognise it as the 'simple' description you'd give a first-timer at SR.

But I am taking it one step further than this. I am asking what happens when you bring those mismatched clocks back into the same gravitational and inertial frame, put them on the same table together, and look again. Now it most definitely IS objective. There is no longer any observer-dependence. So if you agree (as per my previous comments) that indeed the two clocks de-synchronise differently, my question remains - does the act of bringing them back together somehow re-synchronise them?

No. If you bring an adjusted orbital clock back to Earth in some smooth way, without changing any adjustments made to it, it will have an accumulated offset and a faster rate relative to the unadjusted ground clock.
 
  • #40
PAllen said:
No. If you bring an adjusted orbital clock back to Earth in some smooth way, without changing any adjustments made to it, it will have an accumulated offset and a faster rate relative to the unadjusted ground clock.
...Yet for someone who has been watching this clock from the ground all the time it was up there, it had looked like it was on time all that while?
 
  • #41
You cannot "watch" an orbiting clock from the ground and agree that it has the correct time on it. Two people at various locations on the ground would see the clock with different times on it because they would have different light travel times. The only reason GPS can be used as an accurate time keeper for everybody on Earth at the same time is because it also performs the function of positioning so the receivers know how to compensate for the light travel time. It's a very complex process and it's awesome that you can buy GPS receivers for less than $100 that tell you the time to within a fraction of a microsecond. When you consider that light travels 1000 feet per microsecond, this is quite remarkable.

In any case, once two clocks have become desynchronized, they do not automatically resynchronize to the same time just because they are colocated. Their rates will become the same so that whatever difference they have between them will remain forever, but that is not what we mean by synchronized. They also have to display the same time and have the same rate to be synchronized.
 
  • #42
ghwellsjr said:
You cannot "watch" an orbiting clock from the ground and agree that it has the correct time on it. Two people at various locations on the ground would see the clock with different times on it because they would have different light travel times.
This is a thought experiment; I can set up any conditions I like: I send a satellite into a perfect circle at a constant distance from my perfectly round earth, and I take a signal from it each time it is directly overhead. I know the distance and can adjust for ToF of the signal perfectly.

If I adjust the clock I send into orbit such that it always matches my clock on Earth when it comes overhead, then the clocks remain perfectly matched - for however long the satellite stays up there. However, for someone floating with the clock they observe the clocks increasingly desynchronise.

Just before the clock begins its descent back to earth, on the ground I am reading that the clocks are perfectly synchronised, because that is how I have set them up in my thought experiment. But to the guy floating with the clock, they look desynchronised because they have always been increasing their desynchronisation. As I monitor the clock coming back down to Earth, would I notice that they become more desynchronised as the clock gets closer to me, and if so by how much? That act of the clock coming back to me surely cannot change the time on the clock by a variable amount, according to how long the satellite has been up there. So what do I see the radio signal from the clock, as it is coming back down to earth, telling me? Do I see the time signal slowing down considerably for the period of its descent, so that, by the time it gets to me, it matches the desynchronised time the guy floating with it saw and expects to see when he gets down here?

If so, why would the rate of 'correction' during the same descent path be different depending on whether I bring the satellite down after one month compared with 10 years?

If not, then the clocks will read the same time and the guy who was with the clock gets really confused?
 
  • #43
cmb said:
...Yet for someone who has been watching this clock from the ground all the time it was up there, it had looked like it was on time all that while?

Yes, while it was in orbit. It would start going out of synch, from the ground observer's point of view, when its motion and position in the gravity well changed. If it looked in synch for some position (in the gravity well) and state of motion, why would one expect it to remain so when these things change?
 
  • #44
PAllen said:
If it looked in synch for some position (in the gravity well) and state of motion, why would one expect it to remain so when these things change?

Please see my post which hit the board at the same time as yours.

I have no problem with the synch looking different once it is returned to earth, but why would it be a variable amount of synch according to what has happened in the previous years?

If you visited me in my control shack and we watched my satellite clock just before its return to earth, you'd say 'when that gets back here, it'll be desynchronised because it's changed its gravity field, etc.' and I'd say 'Yup, sure will. But, tell me, by how much will it be desynchronised once I bring it back down this particular flight path?'

Y'see, at that moment in time, the 'actual' (or 'latent', if you will) desynchronisation is independent of how long it has been up there, but this is inconsistent with the observations of the guy floating along with it who has watched the desynch build up.
 
  • #45
cmb said:
This is a thought experiment; I can set up any conditions I like: I send a satellite into a perfect circle at a constant distance from my perfectly round earth, and I take a signal from it each time it is directly overhead. I know the distance and can adjust for ToF of the signal perfectly.

If I adjust the clock I send into orbit such that it always matches my clock on Earth when it comes overhead, then the clocks remain perfectly matched - for however long the satellite stays up there. However, for someone floating with the clock they observe the clocks increasingly desynchronise.

Just before the clock begins its descent back to earth, on the ground I am reading that the clocks are perfectly synchronised, because that is how I have set them up in my thought experiment. But to the guy floating with the clock, they look desynchronised because they have always been increasing their desynchronisation. As I monitor the clock coming back down to Earth, would I notice that they become more desynchronised as the clock gets closer to me, and if so by how much? That act of the clock coming back to me surely cannot change the time on the clock by a variable amount, according to how long the satellite has been up there. So what do I see the radio signal from the clock, as it is coming back down to earth, telling me? Do I see the time signal slowing down considerably for the period of its descent, so that, by the time it gets to me, it matches the desynchronised time the guy floating with it saw and expects to see when he gets down here?

If so, why would the rate of 'correction' during the same descent path be different depending on whether I bring the satellite down after one month compared with 10 years?

If not, then the clocks will read the same time and the guy who was with the clock gets really confused?
It is possible to construct a pair of clocks as you described so that they both keep exactly the same time and stay in synchronization, if you are willing to calculate out the ToF. In general, the orbiting clock will be time dilated because of its speed and the Earth clock will be time dilated because of gravity. If the amount of time dilation is the same then you don't have to do anything to make one run faster than the other. Then when you bring the orbiting clock down to the surface of your earth, it might be possible that they could remain in sync. If the amount of time dilation is different, then you would have to make one of the clocks run faster to be able to make them keep the same time. Then when you bring them together, they will start going out of sync.

I hope this is correct; I may have overlooked something. And remember, this is on a fictitious Earth of your choosing that may have a different size, mass, or spin rate than our real earth.
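
On the "if the amount of time dilation is the same" case: for a clock in a circular orbit around a non-rotating body and a clock sitting on its surface at radius R, the weak-field rates happen to match when the orbital radius is 1.5 R. A quick sketch of that special case (my own illustration, using the same weak-field approximation as above):

    GM = 3.986004e14      # gravitational parameter of the body, m^3/s^2
    c  = 2.99792458e8     # speed of light, m/s
    R  = 6.371e6          # surface radius of the (non-rotating) body, m

    def ground_rate(radius):
        # Surface clock on a non-rotating body: only the gravitational term.
        return 1.0 - GM / (radius * c**2)

    def orbit_rate(r):
        # Circular-orbit clock: gravitational term plus the v^2/2 = GM/(2r) kinematic term.
        return 1.0 - 1.5 * GM / (r * c**2)

    print(ground_rate(R) - orbit_rate(1.5 * R))   # ~0: at r = 1.5 R the two clocks keep the same rate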
 
  • #46
cmb said:
I have no problem with the synch looking different once it is returned to earth, but why would it be a variable amount of synch according to what has happened in the previous years?

I don't think this would happen with such an adjusted clock. The time reading deviation would only begin accumulating from when its motion changed from what it was specifically adjusted for.
 
  • #47
PAllen said:
I don't think this would happen with such an adjusted clock. The time reading deviation would only begin accumulating from when its motion changed from what it was specifically adjusted for.
But this is the whole point of my question. The kinematic effects* would mean that from the satellite you'd see the Earth clock falling behind, but also the satellite clock has already been adjusted to run fast as well. Why would a satellite observer not see a kinematic desynch?


*[we take it as read that the gravity effects are known and compensated - they are not observer-dependent insofar as the observers know what g field they're in]
 
  • #48
cmb said:
But this is the whole point of my question. The kinematic effects* would mean that from the satellite you'd see the Earth clock falling behind, but also the satellite clock has already been adjusted to run fast as well. Why would a satellite observer not see a kinematic desynch?


*[we take it as read that the gravity effects are known and compensated - they are not observer-dependent insofar as the observers know what g field they're in]

The satellite observer and a funny adjusted clock are two different things. The satellite observer would see an accumulating difference between the funny clock and a normal clock on the satellite (one brought from the ground without any adjustments). If the satellite had a funny clock and a regular clock, then when the satellite was brought down:

- the regular clock would now be going at the same rate as the ground clock, but with a time reading difference proportional to the orbit time.

- the funny clock would now be going fast compared to the ground clock, with a time difference accumulating only from when the satellite's motion changed from what the funny clock was adjusted for.
 
  • #49
cmb said:
If I adjust the clock I send into orbit such that it always matches my clock on Earth when it comes overhead, then the clocks remain perfectly matched - for however long the satellite stays up there. However, for someone floating with the clock they observe the clocks increasingly desynchronise.
I wonder why you think an observer traveling with the orbiting clock would think that it increasingly desynchronises with the ground clock. I'll bet you're thinking that when two clocks are traveling inertially with respect to each other (no gravity) then they each observe that they are running slower than the other and therefore increasingly desynchronize, right? And so now you're thinking that the same thing happens (neglecting gravity and GR) with an orbiting clock and a stationary clock on the surface of earth, is that what you're thinking?

If so, then you have overlooked the fact that the orbiting clock is not inertial; it is always accelerating toward the Earth, so in this case (neglecting rotation of the surface of the Earth where the "stationary" clock is), one clock is inertial and the other is not. This means that the orbiting clock will be time dilated while the Earth clock is not, and so every time the orbiting clock is overhead of the Earth clock, they will each "see" that the time dilation is occurring on just the orbiting clock.

An observer traveling with the orbiting clock will not see the two clocks desynchronising if they have been adjusted so that they are keeping the same time. In fact no observer will see them desynchronising.

Does this help?
 
  • #50
cmb said:
But this is the whole point of my question. The kinematic effects* would mean that from the satellite you'd see the Earth clock falling behind, but also the satellite clock has already been adjusted to run fast as well. Why would a satellite observer not see a kinematic desynch?


*[we take it as read that the gravity effects are known and compensated - they are not observer-dependent insofar as the observers know what g field they're in]
As I said before, you cannot separate the kinematic and gravitational effects in a time varying field like in the satellite frame.
 