# More IR Frame confusion.

1. Jun 14, 2006

### MonstersFromTheId

How does the "clock" know when to slow down?

Screwy question, I know, but the following thought experiment might help to make clear what I'm asking.

You and a clock are in a box. You can't see outside the box. You have no external references.

Inside the box you appear to be in free fall, i.e. no forces acting on you that you can detect.

From a God's eye point of view you and your little box are speeding through an empty part of space at constant speed, and your clock is ticking away at a constant rate. (Not that you can tell that from inside the box, but that's what's going on).

Unbeknownst to you as you sit in your little box (possibly watching old Seinfeld re-runs, or making a sandwich), your little box begins to pass within range of a steep gravity well.

So now your box starts to fall toward the gravity well. As it does, you start to pick up speed, and as you pick up speed your clock begins to tick more slowly (not that you can tell, since time is slowing for you as well). But the pertinent fact of the matter is that your clock IS starting to tick more slowly. Just because you can't tell it's ticking more slowly doesn't change the fact that it is.

So if there's no such thing as an absolute reference frame, how does your clock know enough to start ticking more slowly?

2. Jun 14, 2006

### G01

Your problem is that you are thinking of the clock as slowing down. It's not. The time dimension itself has changed. The clock's hand still moves one space every second, but the difference is that one second takes longer COMPARED TO another reference frame. The act of "slowing down" requires no knowledge of anything on the clock's part. It is still doing the same job, counting out seconds. The difference is that time itself is different compared to another reference frame.

3. Jun 15, 2006

### pervect

Staff Emeritus
All clocks tick at the rate of 1 second / second. When we say a clock slows down, we mean that when we compare the rate of that clock to some reference clock, we observe that our clock ticks slower than the reference clock.

It is usually but not always implied that there is some signal path of known and fixed time delay present between our clocks and the reference clock in order to make the comparison.

In this example, there is no reference clock, hence there is no way for any experiment to determine that the clock in the box is ticking more slowly (as long as the box is small).

If you watch the clock, for instance, it will appear to be ticking normally to you, for your time will be the same as the clock's time.
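pervect's point that a "slow" clock is always slow only in comparison to some reference clock can be sketched numerically. The sketch below is purely illustrative and not from the thread: the function names and the 0.6c test speed are my own choices. It computes the special-relativistic Lorentz factor and the proper time a moving clock accumulates while a reference clock logs a given interval.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v):
    """Lorentz factor for speed v (in m/s)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def moving_clock_reading(reference_elapsed, v):
    """Proper time accumulated by a clock moving at speed v,
    as judged from the frame in which the reference clock is at rest.
    Every clock still ticks 1 second per second in its own frame;
    the 'slowdown' only appears in this cross-frame comparison."""
    return reference_elapsed / gamma(v)

# At v = 0.6c the Lorentz factor is 1.25, so while the reference
# clock accumulates 10 s, the moving clock accumulates only 8 s.
print(moving_clock_reading(10.0, 0.6 * C))
```

Note that the comparison requires both clocks and a frame to compare them in; with a single clock sealed in a box, as in the original question, there is nothing for this function to compute against.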

4. Jun 15, 2006

### yogi

Einstein caused some uncertainty as to what takes place in his original description of what happens to clocks when they are put in motion. He describes a situation where two separated clocks are synchronized and then one is put in motion; when it reaches the other clock, it will be out of sync (it will read less). He then goes on to embellish what he calls a "peculiar result" by stating that a clock at the equator will run slower than one at the North Pole (this presumes they are both at the same gravitational potential, which is not the case). But the interesting aspect of this assertion is whether the equatorial clock (the one moving at about 1000 mph relative to the clock at the North Pole) is actually running slower, or whether the moving clock simply accumulates less time during a revolution because it travels an effectively contracted spatial distance. Both analyses lead to the same result: an actual loss of time between the equatorial clock and the Pole clock, as measured by the time of each revolution. But most authors reject the notion that one second per second in one frame is different than one second per second in another frame.
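Taking yogi's figures at face value (the roughly 1000 mph equatorial speed, one revolution taken as 86 400 s, and setting aside the gravitational-potential caveat he notes in parentheses), the accumulated time deficit per revolution can be sketched as below. The unit-conversion constant and variable names are my own assumptions, not anything stated in the thread.

```python
import math

C = 299_792_458.0    # speed of light, m/s
MPH_TO_MS = 0.44704  # metres per second per mile per hour

v = 1000.0 * MPH_TO_MS  # ~1000 mph equatorial speed from the post
day = 86_400.0          # one revolution, taken as 86 400 s

# Kinematic (special-relativistic) dilation factor for the
# equatorial clock relative to the pole clock.
gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Time the equatorial clock falls behind per revolution.  Whether one
# frames this as a slower tick rate or as traversing a contracted
# circumference, the same 1/gamma factor appears, so the two analyses
# in the post give the same number.
deficit = day * (1.0 - 1.0 / gamma)

print(f"time lost per revolution: {deficit:.2e} s")
```

The result is on the order of 10^-7 seconds per revolution, i.e. a tiny but in-principle measurable accumulated difference; in reality, clocks at sea level at the equator and the pole tick at essentially the same rate because the gravitational-potential difference (the caveat yogi flags) nearly cancels the kinematic effect.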

Last edited: Jun 15, 2006