I've recently been interested in time dilation, but the relative time difference between two observers confuses me (i.e. that a high-speed observer and a stationary observer will each perceive the other's clock to run slow). I thought of the following experiment to help me understand, but I'm not sure if I'm correct about it.

Imagine four observers (A, B, C, D) and a disk that is initially at rest:

- Observer A is at the centre of the disk.
- Observer B is at the outer edge of the disk.
- Observer C is somewhere between A and B.
- Observer D is outside the disk (i.e. will not feel the effects of the rotation).

The disk spins up until its edge is moving at very close to the speed of light, maintains this speed for a while, then decelerates back to rest. I think that the ordering from most time experienced to least time experienced (relative to the others) will be:

A = D > C > B

Am I right about this?
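To make my question concrete, here is a rough numerical sketch of what I expect. It only applies lab-frame time dilation during the constant-speed phase (ignoring the spin-up/spin-down phases and any acceleration effects), and assumes a clock's tangential speed scales linearly with its radius; the function and variable names are my own:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def proper_time(lab_time, speed):
    """Proper time elapsed on a clock moving at `speed` in the lab frame."""
    return lab_time * math.sqrt(1.0 - (speed / C) ** 2)

lab_time = 1.0     # one second of lab-frame (observer D) time
v_edge = 0.99 * C  # edge speed while the disk is at full spin

# Tangential speed grows linearly with radius: v(r) = v_edge * r / R,
# so A (r = 0) and D (off the disk) are at rest in the lab frame.
observers = {
    "A (centre, r=0)": 0.0,
    "C (r=R/2)": 0.5 * v_edge,
    "B (edge, r=R)": v_edge,
    "D (off disk)": 0.0,
}

for name, v in observers.items():
    print(f"{name}: {proper_time(lab_time, v):.4f} s")
```

With these numbers A and D record the full second, C records less, and B the least, which is the ordering A = D > C > B that I'm asking about.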