Consider two synchronized clocks at rest with respect to each other. Merely accelerating one clock does not de-synchronize it: you could instantly accelerate a clock to some high speed and then just as instantly decelerate it back to rest, and the time displayed on it would not have changed at all. Accelerating one clock puts it at a different speed than the other clock, which makes it tick at a different rate, but it has to remain at that different speed for some time in order to accumulate a different elapsed time by the time it is brought back to the clock that experienced no acceleration.
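To put a number on that, here is a minimal Python sketch, not part of the original argument, that uses the standard time dilation factor √(1 − β²); the function names and the 10%-of-c example are just illustrative. It shows that an idealized instantaneous out-and-back acceleration by itself changes nothing, while coasting at the higher speed for a while does produce a different elapsed time.

```python
import math

def dilation_factor(beta):
    """Standard special-relativistic time dilation factor sqrt(1 - beta**2)
    for a clock moving at speed beta = v/c."""
    return math.sqrt(1.0 - beta ** 2)

def elapsed_on_travelling_clock(beta, coasting_time):
    """Time accumulated by a clock that (in this idealization) jumps
    instantly to speed beta, coasts for coasting_time as measured by the
    clock that stays behind, then instantly returns to rest."""
    return coasting_time * dilation_factor(beta)

# No coasting at the higher speed: the instantaneous out-and-back
# acceleration by itself produces no difference in elapsed time.
print(elapsed_on_travelling_clock(beta=0.1, coasting_time=0.0))     # 0.0
# Coasting at 10% of c for 1000 seconds: the travelling clock comes back
# having accumulated only about 995 seconds.
print(elapsed_on_travelling_clock(beta=0.1, coasting_time=1000.0))  # ~994.99
```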
I think the easiest way to address your concerns is to think about the Doppler effect and what it would be like if there were no relativity. I encourage you to look up "Doppler effect" on Wikipedia. There you will see some formulas involving the observed frequency f and the emitted frequency f_0. But instead of thinking about frequencies, just think about how each clock views the other one. Specifically, look at this formula, where c is the speed of light in a fixed ether medium, v_r is the velocity of the receiver through the ether (taken positive when it moves toward the source) and v_s is the velocity of the source through the ether (taken positive when it moves away from the receiver):

f = ( (c + v_r) / (c + v_s) ) f_0
Just look at the part of the formula inside the parentheses. Let's say, just to get familiar with the formula, that the source clock is moving away from the receiver clock at 10% of c through the medium but the receiver clock is stationary in the medium. That means that the stationary clock will see the moving clock ticking slower than itself by a factor equal to:
c/(c+0.1c) = 1/1.1 = 0.90909
Now let's say instead that the source clock is stationary in the medium but the receiver clock is moving away from it at the same speed of 10% of c. Now the receiver clock will see the source clock as ticking slower than itself by a factor of:
(c-0.1c)/c = 0.9/1 = 0.9
And you could calculate how the other clock views the first clock in each situation.
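Here is a minimal Python sketch of those calculations, assuming the ether formula above with the sign convention stated there; the function name and constants are just illustrative.

```python
C = 299_792_458.0  # speed of light in the ether (m/s); only ratios matter here

def ether_doppler_factor(v_receiver, v_source, c=C):
    """Pre-relativistic Doppler factor (c + v_r) / (c + v_s), with velocities
    measured through the ether: v_receiver positive toward the source,
    v_source positive away from the receiver."""
    return (c + v_receiver) / (c + v_source)

v = 0.1 * C  # the two clocks recede from each other at 10% of c

# The clock that is stationary in the ether watches the moving clock
# (the moving clock is the source, receding at 0.1c):
print(ether_doppler_factor(v_receiver=0.0, v_source=+v))  # 0.909090...

# The moving clock watches the stationary clock (the moving clock is now the
# receiver, and it is receding, so its velocity toward the source is -0.1c):
print(ether_doppler_factor(v_receiver=-v, v_source=0.0))  # 0.9
```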
Clearly, there is a difference in how each clock sees the other one ticking, based on their speeds relative to the fixed medium, even though the relative speed between them is the same. If there were no relativity, the observed tick rates of two clocks in relative motion would therefore depend on how they are moving through the ether, not just on their motion relative to each other.
But that is not what actually happens. What actually happens is that both clocks see the other one as ticking at the same rate. Please look up the Wikipedia article called "Relativistic Doppler effect". Now we have a new formula, where f_o is the observed frequency, f_s is the source frequency and β is the relative speed as a fraction of the speed of light:

f_o = f_s √[(1 - β)/(1 + β)]
Just look at the square root part of the formula. For our example of 10%c, β = 0.1 and:
√[(1-0.1)/(1+0.1)] = √[0.9/1.1] = 0.904534
This value lies between the two values that we got before, and more importantly, it is the factor at which each clock sees the other one ticking compared to its own. This is really quite an amazing and surprising result, because it means that what we observe does not depend on a medium to propagate that observation, and any attempt to identify such a medium is doomed to fail. If we could identify one, then we could take the light propagation time into account to determine the actual time displayed on the remote clock at each location as it moved away from us. Instead, there is no way to arrive at such a determination.
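For comparison, here is a minimal sketch of the relativistic calculation under the same assumptions about notation; again, the function name is just illustrative.

```python
import math

def relativistic_doppler_factor(beta):
    """Relativistic Doppler factor sqrt((1 - beta) / (1 + beta)) for two
    clocks receding from each other at beta = v/c. It depends only on the
    relative speed, so each clock sees the other slowed by the same factor."""
    return math.sqrt((1.0 - beta) / (1.0 + beta))

beta = 0.1
print(relativistic_doppler_factor(beta))   # 0.904534...
# The two ether-based values from before bracket it:
print(1.0 / (1.0 + beta))                  # 0.909090...  (stationary observer)
print(1.0 - beta)                          # 0.9          (moving observer)
```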
Prior to Einstein, scientists believed that the light propagation time was fixed according to an absolutely stationary medium, even though they could not determine its state of motion. Einstein figured out that you could assume any inertial state of motion to be the one in which light propagates at c, and this is the basis of Special Relativity.