Greetings everyone. I'm having great difficulty explaining to a group of friends who don't understand math very well that the unit we call time does not speed up or slow down: one second is no different from the marks on a ruler, with an established interval length (in this case a duration), and all a clock does is count ticks or micro-ticks. So it isn't the unit of time that speeds up or slows down; the count changes, more or less, as a result of more or less external interference with the device.

In my explanation I likened it to two cars with identical horsepower and speed, one driving on wet pavement and one on dry. The car on wet pavement will show that it went more miles even though the distance is the same, because its wheels slip. Given known parameters we can solve for the error due to slippage.

I've also looked for material they can study that clearly explains this at a non-physics level, with no real luck. Maybe someone here has an explanation that makes these distinctions clearer and would help these guys wrap their minds around it? I think I've reduced this as far as possible, but maybe not... I'm sure you all must have run into this before. Any ideas?
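To make the "solve for the error due to slippage" part of the analogy concrete, here is a minimal sketch. The slip fraction and trip distance are made-up numbers purely for illustration: the odometer counts wheel rotations, so any rotations spent slipping inflate the indicated mileage.

```python
def indicated_miles(true_miles: float, slip_fraction: float) -> float:
    """Miles the odometer reports when a fraction of wheel travel is slip.

    slip_fraction is the fraction of wheel-surface travel that does not
    move the car forward (0.0 = dry pavement, no slip). Assumed numbers,
    for illustration only.
    """
    return true_miles / (1.0 - slip_fraction)

def slippage_error(true_miles: float, slip_fraction: float) -> float:
    """Extra miles shown on the odometer due to slip."""
    return indicated_miles(true_miles, slip_fraction) - true_miles

# Example: a 100-mile trip with 5% wheel slip on wet pavement.
print(indicated_miles(100.0, 0.05))  # wheels "rolled" about 105.26 miles
print(slippage_error(100.0, 0.05))   # about 5.26 phantom miles
```

The point of the sketch is the same as the analogy: the mile itself never changed, only the count of rotations the device used to measure it.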