Trying to Explain Time Dilation

  • Thread starter TLR
  • #1
TLR

Main Question or Discussion Point

Greets everyone.

I am having great difficulty explaining to a group of friends, who do not understand math well, that the unit we know as time does not speed up or slow down. One second is no different from marks on a ruler: it has an established interval length, in this case a duration, and all clocks do is count ticks or micro-ticks.

Therefore, it is not the unit of time that speeds up or slows down; the change in the count, up or down, results from more or less external interference with the device.

In my explanation I likened it to two cars with identical horsepower and speed, one driving on wet pavement and one on dry. The odometer of the car on wet pavement will show that it went more miles even though the distance is the same. Given known parameters, we can solve for the error due to slippage.

I have also looked for material they can study that clearly explains it at a non-physics level, with no real luck.

Maybe someone here has an explanation that better explains these distinctions to help these guys wrap their minds around it?

I think I have this reduced as far as possible but maybe not.... I am sure you all must have run into this before?

Any ideas?
 

Answers and Replies

  • #2
FactChecker
Science Advisor
Gold Member
I'm not sure exactly what you mean, but I think I disagree. The time dilation of special relativity is a real change when observed by an external observer. Suppose person A is traveling near the speed of light compared with person B. Then person B observes the clocks of person A ticking slower. Person A cannot detect any change in his own clock; in fact, person A thinks it is person B's clocks that are ticking slower. And it is not just the clocks: all molecular, biological, physical, etc. processes are observed to slow down. This is not due to any "external interference". It is due to a disagreement between A and B on how to synchronize widely separated clocks.
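The symmetric slowdown described above can be put in numbers with the Lorentz factor. A minimal Python sketch; the 0.8c speed is just an illustrative choice:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v):
    """Lorentz factor for relative speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# A clock moving at 0.8c relative to an observer:
v = 0.8 * C
# For every second on the observer's own clock, the moving clock
# is observed to advance only 1/gamma seconds:
print(1 / gamma(v))  # ≈ 0.6
```

Each observer applies the same formula to the other, which is why neither can call the other's clock "wrong".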
 
  • #3
In my explanation I likened it to two cars with identical horsepower and speed, one driving on wet pavement and one on dry. The odometer of the car on wet pavement will show that it went more miles even though the distance is the same. Given known parameters, we can solve for the error due to slippage.
I like the general approach, but I would not involve any slippage or error, just geometry.

Suppose you have two friends who are driving from Miami to Boston: one will drive through Washington DC, and the other through Chicago. Even though they both cover the same number of degrees of latitude, their odometers will show different numbers of miles. The analogy is that the latitude lines are like coordinate time in an inertial frame, and the odometers are like the proper time on a clock.
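The two-routes analogy can even be made quantitative. A small sketch with made-up route coordinates, purely for illustration; note the key sign flip in the spacetime version, where the moving "detour" accumulates less proper time, not more:

```python
import math

def path_length(points):
    """Odometer reading: sum of the straight segments along a route."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

# Two routes spanning the same north-south interval (y: 0 -> 10):
direct = [(0, 0), (0, 10)]           # straight up the coast
detour = [(0, 0), (6, 5), (0, 10)]   # swinging out through "Chicago"
print(path_length(direct))   # 10.0
print(path_length(detour))   # ≈ 15.62: more miles, same latitude span

# Spacetime analogue: same coordinate-time span, but proper time
# carries a minus sign under the square root, so moving means LESS time:
def proper_time(t, v_frac):
    """Proper time along a worldline at constant speed v_frac * c."""
    return t * math.sqrt(1.0 - v_frac ** 2)

print(proper_time(10, 0.0))  # 10.0: the stay-at-home worldline
print(proper_time(10, 0.6))  # ≈ 8.0: the traveling worldline
```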
 
  • #4
I would include gravitational time dilation in your explanation as well: time runs faster at a higher gravitational potential (farther from the source of gravity) and slower the deeper into the gravity well you go. Take the ISS in orbit: time runs slower due to its higher velocity and, at the same time, faster due to its higher gravitational potential, which cancels part of the effect.
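The ISS case can be estimated with the standard weak-field approximations. A rough sketch; the orbital speed and altitude figures are approximate values I am assuming, and they show that for the ISS the velocity term actually dominates, so the cancellation is only partial:

```python
# Rough fractional clock-rate shifts for the ISS (weak-field approximations).
C = 2.998e8              # speed of light, m/s
GM = 3.986e14            # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6        # mean Earth radius, m
v_iss = 7.66e3           # approximate ISS orbital speed, m/s
r_iss = R_EARTH + 4.0e5  # approximate ISS orbital radius (~400 km up), m

velocity_shift = -v_iss ** 2 / (2 * C ** 2)              # moving clock runs slower
gravity_shift = GM * (1 / R_EARTH - 1 / r_iss) / C ** 2  # higher clock runs faster

day = 86_400  # seconds per day
print(f"velocity: {velocity_shift * day * 1e6:+.1f} us/day")  # ≈ -28 us/day
print(f"gravity:  {gravity_shift * day * 1e6:+.1f} us/day")   # ≈ +3.6 us/day
print(f"net:      {(velocity_shift + gravity_shift) * day * 1e6:+.1f} us/day")
```

For GPS satellites, which orbit much higher, the gravitational term dominates instead and the net effect flips sign.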
 
  • #5
FactChecker
For a book that assumes no strong physics or math background, I recommend Epstein's book, "Relativity Visualized".
 
  • #6
Einstein's original paper actually explains special relativity pretty clearly and does not use much math at all. I read it when I was in fourth grade and understood it. I have explained his original train thought experiment to other kids of a similar age, and they understood it.
 
  • #7
the unit we know as time does not speed up or slow down since 1 second is no different than marks on a ruler and has an established interval length
When observer and clock are in the same reference frame, true. Why would your friends think their wrist watches tick at different speeds at different times? Do they think that when nobody is looking some clocks speed up and others slow down? Unlikely, right?

I am having great difficulty trying to explain to a group of friends who do not understand math too well that the unit we know as time does not speed up or slow down..
So it seems you are discussing relativity. There is no 'logic' to it, any more than there is 'logic' to why some charges repel, others attract, and gravity always attracts. These are inherent characteristics of our universe. Likewise, despite everyday perceptions and impressions, time and distance are not absolute but relative. It took an Einstein to recognize that and then develop the underlying theory. Maybe the best you can do is point out that when relative speeds approach the speed of light, things occur that our everyday experience finds strange.

A description without gravity:
Two observers moving at high speed relative to each other will not see each other's time progressing [ticking] at the same rate as their own: each observes the other's clock ticking slower than their own. So locally, time always ticks along at the same rate when the clock and observer move together [in the same reference frame]; but when measured from a reference frame moving at high speed relative to the clock, time does not pass uniformly.

If you read here, for example,
https://en.wikipedia.org/wiki/Special_relativity
you'll not find a magical sentence that 'logically' explains what happens. One starts with some 'logical' postulates [at least they seem that way today] and derives the consequences mathematically.
 
  • #8
TLR
I like the general approach, but I would not involve any slippage or error, just geometry.

Suppose you have two friends who are driving from Miami to Boston: one will drive through Washington DC, and the other through Chicago. Even though they both cover the same number of degrees of latitude, their odometers will show different numbers of miles. The analogy is that the latitude lines are like coordinate time in an inertial frame, and the odometers are like the proper time on a clock.

Well, the distinction I am trying to express is the distinction between the expected duration of a second and the deviation from that expected duration relative to a reference standard. Actually, I do not see relativity in and of itself as having anything to do with my point beyond the fact that it introduces an error, and by error I mean anything that results in two clocks reading differently under different conditions. Altitude, for instance. Time does not speed up or slow down under the conventional understanding of time, which is based on the idea that it is nothing more than a measuring stick.
How about this one: we have a properly working pendulum clock that keeps very good time at sea level. Now we take this same clock, lower it one foot into water, and the clock ticks slower. The dial of the clock under water now falls far behind the dial of the clock above water. My point is that it is improper to state this simply as 'time' running slower. First, because time is merely a measuring stick based on a predetermined number of intervals of a specific duration from some oscillating device, generally a pendulum or radiation cycles. My point is that if different conditions, by whatever means, cause our dials to read differently, that is the introduction of error; we have not even gotten as far as determining the cause at this point. Since we know this caused an error in our reading, we cannot 'properly' say time itself slows down when our device is under water. Instead we would need to state that the water introduces error because our time device is imperfect, and then add an auto-correction system so that all clocks read the same under all conditions.

I am saying that the effects we know as time dilation are properly stated as measuring-device error, regardless of whether they are due to gravity, motion, or, in my example, placing the clock under water. Either that, or we really do not have a base standard for the unit we know as a 'second', which would really mess things up, since I presumed we had a known reference standard. Otherwise the definition of time could be anything anyone wanted to dream up, ever changing based on any of an infinite number of conditions that could introduce (dial-position) error. Do you find any material errors in this premise?
 
  • #9
Nugatory
Mentor
I am saying that the effects causing time dilation are properly stated as error due to gravity or motion or, in my example, water. I see this as nothing more than altimeter correction, unless we do not have a base standard for the second, which would really mess things up.
That's a very misleading way of thinking about it, and almost guarantees confusion in any but the simplest problems. We do indeed have a solid base standard for the second; it just doesn't behave the way you want it to.

The second is defined as the time elapsed during 9,192,631,770 cycles of the radiation corresponding to the transition between the two hyperfine levels of the ground state of a cesium-133 atom (not subject to any external forces and at rest). In principle I could put a cesium atom in a box with a counter and a readout and carry it around with me instead of my wristwatch. It would be an excellent clock, accurately tracking all the time-dependent processes going on around me: I would experience one second of life for every second it ticked off, the time-dependent processes happening in my laboratory would run according to the time it recorded, my soft-boiled eggs would come out right if I used it to time them... You could do the same, and your cesium-based clock would work just as well for you. That's the definition of the second, and what Einstein was getting at when he said "Time is what a [good] clock measures".

However, there is a catch. If you and I are moving relative to one another, or if we are at different heights in a gravity well, or just about any conditions other than being in the same place at the same time, then we will find that even if our clocks both tick once at the same time, the next ticks will not come at the same time. Thus, we both have a perfectly good second that conforms to all the laws of physics and is free of errors to better than ten decimal places, but neither one is more "standard" than the other.
 
  • #10
add auto correction system so all clocks read the same under all conditions.
Then look at GPS, which points to a specific location on Earth, and realize that it very accurately applies exactly the amount of adjustment predicted by relativity. Do you suppose that could be statistical in nature?
 
  • #11
TLR
However, there is a catch. If you and I are moving relative to one another, or if we are at different heights in a gravity well, or just about any conditions other than being in the same place at the same time, then we will find that even if our clocks both tick once at the same time, the next ticks will not come at the same time. Thus, we both have a perfectly good second that conforms to all the laws of physics and is free of errors to better than ten decimal places, but neither one is more "standard" than the other.
What I need in order to agree with you is not a perspective or parallax explanation, but a reasonable explanation of how two identical clocks can be claimed to operate perfectly while giving us wrong answers simply by being subjected to different conditions (gravity, velocity, etc.).

How can we say the clocks are in perfect working order while claiming they do not give the same time under different physical conditions?

The problem I see with this is that in effect you just told me we do not have a real standard for time after all.

Velocity and gravitational time dilation combined-effect tests
Hafele and Keating, in 1971, flew caesium atomic clocks east and west around the earth in commercial airliners, to compare the elapsed time against that of a clock that remained at the U.S. Naval Observatory. Two opposite effects came into play. The clocks were expected to age more quickly (show a larger elapsed time) than the reference clock, since they were in a higher (weaker) gravitational potential for most of the trip (c.f. Pound–Rebka experiment). But also, contrastingly, the moving clocks were expected to age more slowly because of the speed of their travel. From the actual flight paths of each trip, the theory predicted that the flying clocks, compared with reference clocks at the U.S. Naval Observatory, should have lost 40±23 nanoseconds during the eastward trip and should have gained 275±21 nanoseconds during the westward trip. Relative to the atomic time scale of the U.S. Naval Observatory, the flying clocks lost 59±10 nanoseconds during the eastward trip and gained 273±7 nanoseconds during the westward trip (where the error bars represent standard deviation).[24]
IMO they properly express it as standard deviation, or 'error', while improperly expressing it as 'age'. It seems we claim the clocks run perfectly while simultaneously claiming that gravity and velocity introduce errors into these supposedly perfectly running machines; and rather than dealing with the error as error, we choose to erroneously call it 'bending time', using the word 'age', when none of these studies made, or can lead to, any such determination [age] that I can see. It would seem we are talking about minute errors, possibly due to red/blue shift or some other undesirable disturbance the clocks are not protected against.

It seems to me that long before we get to discussing relativity, the above demonstrates a serious problem: the clocks are not accurate in the first place, since they are so easily affected by gravity and velocity. That said, it appears to be a gigantic leap in logic (a fallacy) to conclude that the unit known as time is itself changing. In fact, how can we even measure the accuracy of clocks in motion and expect good answers when the devices used to measure the clocks are affected by the same disturbances, unless we identify a reference, which I was under the impression was the national standards institute? It would imply that the world needs to agree on one reference point on the globe if we all wish to agree on a given time.
I would like to hear a feasible counter-proposal if people here do not agree that we have insufficient precautions built into our clocks to give us perfect time, and an explanation of why the errors incurred, instead of being treated as such, are presented as 'aging' that somehow changes the definition of time.
In your explanation, if the clocks are working perfectly then the ticks do have to take place at the same instant; the rest is a matter of perspective, IMO, which is different from clocks being unable to produce accurate time under any and all conditions we expect them to operate under.

Another example of error: take a car with a precisely known amount of applied torque and run it both with the wind and against the wind. Both measurements will differ from the same car running stationary on a dyno. MPH, then, is a consequence of the physical interference of the wind when compared to the ideal. This does not change the definition of miles per hour, but it is relative to the conditions of operation. That said, why would a nearly identical situation with these clocks change the definition of time?
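As an aside, the Hafele-Keating numbers quoted earlier in this post can be checked for consistency: prediction and measurement agree well within the combined error bars. A small sketch using only the figures quoted above:

```python
import math

# Hafele-Keating (1971) results as quoted, in nanoseconds:
# (predicted, predicted_error, measured, measured_error)
trips = {
    "eastward": (-40, 23, -59, 10),
    "westward": (275, 21, 273, 7),
}

for name, (pred, dp, meas, dm) in trips.items():
    combined = math.hypot(dp, dm)         # uncertainties added in quadrature
    sigmas = abs(pred - meas) / combined  # disagreement, in standard deviations
    print(f"{name}: prediction and measurement differ by {sigmas:.2f} sigma")
# Both trips agree to within about one standard deviation.
```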
 
  • #12
phinds
Science Advisor
Insights Author
Gold Member
What I need in order to agree with you is not a perspective or parallax explanation, but a reasonable explanation of how two identical clocks can be claimed to operate perfectly while giving us wrong answers simply by being subjected to different conditions (gravity, velocity, etc.).

How can we say the clocks are in perfect working order while claiming they do not give the same time under different physical conditions?
But it does NOT give "wrong answers"; it gives us RIGHT answers. Your problem continues to be that you think the differences are an error when they are not. For any "error" to be involved, you would have to say that one frame of reference is right and the other is wrong, and that is just flat-out incorrect.
 
  • #13
TLR
But it does NOT give "wrong answers"; it gives us RIGHT answers. Your problem continues to be that you think the differences are an error when they are not. For any "error" to be involved, you would have to say that one frame of reference is right and the other is wrong, and that is just flat-out incorrect.
So does the car example I gave above, which also gives correct answers even though neither matches the standard. If we can justify changing the definition of time, why not also change the definition of miles per hour? It's fair game, is it not?

Yes, I am calling those differences an error, as both are deviations from the standard, and deviation is generally thought of and handled as error.
 
  • #14
phinds
So does the car example I gave above, which also gives correct answers even though neither matches the standard. If we can justify changing the definition of time, why not also change the definition of miles per hour? It's fair game, is it not?

Yes, I am calling those differences an error, as both are deviations from the standard, and deviation is generally thought of and handled as error.
But in the case of time dilation, there IS no "deviation from the standard". A stationary clock and one traveling relative to it, or one in a different gravity well, both tick at exactly the standard one second per second. There is NO deviation. You persist in thinking that one frame of reference is right and one is wrong.

Think about this: I look at a guy 6' tall who is 10' from me and another guy who is 6' tall but 50' from me. The guy who is closer looks taller, but he is NOT taller.
 
  • #15
The dial of the clock under water is now far behind the dial on the clock above water. My point is that it is improper to simply state it as 'time' runs slower.
Although this particular clock runs slow when placed under water, there are many other clock designs that would not: a quartz oscillator, a radioactive sample, a spring-wound pocket watch, even another pendulum clock in a waterproof housing. Furthermore, different clocks that are affected by water will disagree about the size of the water effect. Because different clocks disagree about the effect of the water, we would indeed consider it an error in the clocks that were affected by the water.

The principle of relativity is much more far reaching. It asserts that all clocks of whatever construction and operating principle will be affected. Furthermore, all clocks will be affected by exactly the same degree. This has been tested and confirmed for clocks based on EM, the weak nuclear force, and the strong nuclear force.

To attribute this to a clock error would require an explanation for how the error is introduced into each clock mechanism. Each explanation would also have to coincidentally come out exactly the same value, which would also have to coincidentally be the amount predicted by relativity. People tend to be suspicious of so many coincidences.
 
  • #16
TLR
So does the car example I gave above, which also gives correct answers even though neither matches the standard. If we can justify changing the definition of time, why not also change the definition of miles per hour? It's fair game, is it not?

Yes, I am calling those differences an error, as both are deviations from the standard, and deviation is generally thought of and handled as error.

Yes, they will count the same number of ticks before registering one second. So if the time is different, then the speed of the ticks must also be different; what else could it be, since the counter only counts ticks?
 
  • #17
Nugatory
Mentor
12,769
5,369
What I need, to agree with you is not a perspective or parallax explanation, but a reasonable explanation how two identical clocks can be claimed to operate perfectly that give us wrong answers simply by being subjected to different conditions. (gravity/velocity etc)
Let's be precise about what you are expecting properly functioning clocks to do here. Say we have two identically constructed clocks, each ticking once a second based on counting the cycles of the cesium atoms buried in their guts. They are moving at a constant speed relative to one another (note that I am being careful not to say that one is at rest and the other is moving; all we have is their relative speed, so we can equally well consider either one to be the one that's moving). At the same time that my clock reads 0, your clock reads 0. After 10 ticks of my clock it of course reads 10. I look at your clock to see what it reads at the same time (which means that I have to allow for light travel time; if you are five light-seconds away, what I see of you when my clock reads 8 is what was happening to you at the same time that my clock read 3).

You are expecting that at the same time that my clock reads 10, your clock will also read 10, so that anything else would indicate that one or the other clock is somehow malfunctioning and giving "wrong" answers.

But note the vital importance of that phrase "at the same time". Are you familiar with Einstein's train thought experiment about the relativity of simultaneity? If not, google "Einstein train relativity of simultaneity" and be sure that you completely understand what it's telling you; until you have the relativity of simultaneity down cold, you will not be able to wrap your mind around time dilation.
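The "at the same time" caveat above can be made concrete with the Lorentz transformation for time, Δt' = γ(Δt − vΔx/c²). A minimal sketch; the 0.6c speed and the 5-light-second separation are chosen purely for illustration:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def delta_t_prime(dt, dx, v):
    """Time separation between two events as seen from a frame moving at v."""
    g = 1.0 / math.sqrt(1.0 - (v / C) ** 2)  # Lorentz factor
    return g * (dt - v * dx / C ** 2)

# Two events simultaneous in my frame (dt = 0) but 5 light-seconds apart:
dx = 5.0 * C   # meters
v = 0.6 * C
print(delta_t_prime(0.0, dx, v))  # ≈ -3.75 s: not simultaneous in your frame
```

Simultaneity of spatially separated events is frame-dependent, which is exactly why "at the same time" has to be handled with such care.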
 
  • #18
TLR
The principle of relativity is much more far reaching. It asserts that all clocks of whatever construction and operating principle will be affected. Furthermore, all clocks will be affected by exactly the same degree. This has been tested and confirmed for clocks based on EM, the weak nuclear force, and the strong nuclear force.

To attribute this to a clock error would require an explanation for how the error is introduced into each clock mechanism. Each explanation would also have to coincidentally come out exactly the same value, which would also have to coincidentally be the amount predicted by relativity. People tend to be suspicious of so many coincidences.
So that would rule out a sundial I take it ;)

All clocks are matter and operate based on oscillation or movement, so I would claim we simply have not been able to find the common denominator. That does not mean it does not exist, nor does it mean it is beyond practical reach. I do not claim to have all the answers, but neither can I accept that a construct we based on intervals is what changes.

The construct of time is abstract and ideal, with assigned parameters; the intervals are physical, not ideal, and suffer from external influences.
 
  • #19
phinds
...neither can I accept that a construct we based on intervals is what changes.
BUT IT DOESN'T CHANGE. You keep repeating the same mistaken point of view over and over. That isn't going to make it right. I can only repeat Nugatory's advice that you study the relativity of simultaneity. A look into the concept of world lines would also be helpful.
 
  • #20
All clocks are matter and operate based on oscillation or movement
This is simply false. Atomic clocks are a standard counter example. The hyperfine transition is neither an oscillation nor a movement.


I would claim we simply have not been able to find the common denominator does not mean it does not exist
We have found the common denominator. Relativity.

This forum is for discussion of mainstream physics as understood and practiced by professional physicists. It is not for debating alternative "claims". If you wish to learn then you are welcome, but if you wish to push your personal agenda then you should read the forum rules.
 
  • #21
TLR
This is simply false. Atomic clocks are a standard counter example. The hyperfine transition is neither an oscillation nor a movement.


We have found the common denominator. Relativity.

This forum is for discussion of mainstream physics as understood and practiced by professional physicists. It is not for debating alternative "claims". If you wish to learn then you are welcome, but if you wish to push your personal agenda then you should read the forum rules.

When stated that narrowly I would be forced to agree; however, we have mass and movement, and as we can see, cesium clocks are far from perfect clocks.


Cesium Atoms at Work
"...till like a clock worn out with eating time."
John Dryden (1631-1701)


The 1955 Cesium Atomic Clock at the National Physical Laboratory, UK. It kept time to a second in 300 years.

A "cesium(-beam) atomic clock" (or "cesium-beam frequency standard") is a device that uses as a reference the exact frequency of the microwave spectral line emitted by atoms of the metallic element cesium, in particular its isotope of atomic weight 133 ("Cs-133"). The integral of frequency is time, so this frequency, 9,192,631,770 hertz (Hz = cycles/second), provides the fundamental unit of time, which may thus be measured by cesium clocks.

Today, cesium clocks measure frequency with an accuracy of 2 to 3 parts in 10^14, a fractional uncertainty corresponding to a time measurement accuracy of about 2 nanoseconds per day, or one second in 1,400,000 years. It is the most accurate realization of a unit that mankind has yet achieved. A cesium clock operates by exposing cesium atoms to microwaves until they vibrate at one of their resonant frequencies and then counting the corresponding cycles as a measure of time. The frequency involved is that of the energy absorbed from the incident photons when they excite the outermost electron in a cesium atom to jump ("transition") from a lower to a higher orbit.

According to quantum theory, atoms can only exist in certain discrete ("quantized") energy states depending on what orbits about their nuclei are occupied by their electrons. Different transitions are possible; those in question refer to a change in the electron and nuclear spin ("hyperfine") energy level of the lowest set of orbits called the "ground state." Cesium is the best choice of atom for such a measurement because all of its 55 electrons but the outermost are confined to orbits in stable shells of electromagnetic force. Thus, the outermost electron is not disturbed much by the others. The cesium atoms are kept in a very good vacuum of about 10 trillionths of an atmosphere so that the cesium atoms are little affected by other particles. All this means that they radiate in a narrow spectral line whose wavelength or frequency can be accurately determined. http://tycho.usno.navy.mil/cesium.html
As we can see, however, there is a lot of matter in motion that goes into making this clock work properly, and they even state in their description the caveat: "Thus, the outermost electron is not disturbed much by the others." That said, we have a far-from-perfect timepiece that would have to be several magnitudes more accurate before we could claim that 'time' changes rather than that the devices are being interfered with by 'something'.
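For what it's worth, the accuracy figures in the quoted passage are mutually consistent: a fractional stability of a few parts in 10^14 does correspond to roughly 2 ns of drift per day, or one second in about a million years. A quick arithmetic check:

```python
frac = 2.5e-14     # mid-range of "2 to 3 parts in 10^14"
day = 86_400       # seconds per day
year = 365.25 * day  # seconds per Julian year

drift_per_day = frac * day              # accumulated error per day, seconds
years_per_second = 1.0 / (frac * year)  # years needed to drift one second

print(f"{drift_per_day * 1e9:.2f} ns/day")             # ≈ 2.16 ns/day
print(f"one second in {years_per_second:,.0f} years")  # ≈ 1.27 million years
```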
 
  • #22
TLR
BUT IT DOESN'T CHANGE. You keep repeating the same mistaken point of view over and over. That isn't going to make it right. I can only repeat Nugatory's advice that you study the relativity of simultaneity. A look into the concept of world lines would also be helpful.
Sorry, but by my read, Dale is the only one so far who is zeroing in on addressing my point.
 
  • #23
TLR
Let's be precise about what you are expecting properly functioning clocks to do here. Say we have two identically constructed clocks, each ticking once a second based on counting the cycles of the cesium atoms buried in their guts. They are moving at a constant speed relative to one another (note that I am being careful not to say that one is at rest and the other is moving; all we have is their relative speed, so we can equally well consider either one to be the one that's moving). At the same time that my clock reads 0, your clock reads 0. After 10 ticks of my clock it of course reads 10. I look at your clock to see what it reads at the same time (which means that I have to allow for light travel time; if you are five light-seconds away, what I see of you when my clock reads 8 is what was happening to you at the same time that my clock read 3).

You are expecting that at the same time that my clock reads 10, your clock will also read 10, so that anything else would indicate that one or the other clock is somehow malfunctioning and giving "wrong" answers.

But note the vital importance of that phrase "at the same time". Are you familiar with Einstein's train thought experiment about the relativity of simultaneity? If not, google "Einstein train relativity of simultaneity" and be sure that you completely understand what it's telling you; until you have the relativity of simultaneity down cold, you will not be able to wrap your mind around time dilation.


The problem I see with this is that you are basing it on imperfect physical devices rather than on an ideal mathematical construct. I believe everything starts with the flawless mathematical ideal as the standard and then works backwards from there to examine the errors introduced by physical properties, anomalies, and other interactions.

If you could restate your point using two perfect abstract clocks that we know for a fact cannot be affected by any outside physical influence of any kind, and still prove the point, then I will concede.

I am saying that a perfect clock, one that absolutely cannot be affected by physical disturbance, whether those disturbances are presently known or unknown to us, must be used as the reference to compare against before we can legitimately say that time changes rather than the measuring instrument.

Using perfect clocks, I do not see the present-day theory of bending time as a plausible answer, since perfect clocks will always tick in perfect sync no matter where they are or under what conditions they are used.

Keep in mind I am not talking about perspective; I am talking about exact, tick-for-tick perfection. The measurement of time starts with a requirement of tick-for-tick perfection, or of fully expressed and understood anomalies, before we can claim that 'time' changes instead of blaming device errors.
 
  • #24
TLR
This forum is for discussion of mainstream physics as understood and practiced by professional physicists. It is not for debating alternative "claims". If you wish to learn then you are welcome, but if you wish to push your personal agenda then you should read the forum rules.
It's not my personal agenda; it's the failure of the so-called professionals to posit adequate explanations.

Only just saw this; thanks for your time.
 
  • #25
berkeman
Mentor
Thread closed for Moderation...

EDIT: after a brief discussion amongst the mentors, this thread will remain closed.
 
