How much does Earth's time dilation decrease with distance from the surface?

SUMMARY

The discussion centers on quantifying how much Earth's gravitational time dilation decreases with distance from the surface, specifically at a height of 15 feet. A reply notes that the effect was measured in a famous early-1960s experiment by Pound and Rebka at Harvard University. Working from a separately quoted figure that a clock at 1340 feet gained 22 nanoseconds, the original poster estimated by simple proportion that a clock 15 feet above the ground would gain approximately 4.6 nanoseconds per ground second, and then calculated that it would take roughly 5.5 trillion years of hovering at that height to catch up to a twin born two hours earlier.
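For reference, the standard weak-field relation usually behind such near-surface estimates (it is not quoted in the thread itself) is

$$\frac{\Delta\tau}{\tau} \approx \frac{g\,h}{c^{2}},$$

where $g$ is the surface gravitational acceleration, $h$ the height difference, and $c$ the speed of light; for $h \approx 15\ \text{ft} \approx 4.6\ \text{m}$ this gives a fractional rate difference of roughly $5 \times 10^{-16}$.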

PREREQUISITES
  • Understanding of general relativity principles
  • Basic knowledge of time dilation concepts
  • Familiarity with nanoseconds and their significance in time measurement
  • Ability to perform unit conversions and simple algebraic calculations
NEXT STEPS
  • Research the Pound-Rebka experiment and its implications on time dilation
  • Learn about the effects of gravitational time dilation in general relativity
  • Explore advanced calculations involving time dilation at varying altitudes
  • Investigate modern experiments and technologies measuring time dilation
USEFUL FOR

Students of physics, researchers in relativity, and anyone interested in the practical implications of time dilation in gravitational fields.

403036387
Exactly how much does Earth's time dilation decrease with distance from the surface?
I know that it's way too small to ever notice, and probably even measure with today's instruments, but I'm trying to calculate how many billionths or trillionths [or less] of a second a clock might run faster if it was only about 15 feet off the ground.
 
403036387 said:
Exactly how much does Earth's time dilation decrease with distance from the surface?
I know that it's way too small to ever notice, and probably even measure with today's instruments, but I'm trying to calculate how many billionths or trillionths [or less] of a second a clock might run faster if it was only about 15 feet off the ground.
Actually, it was measured back in the early '60s in a famous experiment. It was done at Harvard University by Pound and Rebka.

Pete
 
Really? That low? Cool. You wouldn't happen to know how much it was... or where I could search to find it? I've been working out my own problems, but seeing as I haven't even had high school physics yet, I'm not sure if it's anywhere near right.

I found somewhere that somebody at 1340 feet gained 22 nanoseconds.
So I did 1340 : 22 = 15 : x and got x = 4.6,
so I put down ground time = 1 second and 15-foot time = 1 second + 4.6 nanoseconds.
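As a quick editorial check, here is that proportion worked through in a short Python sketch; the 22 ns figure and the linear-with-height scaling are taken from the post as given, not verified. Evaluated as written, 22 × 15 / 1340 comes out to roughly 0.25 ns rather than 4.6 ns.

```python
# Quick check of the proportion quoted in the post: 1340 : 22 = 15 : x.
# The 22 ns figure and the assumption that the gain scales linearly with
# height are taken from the post as given.
gain_at_1340_ft_ns = 22.0    # nanoseconds gained at 1340 feet (figure quoted in the post)
reference_height_ft = 1340.0
height_ft = 15.0

# Solve the proportion for x:
gain_at_15_ft_ns = gain_at_1340_ft_ns * height_ft / reference_height_ft
print(f"x = {gain_at_15_ft_ns:.3f} ns")   # prints x = 0.246 ns
```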

What I'm really looking for in my end result is how long it would take you [how many LIFETIMES!] to catch up to your twin, who was born two hours before you, if you hovered 15 feet in the air.

So I found that there are 7.2x10^12 nanoseconds in two hours, divided it by 4 [the number of nanoseconds gained per ground second], multiplied by a billion to get seconds, then divided by 60, 60, 24, and 365 to get 5,549,213,597,158 years to gain two hours.

I know that's a bit confusing... I apologize. Is this completely wrong?
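For the conversion chain above, here is the same bookkeeping as a minimal sketch, again using the post's assumed rate of 4.6 ns gained per ground second (an assumption carried over from the proportion, not an established value). Note that dividing nanoseconds by a rate in nanoseconds per second already yields seconds, so no extra factor of a billion is needed; with these inputs the catch-up time works out to roughly fifty thousand years.

```python
# Unit bookkeeping for the catch-up estimate. The rate of 4.6 ns gained per
# ground second is the post's own assumption, carried over unchanged here.
lead_ns = 2 * 3600 * 1e9            # two hours expressed in nanoseconds (7.2e12 ns)
rate_ns_per_s = 4.6                 # assumed gain in nanoseconds per ground second

seconds_needed = lead_ns / rate_ns_per_s            # ns / (ns per s) gives seconds directly
years_needed = seconds_needed / (60 * 60 * 24 * 365)
print(f"{years_needed:,.0f} years")                 # roughly 49,600 years with these inputs
```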
 
