I was trying to understand time dilation in special and general relativity, and after a lot of "overthinking" I am pretty much stuck. My problem is that what seem to me to be the same premises apparently imply opposite things.

In __special relativity__, for two inertial reference frames moving relative to each other with velocity v, we have the following formula for time dilation:

T' = γ ⋅ T₀

where

- T' is the time interval measured in the moving reference frame
- T₀ is the proper time interval measured in the resting frame
- γ = 1/√(1 − v²/c²) ≥ 1 is the Lorentz factor
- v is the relative velocity of the inertial reference frames
- c is the speed of light

We see that:

**T' ≥ T₀**

We further know that "moving clocks run slow."

So in the resting reference frame, time runs faster, meaning more time passes in the resting frame relative to the moving one.
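To make the formula concrete, here is a small numeric sketch. The value β = v/c = 0.6 and the one-second proper interval are my own illustrative choices, not from the question:

```python
import math

beta = 0.6  # v/c, illustrative choice of relative velocity
T0 = 1.0    # proper time interval in the resting frame, in seconds

# Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1.0 / math.sqrt(1.0 - beta ** 2)  # gamma ≈ 1.25 for beta = 0.6

# Dilated interval measured for the moving frame: T' = gamma * T0
T_prime = gamma * T0

print(gamma, T_prime)
```

Since γ ≥ 1 for any v < c, the computed T' is never smaller than T₀, matching the inequality above.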

So

**a smaller T (in this case T₀) ⇒ more time passing relative to the other frame**

Now for __general relativity__. Let's imagine a source of gravitation, e.g. a planet, with T₁ being a time interval measured close to that planet and T₂ being a time interval measured further away. We have

T₂ = (gh/c²) ⋅ T₁ + T₁

where

- g is the gravitational acceleration
- h is the height difference between the two clocks (≈ height above ground)
- c is the speed of light

We see that:

**T₂ ≥ T₁**

We further know that "clocks close to gravitation run slow."

So at a place closer to the source of gravitation, time runs slower, meaning less time passes there relative to a place further away.
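As a sanity check on the magnitude, a short sketch with Earth-like numbers; the values of g, h, and T₁ are my own illustrative choices:

```python
g = 9.81           # gravitational acceleration, m/s^2 (Earth surface, illustrative)
c = 299_792_458.0  # speed of light, m/s
h = 1000.0         # height difference between the clocks, m (illustrative)
T1 = 1.0           # time interval measured close to the planet, in seconds

# T2 = (g*h/c^2) * T1 + T1 : interval measured further away
T2 = (g * h / c ** 2) * T1 + T1

# The higher clock accumulates slightly more time, on the order of 1e-13 s
print(T2 - T1)
```

The correction gh/c² is tiny for everyday heights, but its sign is what matters here: T₂ comes out larger than T₁, in line with the inequality above.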

So

**a smaller T (in this case T₁) ⇒ less time passing**

Now how can this be? How can a smaller time interval in one case imply more time having passed, and in the other case less? Where am I wrong?

Thanks in advance