Hi all, I was trying to understand time dilation in special and general relativity, and after much "overthinking" I am pretty much stuck now. My problem is that what seem to me to be the same premises apparently imply opposite things.

In special relativity, for two inertial reference frames moving relative to each other with velocity v, we have the following formula for time dilation:

T' = γ ⋅ T0

where
T' is the time measured in the moving reference frame
T0 is the proper time measured in the resting system
γ = 1/√(1 - v²/c²) ≥ 1
v is the relative velocity of the inertial reference frames
c is the speed of light
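To get a feel for the numbers, here is a minimal sketch in Python (the value v = 0.8c and the one-second interval are just assumptions I picked for illustration):

```python
import math

c = 299_792_458.0   # speed of light in m/s
v = 0.8 * c         # example relative velocity (assumption for illustration)
T0 = 1.0            # proper time interval in the resting system, in seconds

gamma = 1.0 / math.sqrt(1.0 - v**2 / c**2)   # γ = 1/√(1 - v²/c²) ≥ 1
T_prime = gamma * T0                         # T' = γ ⋅ T0

print(f"gamma = {gamma:.3f}")                # ≈ 1.667 for v = 0.8c
print(f"T' = {T_prime:.3f} s  vs  T0 = {T0:.3f} s")
```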
We see that T' ≥ T0. We further know that "moving clocks run slow." So in the resting reference frame, time runs faster, meaning more time passes in the resting frame relative to the moving one. So a smaller T (in this case T0) ⇒ more time passing relative to the other frame.

Now for general relativity. Let's imagine a source of gravitation, e.g. a planet, with T1 being a time interval measured close to that planet and T2 being a time interval measured further away. We have

T2 = (1 + gh/c²) ⋅ T1

where
g is the gravitational acceleration
h is the height difference between the two measurement points (≈ height above the ground)
c is the speed of light
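Again a minimal numerical sketch, plugging rough Earth-surface values into the formula above (g = 9.81 m/s² and h = 1000 m are just example assumptions):

```python
g = 9.81             # gravitational acceleration in m/s² (Earth surface, assumed)
h = 1.0e3            # height difference between the two clocks in m (example value)
c = 299_792_458.0    # speed of light in m/s
T1 = 1.0             # interval measured close to the planet, in seconds

extra = (g * h / c**2) * T1   # additional time elapsing further away
T2 = T1 + extra               # T2 = (1 + gh/c²) ⋅ T1

print(f"T2 - T1 = {extra:.3e} s per second of T1")   # ≈ 1.1e-13 s, so T2 ≥ T1
```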
We see that T2 ≥ T1. We further know that "clocks close to a source of gravitation run slow." So at a place closer to the source, time runs slower, meaning less time passes there relative to a place further away. So a smaller T (in this case T1) ⇒ less time passing.

Now how can this be? How can a smaller time interval imply more time having passed in one case and less time having passed in the other? Where am I wrong?

Thanks in advance