Hi, I think I have a problem with how I'm thinking about time dilation. Here it is:

If two events occur at the same position (say, a light beam going up and then back down) and this happens in frame S', then the interval between those two events is the proper time, right? Now the equation for time dilation is Δt = γ Δt₀ (where γ is the gamma factor), so if I'm in frame S and watch S' go by me at very high speed, the time interval I measure is increased by the factor γ.

Assuming everything I said so far is correct: wouldn't that mean I'm seeing their time go by faster? Say the proper time between the events in S' is 2 seconds, and γ = 2. Then in S, I would see the whole thing happen in Δt = 4 seconds. So if I see the whole thing go by in 4 seconds, doesn't that mean that when I look at their clock, their time is going by faster? Everywhere I read, it says moving clocks run slower... help?

THANKS for any reply!
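In case it helps, here's a quick numerical sketch of the arithmetic I'm describing (the `gamma` helper and the choice of speed are just mine, picked so that γ comes out to 2):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def gamma(v, c=C):
    """Lorentz factor for a relative speed v (m/s)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Pick the speed that gives gamma = 2: v = c * sqrt(3)/2, about 0.866c.
v = C * math.sqrt(3) / 2
g = gamma(v)

proper_time = 2.0            # seconds between the two events, measured in S'
                             # (both events at the same position there)
dilated_time = g * proper_time  # interval the observer in S measures

print(g)             # ~2.0
print(dilated_time)  # ~4.0 seconds: S measures a LONGER interval between
                     # the same two ticks of the S' clock
```

So the S observer really does measure 4 seconds for the 2-second S' process, which is where my confusion comes in: is that "faster" or "slower"?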