We know that the proper time between two events is the shortest possible time between those two events that can be measured in any frame. This follows from the idea that moving clocks run slow: a clock at rest in a frame S′, which moves relative to S at a constant speed v, is time dilated. From the Lorentz transformation it follows immediately that if two events happen at the same location in S′, every other frame assigns MORE time to the interval than the proper time Δt′ measured in S′; quantitatively, Δt = γΔt′ > Δt′.

Now, I've also heard that in general relativity a freely falling clock follows a geodesic, and that a geodesic is precisely the worldline along which the clock's proper time is MAXIMIZED.

Isn't it strange that in SR the proper time is the smallest possible time between two events, while in GR the proper time is maximized by a clock that experiences no non-gravitational forces? How are these two ideas compatible? Why does one minimize while the other maximizes proper time?
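To make the two claims in the question concrete, here is a small numerical sketch (the function names and the chosen speeds are my own, purely illustrative). The first part checks the SR statement that any frame other than S′ assigns a longer coordinate time γΔt′ to the interval; the second part checks, in flat spacetime, that between the same two events an inertial (geodesic) worldline accumulates more proper time than an out-and-back worldline, which is the sense in which geodesics "maximize" proper time:

```python
import math

def gamma(v, c=1.0):
    """Lorentz factor for speed v (c = 1 by default)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Claim 1: a clock at rest in S' logs proper time dt_prime between two
# co-located events; any frame moving at speed v relative to S' assigns
# the longer coordinate time gamma(v) * dt_prime to the same interval.
dt_prime = 1.0
for v in (0.1, 0.5, 0.9):
    assert gamma(v) * dt_prime > dt_prime

# Claim 2 (twin-paradox setup, flat spacetime): between the SAME two
# events, the inertial worldline accumulates the MOST proper time.
T = 10.0              # coordinate time between the events in S
tau_inertial = T      # clock at rest in S: tau = T
for v in (0.1, 0.5, 0.9):
    # out-and-back worldline at speed v, turning around at T/2;
    # each leg is dilated, so tau = T * sqrt(1 - v^2) < T
    tau_kinked = T * math.sqrt(1.0 - v ** 2)
    assert tau_kinked < tau_inertial
```

Note that the two claims compare different things: the first compares coordinate times assigned by different frames to one pair of events on one clock's worldline, while the second compares proper times accumulated along different worldlines between the same pair of events.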