I'm a grade 11 student with a question (not for homework, just something I've been wondering about). I haven't formally learned calculus yet, but I've been dabbling with online tutorials. For relatively small distances above Earth's surface, we can approximate the acceleration of a falling object as constant (9.8 m/s²). However, that approximation obviously breaks down once you get farther away. How can we describe the motion of an object falling to Earth from very far away? Does it have a constant jerk (rate of change of acceleration), or does the jerk change too? This is related to a question I read on this forum (https://www.physicsforums.com/showthread.php?t=99555).
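
To show what I mean numerically, here is a minimal sketch assuming Newton's inverse-square law, a(r) = GM/r² (the constants below are the standard textbook values, not something from the linked thread):

```python
# Check that gravitational acceleration is not constant far from Earth,
# assuming Newton's inverse-square law: a(r) = G*M / r**2.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # mass of Earth, kg
R = 6.371e6     # mean radius of Earth, m

def accel(r):
    """Gravitational acceleration (m/s^2) at distance r from Earth's center."""
    return G * M / r**2

for mult in (1, 2, 10):
    print(f"r = {mult:2d} Earth radii: a = {accel(mult * R):.4f} m/s^2")
```

At the surface this gives roughly 9.8 m/s², but at twice Earth's radius it is only a quarter of that, which is why I suspect the jerk can't be constant either: the acceleration depends on r, and r itself changes as the object falls.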