
The object is moving in a 3D world.

Now, let $S_{t_1 \to t_2}$ denote the displacement vector from moment $t_1$ to moment $t_2$.

Now, let's say that $t_0$ is the moment the motion begins, and $t_f$ is the last moment of movement.

We can split the time from $t_0$ to $t_f$ into small bands, each $x$ seconds long.

Now, let's add up the **lengths** of the vectors:

$$|S_{t_0 \to t_0+x}| + |S_{t_0+x \to t_0+2x}| + |S_{t_0+2x \to t_0+3x}| + \dots + |S_{(\text{something}) \to t_f}| = Y$$

Now, it seems obvious (at least to me) that if you make $x$ smaller and smaller ($x \to 0$), then the value of $Y$ gets closer and closer to the

**distance** traveled by the object.
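To see the claim in action, here is a quick numerical sketch. The path (one trip around the unit circle, so the true distance is $2\pi$) and the function names are my own choices for illustration, not part of the question; the sum of chord lengths is exactly the $Y$ defined above.

```python
import math

def position(t):
    # Example path: unit circle in the z = 0 plane (my own choice).
    return (math.cos(t), math.sin(t), 0.0)

def chord_sum(t0, tf, x):
    """Compute Y: the sum of |S_{t -> t+x}| over bands of width x.
    The last band may be shorter, matching the '(something) -> tf' term."""
    total, t = 0.0, t0
    while t < tf:
        t_next = min(t + x, tf)
        # |S_{t -> t_next}| is the straight-line distance between endpoints.
        total += math.dist(position(t), position(t_next))
        t = t_next
    return total

# True distance along the circle is 2*pi; Y approaches it as x -> 0.
for x in (1.0, 0.1, 0.01):
    print(f"x = {x}: Y = {chord_sum(0.0, 2 * math.pi, x)}")
```

Each chord slightly undershoots its arc, so $Y$ approaches the distance from below here, and shrinking $x$ closes the gap.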

First of all, am I right? Secondly (if so), how can it be proved?

Thanks !