# Paradoxical definition of the Derivative

1. ### danne89

181
A common definition I've read: the derivative of an arbitrary function is the change at one given instant.

It's hard to think about. I mean, movement, which is a change of position, cannot be defined over zero time.

Does Zeno's paradox have something to do with this? If I'm not mistaken, Zeno's paradox is about how the sum of infinitely many parts can be finite.

Very confusing... :grumpy:

2. ### matt grime

9,396
Or it could be that your "common definition" is just an illustrative explanation and not actually a rigorous definition at all.

3. ### danne89

181
Ahh. I see. But how does Zeno's paradox relate?

4. ### arildno

12,015
It doesn't.

Note that in the "ordinary" interpretation of Zeno's paradox, that paradox is RESOLVED by noting that, say, an infinite number of terms may add up to something finite.

5. ### HallsofIvy

40,504
Staff Emeritus
That's exactly WHY we need a rigorous definition for the derivative!

In fact, that's exactly what led Newton to develop the calculus! He wanted to show that the gravitational force depended on the distance from the mass (and, of course, the acceleration due to that force). But the distance could (in theory, anyway) be measured at any instant, while neither speed nor acceleration could, without calculus, even be DEFINED at a given instant. The fact that "speed at a given instant" (and therefore "acceleration at a given instant") could not even be defined was really Zeno's point, and the calculus was a way to do that.

Arildno, Zeno had several different "paradoxes". You, I think, are thinking of the one about "You can't cross this distance because you would first have to cross 1/2 of it, then 1/4 of it, then 1/8 of it, etc." danne89 was probably thinking of the "arrow" paradox: "At any given instant, the arrow is AT a specific point and therefore not moving! Since it is not moving at any given instant, it can't be moving at all."

Last edited: Feb 24, 2005
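
For reference, the rigorous definition the posters keep alluding to is the standard modern limit definition (textbook material, not stated anywhere in this thread):

$$
f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}
$$

The derivative at $a$ exists only when this limit exists; no "change in zero time" is ever computed, only the limiting value of change-per-unit-time over ever-smaller intervals $h$.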
6. ### danne89

181
Nice. But as I've heard it, Newton's definition wasn't rigorous.

7. ### dextercioby

12,314
Not by the standards of modern analysis. At that time, IT WAS RIGOROUS ENOUGH TO DELIVER THE CORRECT THEORETICAL RESULTS... namely, explaining the laws of Kepler.

Daniel.

8. ### danne89

181
So today's rigorous definition may turn out to be inexact tomorrow, when new physics demands it?

9. ### dextercioby

12,314
Yes, mathematicians like to think they haven't discovered everything... Anyway, the basics cannot change. I suspect point-set topology will be the same in the next 5000 years...

Daniel.

10. ### mathwonk

9,703
the derivative is not, even in a rough intuitive sense, the change of the function in zero time. rather, it is an approximation to the change of the function in UNIT time.

to estimate by extrapolation the change of the function in unit time, you take the change of the function over a small time interval delta(t) and divide that change by delta(t).

if delta(t) is very small, this result extrapolates the change over a small interval to a change over a unit time interval.

we do this for a sequence of smaller and smaller intervals, and then try to guess what number these results are tending toward.
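
mathwonk's shrinking-interval procedure is easy to see numerically. A minimal sketch (the function f(t) = t² and the point t = 1 are my own illustrative choices, not from the thread):

```python
# Approximate the derivative of f(t) = t**2 at t = 1 by computing
# difference quotients over smaller and smaller intervals delta_t.
# The function and the point are chosen arbitrarily for illustration.

def f(t):
    return t * t

t = 1.0
for k in range(1, 7):
    delta_t = 10 ** (-k)              # shrinking time interval
    change = f(t + delta_t) - f(t)    # change of f over that interval
    quotient = change / delta_t       # extrapolated change per unit time
    print(f"delta_t = {delta_t:g}, quotient = {quotient}")
```

No single quotient is "the change in zero time"; each is a change per unit time, and the printed values visibly tend toward 2, the derivative of t² at t = 1.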

11. ### matt grime

9,396
No, today's rigorous definitions are rigorous within the current rigorous definition of "rigorous definition". Newton's weren't rigorous at the time either. But this is a sign of the way the philosophy of mathematics has changed. What constituted a proof to Gauss, Euler, and even Galois often wouldn't pass muster in modern mathematics. That isn't to say their proofs were incorrect, but that there were some gaps, small ones, that they glossed over or ignored.

12. ### danne89

181
Conclusion: I'll look it up in a more rigorous book...