Derivative as a Fraction

  1. Mar 13, 2012 #1
    I know that you cannot view [itex]\frac{dy}{dx}[/itex] as a fraction, with dy and dx meaning infinitesimal changes in their respective variables, but what I do not understand is why not.

    Can anyone explain to me why you can't (correctly) view the derivative as a fraction?
     
  3. Mar 13, 2012 #2

    tiny-tim

    Science Advisor
    Homework Helper

    Hi Vorde! :smile:

    Short answer: it isn't a fraction, but in many circumstances you can treat it as a fraction (e.g. the chain rule, and converting dy/dx = xy to dy/y = x dx) :wink:
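    For instance, here is the usual separation-of-variables manipulation for that last equation, written out as a sketch (treating dy/dx purely formally as a fraction):

    [tex]\frac{dy}{dx} = xy \quad\Rightarrow\quad \frac{dy}{y} = x\,dx \quad\Rightarrow\quad \int\frac{dy}{y} = \int x\,dx \quad\Rightarrow\quad \ln|y| = \frac{x^2}{2} + C \quad\Rightarrow\quad y = Ae^{x^2/2}[/tex]

    The fraction-style step is really shorthand for integrating both sides and substituting, which is why it gives the right answer even though dy/dx isn't literally a quotient.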

    (Is anything in particular worrying you about that?)

    (someone else can give the long answer o:))
     
  4. Mar 13, 2012 #3
    No, nothing is particularly worrying me; I'm not having trouble or anything, I just don't see why it can't be.

    Thanks for the help though :)
     
  5. Mar 14, 2012 #4
    You CAN define it as a fraction of infinitesimal quantities. This is, in fact, how calculus was originally done by Newton and Leibniz; the original name for calculus was "the infinitesimal calculus". Then in the 1800s people felt that infinitesimals were too vague and nonrigorous, so they invented the notion of limits to provide a more solid foundation for calculus. But then in the twentieth century, Abraham Robinson, Edward Nelson, and others were able to put infinitesimals on a more rigorous foundation called nonstandard analysis, thus providing some justification for the work of Newton and others.

    Personally, I feel that it is better to get acquainted with calculus first with a nonrigorous infinitesimal approach to get a good intuition, e.g. by reading Silvanus Thompson's fantastic book Calculus Made Easy, written over a century ago. Then you can learn the subject from a more formal textbook. There is actually a textbook by Keisler that teaches calculus in a rigorous way using infinitesimals, and it's available for free online courtesy of the author: http://www.math.wisc.edu/~keisler/calc.html
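    (As a rough sketch of how the infinitesimal approach is made rigorous, in my own words rather than a quote from Keisler: you compute with a nonzero infinitesimal increment [itex]\varepsilon[/itex] and then take the "standard part", discarding the leftover infinitesimal:

    [tex]f'(x) = \operatorname{st}\!\left(\frac{f(x+\varepsilon)-f(x)}{\varepsilon}\right).[/tex]

    For example, with [itex]f(x) = x^3[/itex] the quotient is [itex]3x^2 + 3x\varepsilon + \varepsilon^2[/itex], whose standard part is [itex]3x^2[/itex]. So on this view the derivative really is a genuine quotient of infinitesimals, up to taking the standard part.)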
     
  6. Mar 14, 2012 #5

    arildno

    Science Advisor
    Homework Helper
    Gold Member
    Dearly Missed

    lugita:
    It is a common belief that Newton thought of the derivative as some sort of ratio between infinitesimals.
    This is completely mythical, however, and I cite the following from the Principia:

    "It may also be objected, that if the ultimate ratios of evanescent quantities are given, then their ultimate quantities are also given (...). But this objection is founded on a false supposition. For those ultimate ratios with which quantities vanish are not truly the ratio of ultimate quantities, but limits towards which the ratios of quantities decreasing without limit do always converge, and to which they approach nearer by than by any given difference , but never go beyond, nor in effect attain to, till the quantities are diminished in infinitum."
    http://books.google.no/books?id=lSo...eKQ4gSglbW1Dg&redir_esc=y#v=onepage&q&f=false

    Those are Newton's own words, at page 39, in Motte's translation, revised by Florian Cajori, the celebrated historian of mathematics.

    Andrew Motte's original 1729 text is here, at page 56:
    http://books.google.no/books?id=b3R...edir_esc=y#v=onepage&q=ultimate ratio&f=false



    Thus, we see that Newton basically had our "modern" limiting process in mind, rather than Leibniz's idea of ratios between infinitesimal quantities.
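    (To put that in today's notation, as a sketch rather than Newton's own symbolism: for [itex]y = x^2[/itex] the ratio of the vanishing increments is

    [tex]\frac{\Delta y}{\Delta x} = \frac{(x+\Delta x)^2 - x^2}{\Delta x} = 2x + \Delta x,[/tex]

    which converges to [itex]2x[/itex] as [itex]\Delta x \rightarrow 0[/itex] but never actually equals it for any nonzero increment; that is precisely a "limit towards which the ratios ... do always converge, ... but never go beyond" in the sense of the passage above.)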

    Bolzano, Cauchy, Weierstrass et al. just hammered out the details of Newton's original view.
     
    Last edited: Mar 14, 2012
  7. Mar 14, 2012 #6
    It's true that Cajori's translation gives that impression, but that is simply wrong. Cajori is simply putting his modern gloss on Newton's words. I can read Newton's Principia in the original Latin, and I feel that Motte's translation, excluding a few notable errors, is far more faithful to Newton, and that Cajori's revisions are for the most part unnecessary. And Cajori has the annoying habit of seizing upon Motte's words whenever they happen to be the same as or similar to words used in modern parlance, emphasizing them and repeating them to make it sound as if Newton was speaking in a more modern tradition.

    When Newton is discussing evanescent ratios, he does not simply mean finite quantities whose magnitude becomes smaller and smaller. He means quantities which start out being finite but then become actual infinitesimals, i.e. less than any finite number but not equal to zero. Look at page 51, for instance: he says the "distance GF may be less than any assignable", so he concludes that "the ratio of AG to Ag may be such as to differ from the ratio of equality [i.e. 1] by less than any assignable difference". And if you read in more detail, you will find clear discussion of his view that magnitudes come in three types: infinitely small, finite, and infinitely great.

    Finally, it should be noted that a lot of the confusion first-time readers of Newton have originates from the fact that Newton did not believe in real numbers. Historically, numbers were thought to only encompass natural numbers. Even fractions were just thought to be relations between numbers, and what we would consider real numbers were just relationships between geometric quantities. If you had two equal distances AB and CD, then the ratio of AB to CD would be called the "ratio of equality", which to them was quite a different concept from the number 1. The modern idea of real numbers (the idea, not the formal rigorous definition) originated with Rene Descartes, and probably the first serious mathematician to use it was Leibniz. That's why Leibniz's notation makes so much sense to us, whereas Newton seems to be speaking an alien geometric language going back to Euclid.
     
  8. Mar 14, 2012 #7

    arildno

    Science Advisor
    Homework Helper
    Gold Member
    Dearly Missed

    1. "Cajori is simply putting his modern gloss on Newton's words. "
    I don't see that motte's text at that place diverges from Cajori's at any substantial point.

    2. As for "less than any assignable" difference seems to me, at the most, that Newton didn't regard is particularly "meaningful" to talk about what sort of "magnitudes" remained, but that what mattered for calculational purposes, one might as well tret them as..0. This is NOT the same as declaring the actual existence of some numbers whose magnitudes ARE infinitesemal, in the way Leibniz did.
    Just that Newton refuses to take a position on that point.
    In this manner, he proceeds as the pragmatic physicist, and discards what he thinks of as "useless speculations".
    That he therefore can be justly charged of not having "thought through" what he actually has been saying, in a formalized, ultimately logically consistent manner is probably true, and you have made many interesting points here that I was unaware of, and has been significant in shifting my view somewhat. Thank you!

    Yet to me, Newton seems to hover, in an informal, brilliant manner, bridging the gap between the Greeks and us moderns, probably as alien to them as to us, throwing out various ideas that ultimately bore fruit. Leibniz "cleansed" some of Newton's thinking by biting the bullet of the "infinitesimals", discarding the conception of the limiting process, while Newton himself wasn't quite ready to take the step away from the mingling of those two ideas.

    Or, something like that. He was an oddball. :smile:
     
  9. Mar 14, 2012 #8
    It's true that Newton thought that for calculational purposes you could just plug in zero wherever you saw an infinitesimal, because he thought that infinitesimal magnitudes were by definition infinitely small, so they could be neglected or added at will. But as I said, if you read through the Principia you will find that Newton does believe that there are such things as infinitely small magnitudes, finite magnitudes, and infinitely large magnitudes.
    We have to be careful here. There is indeed a sense in which Newton did not believe in infinitesimal numbers and Leibniz did. But that's only because Newton believed that only the natural numbers were numbers; everything else we would consider a number was just a ratio, or relation, to him, and thus you couldn't manipulate them algebraically willy-nilly. As I said, he believed this even about what we would today call √2 and pi. But he did believe you can have things like infinitely small lengths and infinitely small ratios of lengths.

    Incidentally, you may be interested to know that since they didn't believe that ratios were numbers, it was very difficult to do things like add, subtract, or compare them. For this purpose, they would use the Eudoxian theory of proportions described in Euclid's Elements, which in modern language says that two real numbers are equal if the same rational numbers are less than and greater than them; this was the inspiration for Dedekind's construction of the real numbers. I just found a brief explanation here.
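    (Stated in modern symbols, as I read the Eudoxian definition in Book V of the Elements: the ratios a:b and c:d are the same precisely when, for all natural numbers m and n,

    [tex]ma > nb \;\Leftrightarrow\; mc > nd, \qquad ma = nb \;\Leftrightarrow\; mc = nd, \qquad ma < nb \;\Leftrightarrow\; mc < nd,[/tex]

    i.e. exactly the same fractions n/m fall below, match, or exceed both ratios, which is essentially Dedekind's cut condition.)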
    As I said, I think neither Newton nor Leibniz conceived of the limiting process as staying finite, as we would today. I think the modern idea of a finite limiting process originated as a response to Bishop Berkeley's scathing attack in The Analyst, which rightly criticized the lack of rigor of both Newton and Leibniz.
     
  10. Mar 14, 2012 #9
    Nothing could be further from the truth. Newton was well aware of the logical difficulties of defining the limit of the difference quotient as a fraction. Over his career he tried several different approaches, without success, to make the derivative logically sound.

    It's true that mathematicians didn't have the tools to rigorously define the derivative. But it's NOT true that they were unaware of the problem. At least in Newton's case we can document his unsuccessful struggles trying to make logical sense of the limit of the difference quotient.

    Bishop Berkeley (British pronunciation "Barkley", like the basketball player) famously called dx and dy the "ghosts of departed quantities." Pretty good line, don't you think?

    http://en.wikipedia.org/wiki/The_Analyst
     
  11. Mar 14, 2012 #10
    I beat you to the Berkeley reference! :smile: I agree that Newton saw some of the logical difficulties with derivatives, and how in some sense it was like a trick to get away with dividing by zero; he discusses and responds to this issue in one scholium in the Principia. But I think he also had a clear notion that not all magnitudes were finite.
     