
When can I apply the idea of differentials?

  1. Jan 24, 2010 #1
    My calculus book says sometimes derivatives can be regarded as the ratio of differentials, and sometimes they can't. Apparently, there's a similar rule for integrals. When can I think of derivatives and integrals as operations with differentials? And when can't I?
     
  3. Jan 29, 2010 #2

    HallsofIvy (Staff Emeritus, Science Advisor)

    What exactly does your book say? What are its exact words? While, strictly speaking, derivatives are NOT ratios, I can't think of a case in which they cannot be regarded as ratios of differentials.
     
  4. Jan 29, 2010 #3
    The slope of a tangent line can be thought of as the ratio of the small change in y, dy, to a given small change in x, dx. This is a ratio.
     
  5. Jan 29, 2010 #4
    Actually, now that I think about it, aren't all derivatives the ratio of differentials? I say this based on the definition of differentials:

    dx = (dx/dy) dy, where dx/dy is the derivative of x with respect to y. Dividing both sides by the differential dy therefore gives:

    dx/dy = (dx/dy). What do you think?
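    A quick numerical check of the same idea, read in the usual orientation dy = (dy/dx) dx (a sketch only; the function f(x) = x^3 and the point x = 2 are my arbitrary choices, not from the thread):

    ```python
    # Sketch: for y = f(x), the ratio of the small finite differentials
    # dy/dx approaches the derivative as dx shrinks.
    # f(x) = x**3 and x = 2.0 (so f'(2) = 12) are arbitrary illustration choices.

    def f(x):
        return x ** 3

    x = 2.0
    for dx in (1e-1, 1e-3, 1e-5):
        dy = f(x + dx) - f(x)                     # small deviation in y produced by dx
        print(f"dx={dx:g}  dy/dx={dy / dx:.6f}")  # ratio tends to f'(2) = 12
    ```

    The printed ratios approach 12, which is the sense in which dy/dx behaves as a genuine quotient in the limit.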

    PS. 1: I think this whole differential business is one big mess because every author seems afraid to address the subject in a clear way. The consequence is that students like me have to go to great lengths to understand whether differentials are or aren't a legitimate mathematical concept that can be employed without fear.

    PS. 2: Thank you all for your replies.
     
  6. Jan 29, 2010 #5
    Originally, differentials were considered to be small deviations of the variables, and their ratio was an approximation to the derivative. The question was: given a very small deviation in one variable, what is the resulting deviation in the other, provided these deviations are small, where "small" means small enough to give a good linear approximation, much like a regression.

    So, for instance, if x^2 - y = 0, then at the point (1, 1) we have 2 dx - dy = 0, meaning that twice the small deviation in x is, approximately, the small deviation in y, where "approximately" means that this gives the best linear relationship for small enough deviations. I always think of differentials as small deviations, and in physics that is how they are thought of.
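    As a concrete check of that example (a sketch; the curve, the point, and the relation 2 dx - dy = 0 are the ones above):

    ```python
    # Sketch of the example above: on the curve x**2 - y = 0 (i.e. y = x**2),
    # near (1, 1) the differential relation 2 dx - dy = 0 predicts dy = 2 dx.
    # The true deviation in y differs from that prediction by exactly dx**2,
    # so the linear approximation improves rapidly as dx shrinks.

    x0, y0 = 1.0, 1.0
    for dx in (1e-1, 1e-2, 1e-3):
        actual_dy = (x0 + dx) ** 2 - y0   # true change in y along the curve
        linear_dy = 2 * dx                # change predicted by the differential
        print(f"dx={dx:g}  actual dy={actual_dy:.6f}  "
              f"2 dx={linear_dy:.6f}  error={actual_dy - linear_dy:.6f}")
    ```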

    In modern mathematics, differentials are thought of as linear 1-forms that map tangent vectors to numbers. This is a formalism that obscures the fundamental intuition.
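    In symbols, a minimal sketch of that modern picture for a single real function y = f(x) (this is the standard definition, stated here for concreteness):

    ```latex
    % The differential of f at a point p is the linear map sending a
    % tangent vector v to f'(p) v:
    \[
      dy_p(v) = df_p(v) = f'(p)\, v .
    \]
    % With x the identity coordinate we have dx_p(v) = v, so for any
    % v \neq 0 the classical "ratio of differentials" is recovered:
    \[
      \frac{dy_p(v)}{dx_p(v)} = \frac{f'(p)\, v}{v} = f'(p).
    \]
    ```

    So even in the 1-form picture, the derivative really is the ratio of the two differentials evaluated on the same nonzero tangent vector, which is why the old intuition survives the formalism.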
     