Making Sense of Differentials

  • #1
When I first came across differentials, I was told that they could be thought of as infinitesimal changes. However, I can't get my head around how they're actually used to model physical problems. For example, if ##x## is the x-coordinate of a moving body, then ##dx## is an infinitesimally small change in position. More generally, if ##\vec{r}## is the position vector of a body, then ##d\vec{r}## is an infinitesimally small change in position. What I don't understand is how we think of things like distance, mass, and area in terms of differentials. For example, we think of ##dA## as an infinitesimally small area; ##dm## as an infinitesimally small mass; and ##ds## as an infinitesimally small distance. Why do we avoid the concept of "change" when talking about mass, area, and distance (to name a few)?
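To make the question concrete, here is the sort of usage I have in mind (a generic textbook example, not a specific problem): for a thin rod of length ##L## with linear mass density ##\lambda(x)##, one writes
$$dm = \lambda(x)\,dx, \qquad m = \int_0^L \lambda(x)\,dx,$$
and similarly ##ds## gets summed into an arc length and ##dA## into an area. Here ##dm## reads as a tiny "amount" of mass sitting in the slice ##dx##, not as a "change" in anything, and that is exactly the shift in viewpoint I'm struggling with.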
 
  • #2
I was first introduced to differentiation at the age of 16, in my first calculus class. We were not told what ##\frac {dy}{dx}## actually meant; we were just given "tips and tricks" for dealing with differentiation and integration. The first rule we were taught was "when you differentiate a polynomial, multiply by the old power of x and then reduce the power by 1." It wasn't until differentiation came up in my mechanics and physics modules that the concept was explained in the terms you have described. It may help to think of ##dA##, ##dm##, and ##ds## as infinitesimally small "amounts" of something rather than as changes. Can you provide an example of where you believe the concept of "change" has been abandoned?
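(For reference, that first rule written out is just the power rule: ##\frac{d}{dx}x^n = n\,x^{n-1}##, so for example ##\frac{d}{dx}x^3 = 3x^2##.)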
 
  • #3
That's the problem with taking calculus so young: you don't yet have the "maturity" to grasp the theory behind the math (or at least your teachers don't think you do), so you just learn "rules". The derivative is defined in terms of a limit: [itex]\frac{dy}{dx}= \lim_{h\to 0} \frac{f(x+h)- f(x)}{h}[/itex]. One consequence is that, while the derivative is NOT a fraction, it can be "treated like one". You cannot, for example, prove the "chain rule", [itex]\frac{dy}{dx}\frac{dx}{dt}= \frac{dy}{dt}[/itex], just by saying "the dx terms cancel", but you can prove it by going back before the limit, cancelling in the "difference quotient", and then taking the limit again.
Because the derivative can be "treated like a quotient", we introduce the "differentials", dx and dy, separately to make use of that property. You say you were "told that they could be thought of as infinitesimal changes". That's fine if you were taught "non-standard" calculus, where the notion of "infinitesimals" is actually introduced rigorously. But that requires some very deep symbolic logic showing that one can introduce infinitesimals into the number system (and what you wind up with is NOT the standard real number system). Lacking that, you need to consider "differentials" as just a convenient notation that happens to work.
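To sketch that argument (a sketch only, assuming ##\Delta x \neq 0## for the small increments involved so the cancellation is legitimate):
$$\frac{\Delta y}{\Delta t} = \frac{\Delta y}{\Delta x}\cdot\frac{\Delta x}{\Delta t} \;\to\; \frac{dy}{dx}\cdot\frac{dx}{dt} \quad\text{as } \Delta t\to 0.$$
The "cancellation" happens between genuine finite quotients before the limit is taken, which is why the final statement looks as if the dx terms cancelled.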
 
  • #4
Why do we avoid the concept of "change" when talking about mass, area, and distance (to name a few)?

I'm not sure what you are asking. "Change" of things is often dealt with in physics. Is the question: "Why do we use expressions where symbols like [itex] dy [/itex] appear as ordinary variables instead of only as part of a fraction like [itex] \frac{dy}{dx} [/itex] ?"

If you browse contributions to old threads concerning differentials, you won't find a unanimous point of view, even among experts who generally agree on other aspects of mathematics. Differentials have a standard definition in differential geometry, but there is no standard definition in elementary calculus; the treatment varies from textbook to textbook. Physics texts often reason with differentials without establishing any rigorous definition for them.
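For what it's worth, one common elementary-calculus treatment (not universal, as noted) simply defines the differential as the linear part of the change: for [itex]y = f(x)[/itex],
$$dy = f'(x)\,dx, \qquad \Delta y = f(x+\Delta x)-f(x) \approx dy \ \text{ when } dx=\Delta x \text{ is small},$$
which makes [itex]dy[/itex] an ordinary finite variable depending on [itex]x[/itex] and [itex]dx[/itex], with no infinitesimals involved.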
 
  • #5

I write the following assuming this post was a reply to my own.

I had calculus classes for two years before university, using the "tricks" we had been taught. During that time I had trouble with the concepts of differentiation and integration because they had never been properly explained. I suppose the teaching followed the order in which Newton developed the subject, and the concept of the limit was not mentioned until I reached university, where we learned differentiation from first principles in our first week of calculus. After that week, the differentiation and integration that had been a slight problem for me before were now "simple", so to speak. I felt I had a much deeper understanding of the calculus I had learned over the previous two years, and I was more confident in my ability to differentiate functions more complicated than polynomials. I feel the concept of a limit should have been introduced at the earliest point, so that once we understood the groundwork we could move on to the "tricks" that make differentiation and integration easier. Although you could argue that the two years of study before university gave me the foundation to appreciate the further insight gained from the introduction of limit notation and the explanation of differentiation from first principles.


Anyway, back to the topic at hand. OP, I suggest reading "Zero: The Biography of a Dangerous Idea" by Charles Seife. The book focuses on the concepts of 0 and ∞, building up to an explanation of the development of calculus and the different notations used. It also explains how Newton developed the idea (and his notation) and how Newton's ideas differed slightly from those of Leibniz. It then goes on to say a little about the 0/0 problem and l'Hôpital's rule. The book is popular science and as such is easy to digest, if a little slow at getting to what it's trying to convey.
 
