
- Thread starter PFuser1232

- #1


- #2


- #3

HallsofIvy

Science Advisor

Homework Helper


That's the problem with taking Calculus so young: you don't have the 'maturity' yet to grasp the theory behind the math (or at least your teachers don't think you do), so you just learn "rules". The derivative is defined in terms of a limit: [itex]\frac{dy}{dx}= \lim_{h\to 0} \frac{f(x+h)- f(x)}{h}[/itex]. One result of that is that, while the derivative is NOT a fraction, it can be "treated like one". You cannot, for example, prove the "chain rule", [itex]\frac{dy}{dx}\frac{dx}{dt}= \frac{dy}{dt}[/itex], just by saying "the dx terms cancel", but you can prove it by going back before the limit, cancelling in the "difference quotient", and then taking the limit again.
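[Editorial note: for concreteness, the pre-limit cancellation described above can be sketched as follows. This is only a sketch; the standard rigorous proof must also handle the case where the increment of x vanishes.]

```latex
\frac{dy}{dt}
  = \lim_{h\to 0} \frac{y(t+h)-y(t)}{h}
  = \lim_{h\to 0} \frac{y(t+h)-y(t)}{x(t+h)-x(t)} \cdot \frac{x(t+h)-x(t)}{h}
  = \frac{dy}{dx}\cdot\frac{dx}{dt},
\qquad\text{provided } x(t+h)\neq x(t) \text{ for small } h\neq 0.
```

Here the cancellation happens in the difference quotients, before the limit is taken, which is what makes the "fractions cancel" heuristic come out right.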

Because the derivative can be "treated like a quotient", we introduce the "differentials" dx and dy separately to make use of that property. You say you were "told that they could be thought of as infinitesimal changes". That's fine if you were taught "non-standard" Calculus, where the notion of "infinitesimals" is actually introduced rigorously. But that requires some very deep "symbolic logic" showing that one can introduce "infinitesimals" into the number system (and what you wind up with is NOT the standard real number system). Lacking that, you need to consider "differentials" as just a convenient notation that happens to work.
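[Editorial note: as a purely illustrative numerical sketch (this example is not from the thread), one can watch the difference quotient from the limit definition approach the derivative as h shrinks:]

```python
def difference_quotient(f, x, h):
    """The pre-limit ratio (f(x+h) - f(x)) / h from the definition of the derivative."""
    return (f(x + h) - f(x)) / h

f = lambda x: x**3  # f'(x) = 3x^2, so f'(2) = 12

for h in (1e-1, 1e-3, 1e-5):
    print(h, difference_quotient(f, 2.0, h))
# the printed values approach 12 as h -> 0
```

No single h gives the derivative exactly; the derivative is the limit, which is why dy/dx is not literally a fraction.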



- #4

Stephen Tashi

Science Advisor


Why do we avoid the concept of "change" when talking about mass, area, and distance (to name a few)?

I'm not sure what you are asking. "Change" of things is often dealt with in physics. Is the question: "Why do we use expressions where symbols like [itex] dy [/itex] appear as ordinary variables instead of only as part of a fraction like [itex] \frac{dy}{dx} [/itex] ?"

If you browse contributions to old threads concerning differentials, you won't find a unanimous point of view - even from the experts who generally agree on other aspects of mathematics. I think differentials have a standard definition in differential geometry, but there is no standard definition in elementary calculus. The treatment of differentials in calculus varies from textbook to textbook. Physics texts often reason with differentials without establishing any rigorous definition for them.
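[Editorial note: to make the contrast concrete, many elementary calculus texts simply *define* the differential of y = f(x) by treating dx as an independent real variable:]

```latex
dy = f'(x)\,dx,
\qquad
\Delta y = f(x+dx) - f(x) = dy + o(dx) \quad\text{as } dx \to 0,
```

so dy is an exact quantity that approximates the actual change Δy for small dx. In differential geometry, by contrast, the differential df is defined as a linear map on tangent vectors, which gives the symbol a meaning that does not depend on any quotient notation.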


- #5


HallsofIvy said:

That's the problem with taking Calculus so young: you don't have the 'maturity' yet to grasp the theory behind the math (or at least your teachers don't think you do), so you just learn "rules". The derivative is defined in terms of a limit: [itex]\frac{dy}{dx}= \lim_{h\to 0} \frac{f(x+h)- f(x)}{h}[/itex]. One result of that is that, while the derivative is NOT a fraction, it can be "treated like one". You cannot, for example, prove the "chain rule", [itex]\frac{dy}{dx}\frac{dx}{dt}= \frac{dy}{dt}[/itex], just by saying "the dx terms cancel", but you can prove it by going back before the limit, cancelling in the "difference quotient", and then taking the limit again.

Because the derivative can be "treated like a quotient", we introduce the "differentials" dx and dy separately to make use of that property. You say you were "told that they could be thought of as infinitesimal changes". That's fine if you were taught "non-standard" Calculus, where the notion of "infinitesimals" is actually introduced rigorously. But that requires some very deep "symbolic logic" showing that one can introduce "infinitesimals" into the number system (and what you wind up with is NOT the standard real number system). Lacking that, you need to consider "differentials" as just a convenient notation that happens to work.

I write the following assuming this post was a reply to my own.

I took calculus classes for two years before university, using the 'tricks' we had been taught. Throughout that time I struggled with the concepts of differentiation and integration, because they had never been properly explained. I suppose the teaching followed the order in which Newton developed the subject, and the concept of the limit was not mentioned until I reached university, where we learned differentiation from first principles in our first week of calculus. After that week, the differentiation and integration that had troubled me before became 'simple', so to speak. I felt I had a much deeper understanding of the calculus I had learned over the previous two years, and more confidence in my ability to differentiate functions more complicated than polynomials. I think the concept of a limit should be introduced at the earliest point; once students understand the groundwork for what they are doing, they can move on to the 'tricks' that make differentiation and integration easier. Then again, you could argue that the two years of study before university gave me the foundation to appreciate the insight gained from limit notation and differentiation from first principles.

Anyway, back to the topic at hand. OP, I suggest reading 'Zero: The Biography of a Dangerous Idea' by Charles Seife. The book focuses on the concepts of 0 and ∞, building up to an explanation of the development of calculus and the different notations used. It also explains how Newton developed the idea (and his notation) and how his ideas differed slightly from Leibniz's, then goes on to cover a little about the 0/0 problem and l'Hôpital's rule. The book is popular science and as such is easy to digest, if a little slow in getting to what it's trying to convey.
