# Understanding differentials in Calculus 1?

1. Jul 14, 2013

### Matriculator

I'm taking a short Calculus session this summer and the teacher zooms through things. I still don't fully understand differentials. I know that the derivative gives you the slope of a function at any point. And I know that dy is a small change in y and dx is a small change in x, and how they can be used to approximate things. But is that all there is to it? The fact that dy is a small change in y and dx is a small change in x? I noticed that they're used in integrals (we just started learning about them) when referring to what we're integrating with respect to. Why do we use dy/dx for the derivative but just dx in integrals? Thank you.

2. Jul 14, 2013

### micromass

Staff Emeritus
I'd like to make one thing very clear. There is no such thing as $dx$. It is merely a symbol. It does not represent a "very small" number; it does not represent any number at all.

The notation $\frac{df}{dx}$ is not a fraction. It is a notation. The notation $\int f(x)dx$ is also just a notation.

Do not think of $dx$ as a number. It will confuse you a great deal, and it is not how mathematics sees it. In fact, calculus doesn't work with $dx$ at all except in things like $\frac{df}{dx}$ or $\int f(x)dx$. And even there, it is just a notation.

That said, there is a way to give meaning to $dx$ as a differential form. This is done in differential geometry. But I don't want to confuse you even more.

3. Jul 14, 2013

### Matriculator

Oh wow thank you so much for this. Why don't teachers tell us this? I was initially told that dy/dx isn't a ratio, then in differentials I was told that it was and dx was a small change in x. So when I saw it in integrals I thought that we were multiplying what we were to integrate by a small change in x lol. Thank you again for the clarification.

4. Jul 14, 2013

### Bacle2

The differential of a differentiable function is the linear function that (locally) approximates the change of the function at a point x. Basically, differentiability of f at x means that _the change of f_ can be locally described by a linear function. As an example, for $f(x)=x^2$ we have $f'(x)=2x$, so the differential is $df=2x\,dx$. This means that the linear function $2x\,dx$ approximates, as closely as you want (in a precise δ-ε sense), the change of $f(x)=x^2$ near a point. Check it out by plugging in small changes in x: compare the change of $x^2$ between $x$ and $x+\Delta x$ with the change $2x\,\Delta x$ predicted by the linearization, and watch the two agree better and better as $\Delta x$ becomes small.

The symbol dy stands for the change _along the tangent line to y=f(x)_, while dx represents the change along the x-axis. OTOH, Δy, Δx represent the _actual changes_ and not the linearized ones. If the limit exists, then the derivative, and so the local linear approximation, also exists.
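
This comparison can be checked numerically. A small sketch (the base point x = 3 and the step sizes are my own illustrative choices): it compares the actual change of f(x) = x² with the change along the tangent line, and their gap shrinks like dx².

```python
# Compare the actual change of f(x) = x^2 with the linearized change
# dy = f'(x) * dx (the change along the tangent line) as dx shrinks.

def f(x):
    return x ** 2

def fprime(x):
    return 2 * x

x = 3.0
for dx in (1.0, 0.1, 0.01, 0.001):
    delta_y = f(x + dx) - f(x)   # actual change of the function
    dy = fprime(x) * dx          # linearized (tangent-line) change
    print(dx, delta_y, dy, delta_y - dy)  # the gap equals dx**2
```

The printed gap delta_y - dy is exactly dx² here, which is the δ-ε statement made concrete: the linear approximation error vanishes faster than dx itself.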

5. Jul 14, 2013

### micromass

Staff Emeritus
I think it's very wrong to teach $dx$ as a small change. I understand why they do it, but I think it's a disservice to the students.

I assume you've seen limits before? Fix a point $a$. We can define $\Delta x = x-a$ and $\Delta f = f(x) - f(a)$. This is a historic notation. Then we define

$$\frac{df}{dx} = \lim_{x\rightarrow a} \frac{\Delta f}{\Delta x}$$

This explains the notation for a derivative. So $dx$ is not a small change in itself, but should be seen as the limiting value of a quotient of small changes.
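
The limit above can be watched numerically. A minimal sketch (the point a = 2 and the shrinking steps are my own choices): the difference quotient for f(x) = x² approaches f'(2) = 4, and no single "dx" number ever appears.

```python
# Watch the difference quotient (f(a + h) - f(a)) / h approach the
# derivative f'(a) = 4 for f(x) = x^2 at a = 2 as h shrinks.

def f(x):
    return x ** 2

a = 2.0
for h in (0.1, 0.01, 0.001, 1e-6):
    quotient = (f(a + h) - f(a)) / h
    print(h, quotient)  # tends to 4; each quotient is an ordinary number
```

Every quantity in the loop is a plain real number; the symbol $\frac{df}{dx}$ names the limit they approach, not a ratio of two tiny numbers.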

When doing things intuitively, it is not wrong to think of $df$ and $dx$ as very small infinitesimal changes. This is a useful thing to do in physics. But I wish to stress that this is not at all rigorous, and not at all correct. It's a very good intuition, but incorrect. (Yes, incorrect things can be good intuition)

6. Jul 14, 2013

### verty

To find the slope of a curve, we must find the slope of the tangent to the curve. The slope of the tangent = $\frac{dy}{dx}$, the limit of $\frac{Δy}{Δx}$.

Now suppose we allow infinitesimals. Then there is a question of what the angle is between a tangent and a curve. It may be infinitesimal, in which case there are infinitely many tangents with different infinitesimal angles. Which one is THE tangent? Which do we use to define $\frac{dy}{dx}$? Radius of curvature is problematic too: we can't say what the curvature of a curve is.

So there are major problems to be overcome with infinitesimals. Mathematics moved on without them, although in the previous century they were reborn as Non-Standard Analysis.

7. Jul 14, 2013

### WannabeNewton

The notion of $dx$ being a "small change" is something that is prevalent in physics texts; it is used because it helps with calculations and derivations. It is by no means rigorous (far from it) so take it with a grain of salt. There is a mathematically precise definition of what $dx$ is from a geometric point of view but don't worry about that for now.

8. Jul 16, 2013

The differentials take on an actual value if you define or assign a value to the other differential and the derivative, right?

If I have $$y=x^2$$ then $$\frac{dy}{dx}=2x$$ and then $$dy=2x dx$$ then you have $$dy$$ in terms of the derivative and the differential.

9. Jul 16, 2013

### lurflurf

^Right. Given $y=x^2$ we define two functions ($\mathbb{R}^2 \rightarrow \mathbb{R}$):

$$\Delta y=(x+dx)^2-x^2=2x\,dx+dx^2$$
and
$$dy=(x^2)'\,dx=2x\,dx,$$
so
$$\Delta y-dy=dx^2.$$

So dx can be a small number, but it can also be a large number. We often write Δy ~ dy, meaning they are equal in some sense; that sense is local. If dx is large they are not close numerically:

if dx = 10^-6, then Δy − dy = dx^2 = 10^-12;
if dx = 10^6, then Δy − dy = dx^2 = 10^12.

A geometric interpretation is that the differential defines the tangent line.
The tangent line closely matches the function for small dx.
The tangent line exists for large dx, but we do not expect a close fit.
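
These two cases can be reproduced directly. A small sketch (x = 1 is my own choice of base point): Δy − dy equals dx² for both a tiny and a huge dx, so the differential is defined for any dx, but the tangent line only fits closely when dx is small.

```python
# Reproduce the gap between the actual change and the differential of
# y = x^2 for a tiny and a huge step dx; the gap is dx**2 in both cases.

def delta_y(x, dx):
    return (x + dx) ** 2 - x ** 2   # actual change of y = x^2

def dy(x, dx):
    return 2 * x * dx               # differential (tangent-line change)

x = 1.0
for dx in (1e-6, 1e6):
    print(dx, delta_y(x, dx) - dy(x, dx))  # ~1e-12, then exactly 1e12
```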

10. Jul 17, 2013

### verty

No one should write Δy ~ dy; this is wrong because it implies dy = 0, which is not true. And it is wrong in an irreconcilable way. dy is not an infinitesimal; forget this idea. It is a symbol that has meaning only in context, like when an integral sign precedes it.

$dy = 2x dx$ is short for $\frac{dy}{dt} = 2x \frac{dx}{dt}$ for some implicit variable $t$. It means, the rate that y changes is 2x * the rate that x changes.
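
This reading of $dy = 2x\,dx$ as a statement about rates can be checked numerically. A sketch under my own hypothetical choice of parametrization x(t) = sin(t): the rate of change of y = x² along t matches 2x times the rate of change of x.

```python
import math

# With x(t) = sin(t) and y = x^2, check that dy/dt = 2x * dx/dt,
# i.e. the rate y changes is 2x times the rate x changes.

def x(t):
    return math.sin(t)

def y(t):
    return x(t) ** 2

t = 0.7
h = 1e-6
dy_dt_numeric = (y(t + h) - y(t - h)) / (2 * h)  # central difference
dy_dt_chain = 2 * x(t) * math.cos(t)             # 2x * dx/dt
print(dy_dt_numeric, dy_dt_chain)                # the two rates agree
```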

I'm reporting the above post for being misleading after having been told it was so.

11. Jul 17, 2013

### micromass

Staff Emeritus
Verty, I absolutely agree with you. The notation is extremely misleading and wrong. It does a great disservice to the students who wish to learn calculus.

However, it appears that some calculus books, like Stewart, do teach things this way. I personally think that Stewart is a horrible book and should never be used in a classroom. But it is unfortunately a standard book.

I can only ask the OP to ignore the misleading post.

12. Jul 17, 2013

### lurflurf

Verty, I absolutely disagree with you. Why are you talking about infinitesimals? dy is some number. It has a meaning without integral signs. You then go on to use it yourself. I write Δy ~ dy to avoid confusion with global equality, the equality is local. Though it would be correct to use = as others do. It is rather important to not think of dx as needing to be small, though that is often the region of interest.

You are confused. See for example
http://en.wikipedia.org/wiki/Differential_of_a_function
or
any calculus book ever

13. Jul 17, 2013

### micromass

Staff Emeritus
The mathematically rigorous definition treats $dx$ and $df$ as functions, not as numbers. See any differential geometry book and many analysis books, for example Spivak's *Calculus on Manifolds* (or the very wiki you linked).

I propose we start referring to rigorous math books on the topic, and not wiki links. Any confused calc student can edit a wiki link.

I don't care what rubbish books like Stewart use. If they set the standard for mathematical concepts nowadays, then it's a sad world.

14. Jul 17, 2013

### WannabeNewton

If you open up a book on thermodynamics, you'll see lurf's usage of the differential quite commonly. It is just a difference in the usage of the differential; it is often simply convenient to think of it in that way because it works brilliantly in calculations. The rigorous definition of the differential is quite useless at that stage and not to mention much harder to visualize than the notion of the differential as a "small quantity of such and such". As long as the student knows the difference between a non-rigorous but highly effective computational tool and a rigorous definition of the differential, I don't see immediate harm. If everyone was rigorous all the time then much of the practical stuff would never get done. As wiki puts it: the precise meaning of the variables dy and dx depends on the context of the application and the required level of mathematical rigor.

I find it a bit contentious to report someone for something that actually exists in various texts. It's not like he/she is pulling things out of a hat.

15. Jul 17, 2013

### micromass

Staff Emeritus
Let's keep things on-topic here and let's not resort to insults.

16. Jul 17, 2013

### Stephen Tashi

It's an interesting cultural phenomenon that certain topics that don't pass muster as logical reasoning are traditionally given a pass in discussing mathematics. Many discussions about statistics illustrate this. Students get a slap on the hand for a wrong epsilon-delta proof of $Lim_{x \rightarrow 1} 2x + 1 = 3$, but differentials and infinitesimals slip by without similar scrutiny. Proofs of trigonometric identities are traditionally written backwards. Many texts state definitions in an "if...then..." form when they actually mean "if and only if". (e.g. "If the group operation is commutative then we say that G is an abelian group.")

The presence of differentials and infinitesimals in textbooks is a strong argument that these topics are important from a sociological point of view - given that mathematics as practiced by humans is not a completely logical adventure.

However, this doesn't change the slightly embarrassing fact that teachers expect one standard of rigor when they ask a student "What is the definition of a derivative of a function?" and a completely different standard if they ask "What is the definition of a differential?" In the math sections of the forum, I think it's appropriate to make it clear that differentials, from a calculus 1 perspective, do not have a precise definition. The fact that many people are eager to weigh in on threads such as "Are dx and dy numbers?" and "Is dy/dx a ratio?" indicates that many people wish to be helpful (and that many of them have strong private opinions), but it shouldn't be taken to mean that there is actually a precise and logical definition and development of differentials in Calculus I.

17. Jul 17, 2013

### lurflurf

Let's not be needlessly pedantic. Something can be a function and a number, like 5. Clearly there is a slight abuse of notation in declaring a variable is a function of another, or in using f(x) to denote both the value of f at x and the function f itself. It presents no problem. In fact, a great advantage of differentials is that they are coordinate invariant. One can learn a lot by looking through 2^7 or so calculus books. The good ones and bad ones both define differentials. This is a frequently misunderstood topic for reasons I don't understand. I think the explanation in most books is perfectly understandable and logical. The differential is a linearization of the function, determined by the function in the neighborhood of a point. This fits quite well with the notion in differential geometry, though often differential forms are used to make higher differentials coordinate invariant by making them zero. I often tell confused calculus students to try reading a book one level up, but I think in many cases reading differential geometry books might lead to more confusion. Stewart might be a better book than I gave it credit for. Maybe Micro can make a ten best and ten worst calculus book thread.

18. Jul 17, 2013

### Stephen Tashi

I don't understand the population from which the statistic is taken. Are you talking about most Calculus I books?

I don't think anyone disputes that differentials can be given a rigorous definition in an advanced context. It isn't being excessively pedantic to convey to a Calculus I student that differentials are not precisely defined in that course; a student shouldn't worry that a page was torn out of his book.

Can we give a Calculus student a version of the rigorous definition that provides a unified view of all the various situations where differentials appear, $dy/dx , \int f(x) dx$, "total differential", "area element"? I'm not posing this question as a debating point. I'm actually curious how it could be done.

19. Jul 17, 2013

### bolbteppa

If an author as esteemed as Gelfand is willing to make multiple approximations of the kind $\Delta y \approx dy$, for example in his proof of Noether's theorem in a calculus-of-variations context, then approximations of this kind are just something you'll have to live with. Constructing things in terms of differentials rigorously is methodical worker-bee stuff that merely codifies the intuition of geniuses who see approximations like these, and acts as a means to nullify incorrect intuition. So I would say eschewing the intuitive idea is probably far more harmful than good; it's a tool one should become proficient in.