# What's the purpose of Epsilon proofs for limits?

flyingpig
In all the problems I have done so far, the limit was already given. So the goal is to utilize the theorem to see whether the limit really holds.

But what's the point? If we already know how to find the limit, why must we go through a process of ingenious algebra just to tell ourselves, "okay, it works, and it only took us [insert a very long time]"?

Are there people who are that skeptical? Yes I realize the math was created before computers, but what's the point of having it now? If you seriously need a theorem with fancy words and symbols to confirm that your limit is right, doesn't that mean you are just unsure of what you are doing?

I mean, I finally understood how to use the epsilon definition (and its true meaning, thanks Mark44), but it looks like I am just computing limits backwards with this method.

Number Nine
Epsilon-delta is the definition of a limit. All those handy rules you learned were derived from epsilon-delta arguments. It's somewhat important.
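For reference, the definition in question reads:

$$\lim_{x \to a} f(x) = L \quad\Longleftrightarrow\quad \forall \varepsilon > 0,\ \exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.$$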

flyingpig
Epsilon-delta is the definition of a limit. All those handy rules you learned were derived from epsilon-delta arguments. It's somewhat important.

I thought the limit laws were, but not actually the results.

If you seriously need a theorem with fancy words and symbols to confirm that your limit is right, doesn't that mean you are just unsure of what you are doing?

Yes, it does. And being unsure is the proper outlook if what you are doing is guessing limits "just by looking at them" or using numerical computations. If people could do that reliably then there would be no need for rigorous proofs. Experience has shown that people cannot reliably determine limits "by inspection" or by computation. Hence the need for proofs. Teaching material does use simple examples. Most textbook exercises ask students to solve things that the World has already solved. That doesn't imply that the solution techniques aren't needed to do more sophisticated problems.

homeomorphic
Epsilon-delta is the definition of a limit. All those handy rules you learned were derived from epsilon-delta arguments. It's somewhat important.

They were derived that way in the sense that someone did it that way (Weierstrass or Cauchy or someone like that), but they were not INVENTED that way. We had most of those rules before we had epsilons and deltas.

He's quite right to ask what the point is. In fact, it took a lot of effort on Weierstrass's part to convince the mathematical community that it was needed. For a long time after calculus was invented, no one thought of epsilons and deltas. Cauchy started doing things almost that way, maybe early 1800s, and Weierstrass put it in the form that we see today a little later. Calculus was invented in the 1600s! It wasn't at all obvious that epsilons and deltas were needed back then, and it isn't any more obvious to students studying calculus today. In no way should we just take it for granted that epsilons and deltas are needed because my daddy said so and my daddy is a mathematician.

I suggest reading A Radical Approach to Real Analysis. You don't have to read the whole thing, but the first couple chapters get into this. You might take the history with a grain of salt, but I think the thrust of it is right.

Part of the reason for using the formal definition is to put things on a firmer footing (maybe more for proving more results than for showing things that were previously pretty clearly true, even without 100% rigor). But, maybe a stronger reason in some ways, is that mathematicians ran into problems when just using their intuition. They ran into situations where it wasn't clear what was going on, so they had to be a bit more careful in order to sort it out. Things like differentiating series term by term. When is it okay to do that? In a large number of examples, it doesn't cause any trouble. But, if you look at the right example, you start running into problems if you aren't rigorous. Things go wrong. So, you have to analyze what exactly goes wrong and what goes right when it does work. There's a reason to be skeptical. You can find examples that FORCE us to be so skeptical.

And we may not have been so sure of ourselves before, but now equipped with our epsilons and deltas, we CAN be sure of what we are doing.

So, it's a tool for exploring new territory, and, for dealing with some tricky problems, it can be a necessity.

I personally don't believe in introducing epsilons and deltas in ordinary introductory calculus class. It's a topic for math majors or for other people who just happen to need it.

Are there people who are that skeptical? Yes I realize the math was created before computers, but what's the point of having it now? If you seriously need a theorem with fancy words and symbols to confirm that your limit is right, doesn't that mean you are just unsure of what you are doing?

Consider the integral
$$\int_0^\infty \cos(2x) \prod_{n=1}^{\infty} \cos(x / n) \, dx = 0.392699081698724154807830422909937860524645434187231595926812285162\ldots$$
Now get out a calculator and compute $\frac{\pi}{8} = 0.3926990816987241548078304229099378605246461749\ldots$ Notice something? The two agree to more than 40 decimal places, yet they are not equal: the integral is strictly less than $\pi/8$. No amount of numerical computation alone would have told you that.

In fact, it took a lot of effort on Weierstrass's part to convince the mathematical community that it was needed.

Oh, definitely. Specifically it required his Weierstrass function (http://en.wikipedia.org/wiki/Weierstrass_function), which is continuous everywhere but differentiable nowhere. It's a pity we don't teach the history of mathematics in math courses. We do it for chem and phys.

Number Nine
They were derived that way in the sense that someone did it that way (Weierstrass or Cauchy or someone like that), but they were not INVENTED that way. We had most of those rules before we had epsilons and deltas.

The notion of a limit predates epsilon-delta, but the development of modern analysis/calculus was impossible without a rigorous definition. If your toolbox contains only the familiar Calculus I limit rules, you're going to be very limited.

homeomorphic
The notion of a limit predates epsilon-delta, but the development of modern analysis/calculus was impossible without a rigorous definition. If your toolbox contains only the familiar Calculus I limit rules, you're going to be very limited.

I respectfully disagree. In fact, I can recall a certain interview in which a Nobel Prize winning physicist quoted one of his physics profs as accusing the math professors of "wasting your time with epsilons and deltas", and sympathized with this view himself.

Very limited by a mathematician's standard, yes. If you want to go on in math, and sometimes in other areas that are very mathematical, you have to learn about delta-epsilon proofs. But you can do a lot without them. How many practicing engineers in the field know about epsilon-delta proofs? I'm sure there are some out there who have uses for them, but on the whole, I think it's unlikely that you'll find that most of them know it very well. Yet, they are the ones who are responsible for the most practical stuff that keeps our society running.

I almost got a degree in EE before switching to math. I saw deltas and epsilons in high school and never saw them again until I changed my major to math. If I had continued down that road, I probably still would never have seen them (unless I went to grad school). And I don't think I would have been terribly hurt as an engineer by it.

Also, you said, "those rules you learned". Most of what you learn in basic calculus can be argued for without epsilons and deltas, and that's the way it was originally.

We had most of those rules before we had epsilons and deltas.
It would be interesting to know what rules existed at what times in history. For example, in Augustus DeMorgan's "The Differential And Integral Calculus" 1842, I don't see any statements corresponding to the commonly used theorems about limits.

How many practicing engineers in the field know about epsilon-delta proofs?

You could also ask how many practicing engineers remember anything about calculus. I think a large number rely on knowing how to use certain software, tables in books, etc. The main intellectual weakness that epsilon-delta proofs bring out is a person's inability to deal with logical quantifiers. I suppose one may get through a non-mathematical life without that skill. But that's true of most specialized skills, isn't it?

homeomorphic
It would be interesting to know what rules existed at what times in history. For example, in Augustus DeMorgan's "The Differential And Integral Calculus" 1842, I don't see any statements corresponding to the commonly used theorems about limits.

Interesting question. But I would be shocked if someone like Euler, say, wasn't aware of them. It may have been considered too obvious to even write down. Of course, all the differentiation rules were known and they do implicitly involve limits and limit rules. So, that's where my statement comes from.

You could also ask how many practicing engineers remember anything about calculus. I think a large number rely on knowing how to use certain software, tables in books, etc. The main intellectual weakness that epsilon-delta proofs bring out is a person's inability to deal with logical quantifiers. I suppose one may get through a non-mathematical life without that skill. But that's true of most specialized skills, isn't it?

Yes. And that's just what I'm saying. Delta-epsilon proofs are kind of a specialized skill. Not VERY specialized, but enough that only mathematicians or other people doing fairly mathematical things need to know about it. It's very broadly useful in mathematics, though--any mathematician has to know it.

Engineers probably do use too many crutches for their own good. Even so, delta-epsilon proofs are probably not terribly high on the list of things they should know.

You can always argue that anything intellectually challenging is useful to learn, just because it is intellectually challenging. But there are an infinite number of intellectually challenging things to choose from. Why focus on delta-epsilon proofs? If you want to teach logical quantifiers, that's logic.

hmm. well, it seems to me, that, for the most part, people use calculus on continuous (and even better) differentiable functions. i mean, if your function's not differentiable, what's the point of trying to find the derivative it doesn't have?

both of these notions depend on limits. if i were to ask you "how do you tell if a function is continuous", well, how do you do it?

sure, we can establish the continuity of some functions without too much trouble, and as long as we stay in that little family of functions, we can forget about how we established they were continuous in the first place (the "only need to prove something once motto").

but even with functions commonly encountered in (oh say) engineering applications (which typically involve more than just one variable, since the world apparently is, um, don't hit me, manifold), it's not immediately clear at times if a function is continuous or (hopefully) differentiable at a point of genuine interest to us. we can just use our usual bag of tricks, and hope for the best, but that seems a poor way to advance our knowledge of things.

now, to be honest, the essential ideas aren't really dependent on epsilons and deltas, but on more general notions, topological ones. as awful as it sounds, epsilon-delta proofs are a sort of kindness, allowing us to use properties of measurement to estimate things. things get MUCH worse, when you no longer have a way of measuring "apartness".

the basic algebraic properties of limits exist, only because the real numbers form a topological field. we're lucky to have such a thing, most things you might analyze, do not possess such a wealth of structure. furthermore, we have a metric. yes, it's more intuitively clear to say "f is continuous iff a near the set S implies f(a) near the set f(S)", but unless we can measure "nearness", how does one go about showing that any function at all has this property? doing calculus on a vacuous set of functions seems self-defeating, a lot of work for nothing.

let's take a broader view: one of the many uses of calculus, is in finding approximations to things that might be complex to calculate directly. and a description of a function's qualitative behavior might require computing many values. but if it's locally "going up", that's helpful: a first-order approximation is much better than knowing nothing at all. to use these intuitions, we quite properly ought to be sure that a small perturbation in input, doesn't ruin our results. that's epsilon-delta for you: epsilon is error, and delta is deviation (of input; it would get confusing to use the same letter for both).

so: should an average student prove everything in terms of delta-epsilon methods? no, of course not. should they at least understand the logic involved? certainly.

i, for one, am strongly against teaching people limits sloppily. someone who doesn't understand what this means spatially, is not the person i want teaching physics at my school, nor mathematics, for that matter. the idea requires a certain intellectual maturity to master, and that is in some sense unfortunate, but "cookbook math" isn't math at all, it's cooking.

limits are a subtle concept. hell, real numbers are a subtle concept. things in math aren't true because someone wrote them down in a book somewhere. books contain misprints, even, i daresay, integral tables that might be used by an engineer calculating safety parameters on a highway bridge. mathematical theorems are true, because they ARE (not in an epistemological sense, but in the contingent sense of: given this structure, it behaves like that).

if one wants to learn to evaluate integrals, i can suggest a number of handy references, and they can skip the calculus class entirely. or just use mathematica, or whatever. if one wants to learn calculus, then skipping over limits, their definition, and what they mean, isn't just "skipping dessert", it's missing the main course.

homeomorphic
but "cookbook math" isn't math at all, it's cooking.

Taking deltas and epsilons out has nothing to do with "cookbook". Cookbook means you learn by copying. I'm not 100% against cookbook at all times. I'm just generally against it. The thing is, I understood limit laws intuitively perfectly well when I first studied calculus. They were just intuitively obvious. It wasn't that I was taking anything on faith. Now that I know how to prove everything rigorously, I haven't just thrown out all the intuition. The rigor is something I learned in addition to it, not as a replacement. When I studied calculus, I didn't see the point of deltas and epsilons. I didn't see the point at all. And this is coming from someone who now eats deltas and epsilons for breakfast every day (actually, I'm a topologist, so it's not every day, but I have studied a fair amount of analysis, too). Of course, it was badly taught, I think.

We are so immersed in our math world, it's easy to forget what it's like to be outside of that world. It seems I have less difficulty doing that than some, perhaps owing to my engineering background. And I was one of the engineering students with the deepest understanding, and that is what led me to math. But understanding for engineers, I think, should have more to do with intuition. Also, I have tried to teach students about epsilons and deltas (in tutoring and recitation--I haven't taught my own class), and that also might prompt me to have more realistic demands on the average calculus student. If someone wants more detail, what's stopping them from taking a course in real analysis? That's exactly what I did.

let's take a broader view: one of the many uses of calculus, is in finding approximations to things that might be complex to calculate directly. and a description of a function's qualitative behavior might require computing many values. but if it's locally "going up", that's helpful: a first-order approximation is much better than knowing nothing at all. to use these intuitions, we quite properly ought to be sure that a small perturbation in input, doesn't ruin our results. that's epsilon-delta for you: epsilon is error, and delta is deviation (of input; it would get confusing to use the same letter for both).

I'm not so sure it's necessary to have the formal definition in order to understand things like this.

limits are a subtle concept. hell, real numbers are a subtle concept. things in math aren't true because someone wrote them down in a book somewhere. books contain misprints, even, i daresay, integral tables that might be used by an engineer calculating safety parameters on a highway bridge. mathematical theorems are true, because they ARE (not in an epistemological sense, but in the contingent sense of: given this structure, it behaves like that).

I think we agree here, but I put more value in intuition. We mathematicians are the ones whose job it is to split hairs over whether everything is completely correct and rigorous. Engineers and physicists should be more free to use their intuition (and, actually, I think intuition is just as important for mathematicians, but it has to be backed up by proof). Division of labor is important.

if one wants to learn to evaluate integrals, i can suggest a number of handy references, and they can skip the calculus class entirely. or just use mathematica, or whatever. if one wants to learn calculus, then skipping over limits, their definition, and what they mean, isn't just "skipping dessert", it's missing the main course.

I never said skip limits, just the FORMAL definition for students learning it the first time. I'm not sure how I would teach calculus. You can understand calculus fairly well without doing it 100% rigorously. Just take a look at people like Euler and Lagrange, completely in ignorance of deltas and epsilons. Even Gauss, Riemann. Were they cookbook mathematicians, just because they didn't know about epsilons and deltas? Even Cauchy sort of thought that way, but he didn't say it explicitly like we do now.

Just take a look at people like Euler and Lagrange, completely in ignorance of deltas and epsilons. Even Gauss, Riemann.

That look isn't relevant in deciding what to teach the average calculus student. If you only look at people like Gauss and Riemann, you could say we should let the students teach the class. If a calculus teacher can take a non-rigorous approach and give students the same capabilities as Gauss, that would be great. But does any teacher know how to systematize the skills he had?

You can always argue that anything intellectually challenging is useful to learn, just because it is intellectually challenging. But there are an infinite number of intellectually challenging things to choose from. Why focus on delta-epsilon proofs? If you want to teach logical quantifiers, that's logic.

If there is a course in subject X, then the wise thing to do is pick the intellectually challenging material from the topics that are fundamental to understanding the subject. If one could predict each student's future and if one had the resources to tailor a calculus course for each student, then I agree that some of the custom courses could omit epsilons and deltas, but some could omit infinite series, trigonometric functions, etc.

Whether to continue the current teaching of epsilon-delta definitions and proofs should be discussed in the context of how they actually are taught. Although the concepts are often a problem for students, if you look at the material from a mathematical perspective, most textbooks don't ask them to do proofs that are so difficult. Furthermore, the type of logical quantification involved isn't that difficult either. I think the students get slammed by having to deal with a number of relatively simple but new ideas, all at the same time. They face: 1) understanding logical quantifiers, 2) understanding the definition of a phrase ("The limit of f(x) as x approaches a is equal to L") instead of the definition of a word (i.e. "limit"), 3) having to deal with the legalistic definition of something rather than their intuitive understanding, and 4) manipulating inequalities instead of solving equations. I think all these topics should be introduced earlier in their mathematical education instead of being postponed until later.

Homework Helper
epsilon delta methods are needed to give meaning to the word "limit". without them you are just hoping your crude intuition about them is correct and agrees with that of other people. if you look back in history you do find less precision about these words, but you also find that different people meant different things by them. continuity for example used to mean the property now expressed by the intermediate value theorem, but that differs from the meaning of epsilon delta continuity.

the reason the epsilon delta definition of continuity and limits is needed is so you can actually verify that certain limits exist. i.e. it gives you a concrete way to check the truth of what you are being told. you do not seem to value that, but prefer to just believe what you are told. this lack of skepticism and intellectual curiosity disqualifies you from being a mathematician, but i do not know if it does so from being an engineer. i tend to think top engineers also want to understand what they are doing and why the rules given them are correct.

homeomorphic
Just take a look at people like Euler and Lagrange, completely in ignorance of deltas and epsilons. Even Gauss, Riemann.

That look isn't relevant in deciding what to teach the average calculus student. If you only look at people like Gauss and Riemann, you could say we should let the students teach the class. If a calculus teacher can take a non-rigorous approach and give students the same capabilities as Gauss, that would be great. But does any teacher know how to systematize the skills he had?

I was not saying the average calculus student is the same as Gauss, obviously. I was only pointing out that it is perfectly possible to have a good understanding of calculus and be somewhat non-rigorous.

You can always argue that anything intellectually challenging is useful to learn, just because it is intellectually challenging. But there are an infinite number of intellectually challenging things to choose from. Why focus on delta-epsilon proofs? If you want to teach logical quantifiers, that's logic.

If there is a course in subject X, then the wise thing to do is pick the intellectually challenging material from the topics that are fundamental to understanding the subject.

The material cannot be TOO intellectually challenging. I'm not saying that things need to be dumbed down, but making things any harder than necessary just results in inefficiency in learning. On his website, 't Hooft (the Nobel laureate) gives advice on how to learn physics. He purposely picked material that is not very pedagogical because he thinks it may be a good way to be challenged. There are many disadvantages to this approach. Given the sheer volume of math and physics out there to learn, my favored approach is to make it possible to learn as quickly and easily as humanly possible. You can always give very challenging problems along the way that the students may not even be able to solve. But my intuition is that the sweet spot is material just out of the students' comfort zone. Just a little bit. Not a lot.

If one could predict each students future and if one had the resources to tailor a calculus course for each student, then i agree that some of of the custom courses could omit epsilons and deltas, but some could omit infinite series, trigonometric functions, etc.

We can't tailor a course for each student, but we can offer more than one type of course. If you take honors calculus, you asked for it. But even then, as I said, you have to look at the typical student in the class and it should be only a little out of their comfort zone.

Whether to continue the current teaching of epsilon-delta definitions and proofs should be discussed in the context of how they actually are taught. Although the concepts are often a problem for students, if you look at the material from a mathematical perspective, most textbooks don't ask them to do proofs that are so difficult. Furthermore, the type of logical quantification involved isn't that difficult either. I think the students get slammed by having to deal with a number of relatively simple but new ideas, all at the same time. They face: 1) understanding logical quantifiers, 2) understanding the definition of a phrase ("The limit of f(x) as x approaches a is equal to L") instead of the definition of a word (i.e. "limit"), 3) having to deal with the legalistic definition of something rather than their intuitive understanding, and 4) manipulating inequalities instead of solving equations. I think all these topics should be introduced earlier in their mathematical education instead of being postponed until later.

That's a good analysis of the difficulties for students. If you wanted to teach it, maybe it would be better to try to focus on just one of those difficulties at a time.

Actually, if you think about it, the delta-epsilon approach usually IS cookbook, in a way. That's what started this whole thread. They just throw the definition at you and expect you to just accept it. But WHY do we need it? Definitions are not relevant just because they are written in a book, any more than statements are true just because they are written in a book.

homeomorphic
epsilon delta methods are needed to give meaning to the word "limit". without them you are just hoping your crude intuition about them is correct and agrees with that of other people. if you look back in history you do find less precision about these words, but you also find that different people meant different things by them. continuity for example used to mean the property now expressed by the intermediate value theorem, but that differs from the meaning of epsilon delta continuity.

Well, the epsilon-delta definition does imply the intermediate value theorem. And I'm not sure if people really meant the intermediate value property, historically. I think they had more of an intuitive idea of an unbroken curve or something like that.

Do I really need deltas and epsilons to be sure that the limit of x^2 as x approaches 2 is 4?
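For what it's worth, the standard proof for that one is short: take δ = min(1, ε/5), since |x − 2| < 1 forces |x + 2| < 5 and hence |x² − 4| = |x − 2||x + 2| < 5|x − 2|. A small Python sketch (the helper names `delta_for` and `check` are mine, just for illustration) that spot-checks this choice of δ:

```python
import random

def delta_for(eps):
    # the usual choice for lim_{x->2} x^2 = 4: if |x - 2| < 1 then
    # |x + 2| < 5, so |x^2 - 4| = |x - 2| * |x + 2| < 5 * |x - 2|
    return min(1.0, eps / 5.0)

def check(eps, trials=10_000):
    # sample points with |x - 2| < delta and confirm |x^2 - 4| < eps
    d = delta_for(eps)
    rng = random.Random(0)
    return all(abs((2.0 + rng.uniform(-d, d)) ** 2 - 4.0) < eps
               for _ in range(trials))

print(all(check(eps) for eps in (1.0, 0.1, 1e-3, 1e-8)))
```

Of course, the sampling only illustrates; it is the inequality above that actually proves the claim for every x.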

I have nothing against deltas and epsilons, but I don't think that is quite what they are for. Calculus was quite successful for around 200 years, just working without this level of rigor. Working formally is no guarantee of correctness--in fact, the reverse is true. I heard Mike Freedman a few months ago, saying, for him, a proof has to be sort of intuitive because that makes it easier to spot errors! Yes, intuition makes it EASIER to spot errors.

If I sound like I am radical in terms of my "lack of rigor", you haven't seen the proofs I write. They are just like everyone else's proofs. And here, I'm speaking, not about how I personally do things, but how an engineer, for example, should do them.

the reason the epsilon delta definition of continuity and limits is needed is so you can actually verify that certain limits exist. i.e. it gives you a concrete way to check the truth of what you are being told. you do not em to value that, but prefer to just believe what you are told.

Not sure if you are talking about me or the OP, actually. And in either case, I disagree completely. If anything, I am very very extreme in NOT believing what I am told, to the point where it may even cause me problems because I never like to accept what other people say, and I have to spend endless time trying to come up with my own version of everything that I learn.

I write rigorous proofs, the same as everyone else. When I do posts on here, I am usually very informal. I'm perfectly capable of doing everything rigorously.

this lack of skepticism and intellectual curiosity disqualifies you from being a mathematician, but i do not know if it does so from being an engineer.

If you are referring to me, then I'm afraid you are mistaken. If anything, I am too busy trying to understand everything to be able to learn enough to do good research. Other people may be getting ahead of me because they are more willing to take results that other people proved and run with them. So, if I am disqualified to be a mathematician, it's precisely the reverse.

If you are referring to the OP, then, I also disagree. He may just need to study more math to be able to appreciate deltas and epsilons.

i tend to think top engineers also want to understand what they are doing and why the rules given them are correct.

I think so, too. But, it's not always in a mathematical way. I think engineering is often based on non-mathematical practical experience. But, I was only a student of engineering, not a practicing engineer.

if there is a fault in the way calculus is often taught, it's "too much, too soon". it's like the common confusion over the value of a function, and the NAME of a function: the value of a limit isn't what a limit IS.

that type of confusion between the definition of a limit, and the value of a limit, leads people to conclude things like:

0*∞ = 1, because 1/∞ = 0

or:

$$\lim_{n \to \infty} (1+\frac{1}{n})^n = 1$$

since 1+1/n → 1, and 1 "to the anything" is 1.
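a quick numerical check (a python sketch, with illustrative values of n) makes the fallacy plain: the limit is e, not 1.

```python
import math

# the claim "1 + 1/n -> 1, so (1 + 1/n)^n -> 1" ignores that the
# exponent grows while the base shrinks toward 1.  the limit is e.
for n in (10, 1_000, 1_000_000):
    print(n, (1 + 1 / n) ** n)
print("e =", math.e)
```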

in calculus, certain qualitative facts come to the fore: if a continuous function is positive at one point, and negative at another, we reason it has to cross the x-axis "in-between". so, we get knowledge about, say, a polynomial function, which might be impossible to obtain algebraically.

not only is this useful, for calculating things that involve really ugly expressions, but opens up a whole new approach to solving hard problems. we might not have to figure out what f(x) actually IS, if we find some simple expression that bounds it.

calculating sin(x)/x, for really small x, is an exercise in torture. approximating it by 1, is deeply satisfying, in a way that is not-so-pretty to express explicitly.
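and that approximation even comes with an explicit error bound: from the alternating taylor series, |sin(x)/x − 1| ≤ x²/6. a small python check of the bound:

```python
import math

# the taylor remainder gives |sin(x)/x - 1| <= x**2 / 6, so the error
# of the approximation sin(x)/x ~ 1 shrinks quadratically as x -> 0
for x in (0.5, 0.1, 0.01):
    err = abs(math.sin(x) / x - 1)
    print(x, err, err <= x * x / 6)
```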

i agree, that more treatment of inequalities should be done FIRST. especially in dealing with absolute value. students struggle with it, because before, everything was: this is that. now it becomes: this might be that, or this might be the other thing. a lot of the trouble with calculus, is "pre-calculus". calculus is like a mountain, it's a harsh mistress, and takes a bit of wooing. people get thrown into it, before they're ready. why do we need sup's and inf's? what is it about the rational numbers that makes them inappropriate for calculus? these things are rarely addressed before-hand, and people have to hit the ground running.

not everyone who takes calculus, will go on to take real analysis, or differential equations, or any of the other higher branches of math that build on what calculus starts. especially for these people, some illumination on the background machinery, is necessary. yes, logic can be hard, with its own quirks and strange language. yes, inequalities are not like equations. yes, computing limits is a path fraught with peril. we need not, nor should we, sugar-coat these things.

flyingpig
Well, the epsilon-delta definition does imply the intermediate value theorem

But there is no need. Isn't IVT one of the "existence theorems"? Why would it need something else to imply it exists?

homeomorphic
But there is no need. Isn't IVT one of the "existence theorems"? Why would it need something else to imply it exists?

I'm not sure what you are trying to say here. In some sense, maybe it's not necessary. After all, it's obvious that if a function is continuous (in the usual, informal sense of the word), then if it attains the value a and the value b, it has to attain every value in between.

As far as the intermediate value theorem, I think it is obvious in certain cases. But what if we have a function that is complicated and we don't understand it very well? What will we do then? Maybe it's defined in some weird way and we don't even know if it's continuous. How will we be able to prove it? The answer is that we need a precise definition. This was Weierstrass's point when he came up with his wacky everywhere-continuous but nowhere-differentiable function. If you come up with some wacky function, it's not going to be clear that our intuition applies to it. And wacky functions come up in math, so you have to deal with them. For example, one of the motivations for developing modern analysis was to deal with Fourier series of really nasty functions that came up in number theory. Dirichlet had more or less settled the problem for most of the nice functions that come up in physics or engineering.

More rigor can lead to a deeper understanding. For example, why is it that we need the real numbers, not just the rational numbers? It turns out that the intermediate value theorem doesn't work if you use rational numbers. Maybe that's not too surprising, but you might think we could get away with the rationals plus all nth roots, or something like that. It turns out, though, that no such patch works. The key property is the so-called "completeness" of the real numbers, and a proper understanding of it requires epsilons and deltas.

Also, sometimes, if you understand things on a deeper level, you can come up with generalizations. The intermediate value theorem turns out to be a special case of something in topology. Any continuous, real-valued function on a connected topological space also has the intermediate value property. What do all those things mean and how did mathematicians come up with them? Deltas and epsilons were the starting point.

So, the answer to the question of why we use epsilons and deltas, I think, is that it provides a deeper understanding of calculus.

But there is no need. Isn't IVT one of the "existence theorems"? Why would it need something else to imply it exists?

because....one of the assumptions of the IVT is that f is continuous on [a,b].

buried in that condition is a hidden "epsilon-delta" condition.

in fact, buried in THAT fact is the completeness of the reals: after all, f(x) = x^2 - 2 is continuous on Q, and f(2) = 2 while f(0) = -2, but there is, in point of fact, no rational x for which f equals 0.
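just to make that concrete (a little python sketch, my own illustration, not anything from the thread): bisect f(x) = x^2 - 2 using exact rational arithmetic. every midpoint is a rational number, so if the IVT held over Q, we'd eventually land on an exact rational root of x^2 = 2. we never do:

```python
from fractions import Fraction

def f(x):
    # f(x) = x^2 - 2, continuous on the rationals
    return x * x - 2

# bisection on [0, 2] with exact rational arithmetic: every midpoint
# is a rational number, so an exact zero would be a rational sqrt(2)
lo, hi = Fraction(0), Fraction(2)
for _ in range(50):
    mid = (lo + hi) / 2
    if f(mid) == 0:       # never happens: no rational squares to 2
        break
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid

# the interval shrinks around sqrt(2), but f is never exactly 0 on Q
print(float(lo), float(hi))
```

the bracket narrows to width 2/2^50, yet f(lo) and f(hi) keep opposite signs forever; the "crossing point" simply isn't a rational number.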

it is perfectly possible to have a good understanding of calculus and be somewhat non-rigorous.

The problem is to define what a good understanding is.

When people form conspiracies and agree to lie, one of the big problems they have is being consistent.

Intellectually speaking, the art of lying about mathematics on the secondary level is fairly well developed. The non-rigorous explanations of topics in algebra are reasonably consistent from textbook to textbook. Students' understanding of the non-rigorous explanations is consistent enough that the makers of standardized tests can agree on what is a correct answer - insofar as that type of understanding is ever tested.

If a non-rigorous standard for teaching Calculus is to be widely used then you have the problem that all the books have to get their stories straight. (Of course that would also be true if you established an even more rigorous approach.)

It's amusing to think about what such a revision would entail. Perhaps some would advocate teaching the method of Lagrange, where we pretend all functions can be expanded in a power series. There would be debates about whether to say "as x gets closer and closer to A" or "as x approaches A" or "when x gets infinitely close to A".

Consistency is not an impossible task. If there were some mandate to teach calculus to 7th graders, publishers of textbooks would rise to the occasion and the Texas Board of Education would show us the way.

If calculus were a graduate level elective then each instructor would be permitted to teach it in his own eccentric way. However, as math courses go, it is elementary enough to be a "prerequisite" for many other things. It can't escape being standardized - regardless of what that standard is.

the sweet spot is that the material should be just out of the students' comfort zone. Just a little bit. Not a lot.

It has been said that nothing practical can be done without considering Philosophy. As a Philosophy of life, I think what you describe is an excellent goal. I'm sure many students and instructors would agree with it. A few would not (the Marine Corps, for example).

I think the best way to attain that goal is to teach more logic and axiomatics at the secondary school level. Logic can be taught without even using mathematical examples. In fact, given the uneasy relationship between logic and secondary school math, it might be best to keep them apart.

homeomorphic
Here's a quote from Terence Tao on mathoverflow that I thought would go well here, though the topic was a slightly different one:

http://mathoverflow.net/questions/40082/why-do-we-teach-calculus-students-the-derivative-as-a-limit

I wanted to add one further point to the many good answers already given here: "black box" symbolic computation, in the absence of understanding the formal definitions, can work when everything goes right, but is very unstable with respect to student errors (which are sadly all too common). Knowledge of definitions provides a crucial extra layer of defence against such errors. (Of course, it is not the only such layer; for instance, good mathematical or physical intuition and conceptual understanding are also very important layers of defence, as is knowledge of key examples. But it is a key layer in situations which are too foreign, complicated, or subtle for intuition or experience to be a good guide.)

He echoes the thoughts I expressed here.

I don't advocate any black-boxes at all in calculus or taking anything on faith. It's just that I'm open to non-rigorous "proofs". Perhaps those with weak intuitions are not fully aware of the full power that intuition has. Often, I find, when one intuition is wrong, the solution is not to replace it with some kind of mindless formalism that merely suffices to logically justify the result, but to simply replace it with more precise or more correct intuition. Mindless logical justification with no conceptual understanding is often no better than accepting what other people told you. The way I see it, it's as if you are just listening to what the logic tells you without actually understanding the reasons why it works. So, the way real mathematicians work, ideally, is to have intuition, and to have proof to back it up. Sometimes, you might have a proof that is obscure to conceptual understanding, but that's not a desirable situation, and, while there's nothing wrong with coming up with an argument just to decide whether the thing is true or not, you would hope that there is a better argument out there somewhere.

wisvuze
There's your intuitive "definition" of a limit, then there's the delta-epsilon "definition" of a limit. If you want to understand how a function works in terms of limiting behaviour, it will not necessarily take a delta-epsilon proof to convince yourself. Thus, there are times when you can use another means of "proof" to convince yourself ( this would be something like a "naive proof" ).
However, we wish to speak of this concept of "limit" in its full generality: we have this idea in our heads, and we want to talk about how it works without considering *anything else*. This is when a formalization of this concept is necessary. People want to fit this concept within the rest of mathematics, so we have a formal definition ( for example, so we can start talking about a notion of limit in broader contexts ). And in our formal world, we must prove things only based on what our formal world allows us to do ( but this is another can of worms )

homeomorphic
I think the best way to attain that goal is to teach more logic and axiomatics at the secondary school level. Logic can be taught without even using mathematical examples. In fact, given the uneasy relationship between logic and secondary school math, it might be best to keep them apart.

Well, we'd all like to tear apart the curriculum and remake it in the way we see fit. I know I was horribly failed by my education before college. It was only in my senior year of high school when I took physics that I had any idea of what math was about. It would be nice if logic could be covered, but I think you'd have to be careful about it. I would hope students could do some intuitive geometry. After teaching the typical students, my hopes tend not to be high for all these sorts of things--or at least, the problem is a very difficult one. But it's my suspicion that just about everyone has a much higher potential than is currently realized.

To me, the best way to motivate the epsilon-delta definition is the way Bressoud does in A Radical Approach to Real Analysis. He shows you something that you couldn't accomplish before you had the definition: uniform convergence and its relationship to term by term differentiation and integration. He gives examples of where you run into trouble just naively assuming that it is okay to differentiate term by term. So, he shows you that there are calculus problems that really do require epsilons and deltas. Without understanding uniform convergence and the epsilons and deltas that go with it, we would be at a loss to say when it is okay to do these things.
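To make the term-by-term trouble concrete, here is a small Python sketch (my own illustration, not Bressoud's exact example): f_n(x) = sin(n^2 x)/n converges uniformly to 0, since sup |f_n| = 1/n, yet the termwise derivatives f_n'(x) = n cos(n^2 x) are unbounded, so the "limit of the derivatives" has nothing to do with the derivative of the limit:

```python
import math

def f(n, x):
    # f_n(x) = sin(n^2 x) / n: sup |f_n| = 1/n, so f_n -> 0 uniformly
    return math.sin(n * n * x) / n

def fprime(n, x):
    # termwise derivative: f_n'(x) = n * cos(n^2 x), unbounded in n
    return n * math.cos(n * n * x)

for n in (1, 10, 100, 1000):
    # the functions shrink to 0 while their derivatives at 0 blow up
    print(n, abs(f(n, 0.5)), fprime(n, 0.0))
```

Differentiating term by term here is exactly the kind of move that looks harmless and isn't, and uniform convergence (stated with epsilons) is what sorts out when it is safe.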

It turns out that, in the end, skepticism IS a valid argument. But that's not obvious from the outset; at first it looks like a weak one. It's like the idea that the number i is necessary for solving quadratic equations (when it's not at all clear that solving those is, in turn, necessary). Wrong: it's the cubic that forces us to take complex numbers seriously, and the quadratic argument is circular and not at all convincing. And skepticism is applicable to itself. As I pointed out, there's no limit to how skeptical you can be.

Needless to say, this would be considered pretty extravagant to explain to a mere calculus class.

Homework Helper
my hat is off to anyone with enough courage and spare time to take on a thread like this one. i will leave it at that.

you know, i hear what the original poster is saying.

but let's take his view to an extreme. not so much as to disagree with him per se, but to see where it leads.

why prove theorems? after all, we already know that they are true. why not just use them?

why bother with "definitions"? after all, if one is studying math and understands it, these are just so much useless clutter. let's get on to the real action! to heck with the "old problems", let's all find new ones that need solving!

what's wrong with this picture?

homeomorphic
why prove theorems? after all, we already know that they are true. why not just use them?

He's talking about calculus, which is a very, very limited context. Most theorems in math are, I think, completely non-obvious. There are a few, like the Jordan curve theorem, that seem obvious. But maybe it's not as obvious as it seems, because a continuous curve can be a mess.

At any rate, when you have a system of axioms, you have to prove that it agrees with intuition. Maybe the intuition is correct, but the axioms aren't quite the right ones to capture it. One example of this phenomenon was mentioned already, namely, why do we need the real numbers? Why wouldn't some other system work, like just the rationals? So, the axiomatic method sort of requires that we prove everything and make precise definitions. But axiomatic methods are typically the domain of a mathematician, not so much an engineer or physicist.

The point is, skepticism is applicable to itself. You have to argue for these things. It's not obvious from the outset. The OP didn't say, "You guys are stupid for being so skeptical". He just asked what the point was. If someone is threatened by a mere question, maybe they don't really understand why they are doing things the way they are in the first place. I think I understand it better myself than when this thread began.

why bother with "definitions"?

Actually, the context is even more limited than calculus. It's about one particular definition. Without some definition of derivative, how do you even know what those formulas are computing?

Taking someone's view to an extreme doesn't really work because just about any view, taken to an extreme, will make it wrong.

He's talking about calculus which is a very, very limited context. Most theorems in math are, I think, completely non-obvious.

The point is, skepticism is applicable to itself. You have to argue for these things. It's not obvious from the outset. The OP didn't say, "You guys are stupid for being so skeptical". He just asked what the point was.

Taking someone's view to an extreme doesn't really work because just about any view, taken to an extreme, will make it wrong.

i think that that actually IS the point of epsilon-delta proof machinery. not so much to "prove what we already know", but to "understand what we know better". and, i hope you detected at least a smidge of sarcasm in my post.

myself, i think he has a valid question. it's sort of like a "why do we wear seat belts" kind of question. most of the time, we don't need them. but when we do, they are indispensable.

as i noted earlier, what we mean when we say "f is differentiable", is that some sort of limit involving f exists, for certain values of x. and in order to use useful things like rolle's theorem, we need to know when it's "safe" to do so. those "fine print" caveats at the beginning of the important theorems (the mean value theorem, the intermediate value theorem, the fundamental theorem of calculus), have some real teeth if you disregard them.

the other day, on a different forum, someone asked why rolle's theorem didn't apply to:

f(x) = 1 - x^(2/3) on the interval [-1,1].

a couple of fairly astute people responded that f(-1) = f(1), and that f was continuous, so it should. but the limit defining the derivative fails to exist at x = 0, which ruins everything. and that's not some arcane function, which an engineer might never run across in their entire career, it looks pretty harmless.
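the failure is easy to see numerically, too. a quick python sketch (my own, illustrative only, writing x^(2/3) as |x|^(2/3) so negative inputs behave): the rolle hypotheses that do hold check out, but the difference quotient at 0 blows up like -h^(-1/3):

```python
def f(x):
    # f(x) = 1 - x^(2/3), written with |x| so negative x works
    return 1 - abs(x) ** (2 / 3)

# the hypotheses that DO hold: f(-1) == f(1), and f is continuous
print(f(-1), f(1))  # 0.0 0.0

# the hidden failure: the difference quotient at 0 is -h^(-1/3),
# which blows up as h -> 0+, so f'(0) does not exist
for h in (1e-2, 1e-4, 1e-6):
    print(h, (f(h) - f(0)) / h)
```

so rolle's conclusion (an interior point where f' = 0) has no obligation to hold, and in fact it doesn't: f' is -2/(3 x^(1/3)) away from 0, never zero.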

so...short answer to the original question: we need some firm definition to guide us, when our "soft" definitions don't work. it's not just a matter of "doing math the purist's way" for its own sake, it's a matter of having confidence in one's methods, knowing what you know.

homeomorphic
myself, i think he has a valid question. it's sort of like a "why do we wear seat belts" kind of question. most of the time, we don't need them. but when we do, they are indispensable.

I guess I agree with that.

as i noted earlier, what we mean when we say "f is differentiable", is that some sort of limit involving f exists, for certain values of x. and in order to use useful things like rolle's theorem, we need to know when it's "safe" to do so. those "fine print" caveats at the beginning of the important theorems (the mean value theorem, the intermediate value theorem, the fundamental theorem of calculus), have some real teeth if you disregard them.

True, but you can also understand this intuitively, to an extent. It's very easy to come up with counter-examples, intuitively. That's part of my point. Formal definitions are only ONE way to guard against error. Counter-examples and intuition are another. As I remarked in another thread, there can be more rigor in a non-rigorous approach than you might think.

the other day, on a different forum, someone asked why rolle's theorem didn't apply to:

f(x) = 1 - x^(2/3) on the interval [-1,1].

a couple of fairly astute people responded that f(-1) = f(1), and that f was continuous, so it should. but the limit defining the derivative fails to exist at x = 0, which ruins everything. and that's not some arcane function, which an engineer might never run across in their entire career, it looks pretty harmless.

You're making a stronger case for counter-examples than for epsilons and deltas, I think (the existence of counter-examples is one reason to use epsilons and deltas, but they can often be understood intuitively, too). I won't argue that epsilons and deltas aren't ever relevant. My argument is that it's rare enough that it's sort of optional for a physicist or engineer. If they find, at some point in their career, that they need it, they can always learn it later. It shouldn't be expected that someone is going to have 100% mastery of the subject the first time they study it, anyway.

Deltas and epsilons aren't the first line of defense, I think. The first line of defense is just more intuition and counter-examples.

i think it's more a matter of information organization (boy, THAT's a mouthful).

for example, in linear algebra, it's more "empowering" to really understand what a basis is good for, than to have lots of computational experience calculating eigenvalues and the like. it's not that computation isn't useful, or that working with n-vectors, matrices (and eventually tensors) is "wrong"; it's that the "practical application" parts of linear algebra can obscure the ways in which the full power of linear algebra can be used (apparently, for example, actuaries use it all the time).

a lot of people learning calculus for the physical sciences (or even, gasp! biology), will still encounter, even if at a somewhat less-intense level, functions of more than one variable. if limits for single variables seem intractable at times, multivariate limits can get downright funky. and epsilon-delta views of things really do help wade through the fog.
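for instance (a standard textbook example, sketched in python by me, not taken from the thread): f(x,y) = xy/(x^2 + y^2) has no limit at the origin, because the value you approach depends on the path you take:

```python
def f(x, y):
    # f(x, y) = x*y / (x^2 + y^2), undefined at the origin
    return x * y / (x * x + y * y)

for t in (0.1, 0.01, 0.001):
    # along the x-axis (y = 0) the values are all 0;
    # along the diagonal (y = x) they are always 1/2
    print(f(t, 0.0), f(t, t))
```

no finite table of sample points can settle a limit like this; the epsilon-delta definition (quantifying over ALL points near the origin) is what makes "the limit does not exist" a provable statement.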

there's a certain sort of give-and-take that occurs in math: if the definitions are hard, the theorems are easy, and vice versa (note: there are counter-examples). i think the topological concepts (and the geometric intuitions you can use along with them) are given short shrift when limits are "glossed over".

but sure...for an average calculus student, it's straight from the quadratic formula and a little trig to...wtf? open set? least upper bound? where did my nice algebra that i was finally getting the hang of go? what's an "existential quantifier" and why is it bad if i get the "for all" and the "there exists" in the wrong order?

the thing is: a little density can go a long way. topological methods are starting to show up in the most unlikely of places: statistical analysis (like demographic analysis for advertisers), security system design, biological classification. a metric is a powerful concept, and it's useful for more than just "number-crunching" types of math.

perhaps in the class the thread starter is taking, the limits being examined are ones that aren't too pathological, reinforcing the idea that "usually" we can do without all this complicated stuff. personally, i like the idea of basing calculus on "nearness" (which actually implies the epsilon-delta definition) and is closer in spirit to the general requirement in topological spaces that f^(-1)(U) is open if U is. "neighborhood" is a nice, friendly word, and helps loosen up the formality of the language.
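and the epsilon-delta game itself can be made concrete. for lim_{x->2} x^2 = 4, one standard delta is delta = min(1, eps/5): if |x - 2| < delta <= 1 then |x + 2| < 5, so |x^2 - 4| = |x - 2|·|x + 2| < 5·delta <= eps. a python sketch (illustrative only, not a substitute for the proof) that stress-tests this choice:

```python
import random

def delta_for(eps):
    # delta = min(1, eps/5): if |x - 2| < delta, then |x + 2| < 5,
    # so |x^2 - 4| = |x - 2| * |x + 2| < 5 * delta <= eps
    return min(1.0, eps / 5)

random.seed(0)
for eps in (1.0, 0.1, 0.001):
    d = delta_for(eps)
    for _ in range(1000):
        x = 2 + random.uniform(-d, d)  # sample points with |x - 2| < d
        assert abs(x * x - 4) < eps    # the promise the delta makes
print("delta = min(1, eps/5) held up for every sampled x")
```

of course, sampling only checks finitely many x; the two-line inequality above is what covers them all, and that's the whole point of the definition.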

khan academy has some really nice videos where he breaks down visually where the epsilon and the delta come from. when you see it graphically, it makes more sense. i'll admit, there's a certain dryness and lack of humor permeating calculus, all of a sudden it's: stuff just got real.

*****

f(x) = 1 - (x^2)^(1/3) was what i meant. silly me.

homeomorphic
for example, in linear algebra, it's more "empowering" to really understand what a basis is good for, than to have lots of computational experience calculating eigenvalues and the like. it's not that computation isn't useful, or that working with n-vectors, matrices (and eventually tensors) is "wrong"; it's that the "practical application" parts of linear algebra can obscure the ways in which the full power of linear algebra can be used (apparently, for example, actuaries use it all the time).

I'm all for a more theoretical approach, in general, with the caveat that it's an ideal and the students are not ideal. We're arguing over a very specific point, here. I see it as quite out of place to come to more general conclusions about a general philosophy of teaching, here. I'm advocating a more intuitive approach, which means mindless computation is de-emphasized.

a lot of people learning calculus for the physical sciences (or even, gasp! biology), will still encounter, even if at a somewhat less-intense level, functions of more than one variable. if limits for single variables seem intractable at times, multivariate limits can get downright funky. and epsilon-delta views of things really do help wade through the fog.

Well, the tricky limits for multivariable functions that I am aware of are the ones I saw in my analysis class in undergrad. You may have a point here, but you'd have to investigate whether this really comes up in applications or not and whether it's really necessary to use delta epsilon or is it enough to have counter-examples and intuition. From what I've seen of physics and engineering, I never had to take any weird multivariable limits, and being a student often involves more theory. But my experience is limited.

The only completely reliable way is to be rigorous, but rigor isn't some kind of magic thing that automatically prevents all mistakes, either. The only way in which it is completely reliable is if you don't make any mistakes (when you just work intuitively, it may not always be 100% clear whether you have made mistakes or not). So, it's slightly circular to argue that it's going to prevent mistakes, especially given that most non-mathematicians find it very difficult. So, part of the problem is that, normally, it's a moot point whether you teach epsilons and deltas or not, because most of the students don't get it, and even if they do, they may soon forget.

there's a certain sort of give-and-take that occurs in math: if the definitions are hard, the theorems are easy, and vice versa (note: there are counter-examples). i think the topological concepts (and the geometric intuitions you can use along with them) are given short shrift when limits are "glossed over".

That's a different issue. It may be correlated with, but is not caused by, glossing over the precise definition of a limit. As I said, we're talking about something very specific, namely, deltas and epsilons--that is maybe one section of the textbook and 1 or 2 lectures.

but sure...for an average calculus student, it's straight from the quadratic formula and a little trig to...wtf? open set? least upper bound? where did my nice algebra that i was finally getting the hang of go? what's an "existential quantifier" and why is it bad if i get the "for all" and the "there exists" in the wrong order?

Yes, if you're going to do it, do it right. Don't slam them with things they are unprepared for. If the students are miserable and don't learn it anyway, all you will accomplish is to build resentment.

the thing is: a little density can go a long way. topological methods are starting to show up in the most unlikely of places: statistical analysis (like demographic analysis for advertisers), security system design, biological classification. a metric is a powerful concept, and it's useful for more than just "number-crunching" types of math.

It takes time to learn all that. There are different kinds of engineers. Some engineers like math. If they want to get a double major, that is great. I recommend it. I am not discouraging anyone from pursuing math to whatever degree they wish. If they are really mathematical, then they can pursue an academic career in control theory and be just like a mathematician. All I'm saying is, if they aren't interested in it, it's optional whether they want to do epsilons and deltas. That's not quite true for everything. They have to do certain things in order to be good at what they do, and they may not like 100% of what they have to do. They have to decide whether the bad parts of the career outweigh the good, and if so, they can do something else.

khan academy has some really nice videos where he breaks down visually where the epsilon and the delta come from. when you see it graphically, it makes more sense. i'll admit, there's a certain dryness and lack of humor permeating calculus, all of a sudden it's: stuff just got real.

That's a good point. Even though I actually have a very intuitive understanding of the epsilon-delta definition, perhaps, in my manner of speaking, I have set up a false dichotomy between that stuff and intuition (though I was well aware of this all along). Epsilons and deltas have their own intuition.

f(x) = 1 - (x^2)^(1/3) was what i meant. silly me.

Yeah, not differentiable everywhere (it fails at x = 0). You can see it visually if you plot it. And the plot is understandable. You don't have to take the computer's word for it.