# What's the purpose of Epsilon proofs for limits?

1. Nov 10, 2011

### flyingpig

In all the problems I have done so far, the limit was already given. So the goal is to utilize the theorem to see whether the limit really holds.

But what's the point? If we already know how to find the limit, why must we go through a process of ingenious algebra just to tell ourselves, "okay, it works, and it only took us [insert a very long time]"?

Are there people who are that skeptical? Yes I realize the math was created before computers, but what's the point of having it now? If you seriously need a theorem with fancy words and symbols to confirm that your limit is right, doesn't that mean you are just unsure of what you are doing?

I mean, I finally understood how to use the epsilon definition (and its true meaning, thanks Mark44), but it looks like I am just computing limits backwards with this method.

2. Nov 10, 2011

### Number Nine

Epsilon-delta is the definition of a limit. All those handy rules you learned were derived from epsilon-delta arguments. It's somewhat important.
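For reference, the standard statement of that definition:

$$\lim_{x \to a} f(x) = L \quad \iff \quad \forall \varepsilon > 0 \ \exists \delta > 0 : \ 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon$$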

3. Nov 10, 2011

### flyingpig

I thought the limit laws were, but not actually the results.

4. Nov 10, 2011

### Stephen Tashi

Yes, it does. And being unsure is the proper outlook if what you are doing is guessing limits "just by looking at them" or using numerical computations. If people could do that reliably then there would be no need for rigorous proofs. Experience has shown that people cannot reliably determine limits "by inspection" or by computation. Hence the need for proofs. Teaching material does use simple examples. Most textbook exercises ask students to solve things that the World has already solved. That doesn't imply that the solution techniques aren't needed to do more sophisticated problems.
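As a concrete illustration (my own toy example, a standard floating-point pitfall): evaluate $(1 - \cos x)/x^2$ numerically near 0. The true limit is $1/2$, but a calculator working in double precision eventually reports 0.

```python
import math

# True limit: (1 - cos x) / x^2 -> 1/2 as x -> 0.
# Naive floating-point evaluation tells a different story for small x.
def naive(x):
    return (1 - math.cos(x)) / (x * x)

for x in [1e-2, 1e-4, 1e-6, 1e-8]:
    print(x, naive(x))
# At x = 1e-8, cos(x) rounds to exactly 1.0 in double precision,
# so the computed quotient is 0.0, nowhere near the true limit 0.5.
```

The culprit is catastrophic cancellation: once $1 - \cos x$ falls below the spacing of doubles near 1, the subtraction returns exactly zero.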

5. Nov 10, 2011

### homeomorphic

They were derived that way in the sense that someone did it that way (Weierstrass or Cauchy or someone like that), but they were not INVENTED that way. We had most of those rules before we had epsilons and deltas.

He's quite right to ask what the point is. In fact, it took a lot of effort on Weierstrass's part to convince the mathematical community that it was needed. For a long time after calculus was invented, no one thought of epsilons and deltas. Cauchy started doing things almost that way, maybe early 1800s, and Weierstrass put it in the form that we see today a little later. Calculus was invented in the 1600s! It wasn't at all obvious that epsilons and deltas were needed back then, and it isn't any more obvious to students studying calculus today. In no way should we just take it for granted that epsilons and deltas are needed because my daddy said so and my daddy is a mathematician.

I suggest reading *A Radical Approach to Real Analysis* (Bressoud). You don't have to read the whole thing, but the first couple of chapters get into this. You might take the history with a grain of salt, but I think the thrust of it is right.

Part of the reason for using the formal definition is to put things on a firmer footing (maybe more for proving more results than for showing things that were previously pretty clearly true, even without 100% rigor). But, maybe a stronger reason in some ways, is that mathematicians ran into problems when just using their intuition. They ran into situations where it wasn't clear what was going on, so they had to be a bit more careful in order to sort it out. Things like differentiating series term by term. When is it okay to do that? In a large number of examples, it doesn't cause any trouble. But, if you look at the right example, you start running into problems if you aren't rigorous. Things go wrong. So, you have to analyze what exactly goes wrong and what goes right when it does work. There's a reason to be skeptical. You can find examples that FORCE us to be so skeptical.
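A concrete instance of the term-by-term trouble: the series

$$f(x) = \sum_{n=1}^{\infty} \frac{\sin(n^2 x)}{n^2}$$

converges uniformly (each term is bounded by $1/n^2$, so the Weierstrass M-test applies) and hence defines a continuous function, but differentiating term by term gives $\sum_{n=1}^{\infty} \cos(n^2 x)$, which at $x = 0$ is $1 + 1 + 1 + \cdots$ and diverges. So the term-by-term "derivative" cannot be trusted without justification.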

And we may not have been so sure of ourselves before, but now equipped with our epsilons and deltas, we CAN be sure of what we are doing.

So, it's a tool for exploring new territory, and, for dealing with some tricky problems, it can be a necessity.

I personally don't believe in introducing epsilons and deltas in ordinary introductory calculus class. It's a topic for math majors or for other people who just happen to need it.

6. Nov 10, 2011

### pwsnafu

Consider the integral

$$\int_0^\infty \cos(2x) \prod_{n=1}^{\infty} \cos(x / n) \, dx = 0.392699081698724154807830422909937860524645434187231595926812285162\ldots$$

Now get out a calculator and compute $\frac{\pi}{8}$. Notice something? (The two agree to more than 40 decimal places, yet they are not equal.)

Oh, definitely. Specifically, it required his Weierstrass function (http://en.wikipedia.org/wiki/Weierstrass_function). It's a pity we don't teach the history of mathematics in a math course. We do it for chem and phys.

7. Nov 10, 2011

### Number Nine

The notion of a limit predates epsilon-delta, but the development of modern analysis/calculus was impossible without a rigorous definition. If your toolbox contains only the familiar Calculus I limit rules, you're going to be very limited.

8. Nov 10, 2011

### homeomorphic

I respectfully disagree. In fact, I can recall a certain interview in which a Nobel Prize winning physicist quoted one of his physics profs as accusing the math professors of "wasting your time with epsilons and deltas", and sympathized with this view himself.

Very limited by a mathematician's standards, yes. If you want to go on in math, and sometimes in other areas that are very mathematical, you have to learn about delta-epsilon proofs. But you can do a lot without them. How many practicing engineers in the field know about epsilon-delta proofs? I'm sure there are some out there who have uses for them, but on the whole, I think it's unlikely that most of them know them very well. Yet they are the ones responsible for the most practical stuff that keeps our society running.

I almost got a degree in EE before switching to math. I saw deltas and epsilons in high school and never saw them again until I changed my major to math. If I had continued down that road, I probably still would have never seen them (unless I went to grad school). And I don't think I would have been terribly hurt as an engineer by it.

Also, you said, "those rules you learned". Most of what you learn in basic calculus can be argued for without epsilons and deltas, and that's the way it was originally.

9. Nov 11, 2011

### Stephen Tashi

It would be interesting to know what rules existed at what times in history. For example, in Augustus De Morgan's *The Differential and Integral Calculus* (1842), I don't see any statements corresponding to the commonly used theorems about limits.

10. Nov 11, 2011

### Stephen Tashi

You could also ask how many practicing engineers remember anything about calculus. I think a large number rely on knowing how to use certain software, tables in books, etc. The main intellectual weakness that epsilon-delta proofs bring out is a person's inability to deal with logical quantifiers. I suppose one may get through a non-mathematical life without that skill. But that's true of most specialized skills, isn't it?

11. Nov 11, 2011

### homeomorphic

Interesting question. But I would be shocked if someone like Euler, say, wasn't aware of them. It may have been considered too obvious to even write down. Of course, all the differentiation rules were known and they do implicitly involve limits and limit rules. So, that's where my statement comes from.

Yes. And that's just what I'm saying. Delta-epsilon proofs are kind of a specialized skill. Not VERY specialized, but enough that only mathematicians or other people doing fairly mathematical things need to know about it. It's very broadly useful in mathematics, though--any mathematician has to know it.

Engineers probably do use too many crutches for their own good. Even so, delta-epsilon proofs are probably not terribly high on the list of things they should know.

You can always argue that anything intellectually challenging is useful to learn, just because it is intellectually challenging. But there are an infinite number of intellectually challenging things to choose from. Why focus on delta-epsilon proofs? If you want to teach logical quantifiers, that's logic.

12. Nov 12, 2011

### Deveno

hmm. well, it seems to me that, for the most part, people use calculus on continuous (and, even better, differentiable) functions. i mean, if your function's not differentiable, what's the point of trying to find the derivative it doesn't have?

both of these notions depend on limits. if i were to ask you "how do you tell if a function is continuous", well, how do you do it?

sure, we can establish the continuity of some functions without too much trouble, and as long as we stay in that little family of functions, we can forget about how we established they were continuous in the first place (the "only need to prove something once motto").

but even with functions commonly encountered in (oh say) engineering applications (which typically involve more than just one variable, since the world apparently is, um, don't hit me, manifold), it's not immediately clear at times if a function is continuous or (hopefully) differentiable at a point of genuine interest to us. we can just use our usual bag of tricks, and hope for the best, but that seems a poor way to advance our knowledge of things.

now, to be honest, the essential ideas aren't really dependent on epsilons and deltas, but on more general notions, topological ones. as awful as it sounds, epsilon-delta proofs are a sort of kindness, allowing us to use properties of measurement to estimate things. things get MUCH worse when you no longer have a way of measuring "apartness".

the basic algebraic properties of limits exist only because the real numbers form a topological field. we're lucky to have such a thing; most things you might analyze do not possess such a wealth of structure. furthermore, we have a metric. yes, it's more intuitively clear to say "f is continuous iff a near the set S implies f(a) near the set f(S)", but unless we can measure "nearness", how does one go about showing that any function at all has this property? doing calculus on a vacuous set of functions seems self-defeating, a lot of work for nothing.
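for comparison, the two levels of generality side by side (standard definitions): in a metric space, f is continuous at a iff

$$\forall \varepsilon > 0 \ \exists \delta > 0 : \ d(x, a) < \delta \implies d(f(x), f(a)) < \varepsilon,$$

while in a general topological space, where there is no way to measure nearness, continuity becomes: f is continuous iff $f^{-1}(U)$ is open for every open set $U$.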

let's take a broader view: one of the many uses of calculus is in finding approximations to things that might be complex to calculate directly. and a description of a function's qualitative behavior might require computing many values. but if it's locally "going up", that's helpful: a first-order approximation is much better than knowing nothing at all. to use these intuitions, we quite properly ought to be sure that a small perturbation in input doesn't ruin our results. that's epsilon-delta for you: epsilon is error, and delta is deviation (of input; it would get confusing to use the same letter for both).
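here is a small numerical sketch (my own toy example) of "epsilon is error, delta is deviation" for $f(x) = x^2$ near $x = 2$, using the standard choice $\delta = \min(1, \varepsilon/5)$:

```python
def f(x):
    return x * x

def delta_for(eps):
    # if |x - 2| < 1 then |x + 2| < 5, so
    # |f(x) - 4| = |x - 2| * |x + 2| < 5 * |x - 2|.
    # hence delta = min(1, eps / 5) forces the error below eps.
    return min(1.0, eps / 5.0)

# check: every sampled deviation below delta keeps the error below eps
for eps in [1.0, 0.1, 0.001]:
    d = delta_for(eps)
    for k in range(-999, 1000):
        x = 2 + d * k / 1000.0   # |x - 2| < delta
        assert abs(f(x) - 4) < eps
    print(f"eps={eps}: delta={d} verified on sampled points")
```

of course, sampling points proves nothing; the one-line algebra in the comment is what actually guarantees the bound for every x.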

so: should an average student prove everything in terms of delta-epsilon methods? no, of course not. should they at least understand the logic involved? certainly.

i, for one, am strongly against teaching people limits sloppily. someone who doesn't understand what this means spatially, is not the person i want teaching physics at my school, nor mathematics, for that matter. the idea requires a certain intellectual maturity to master, and that is in some sense unfortunate, but "cookbook math" isn't math at all, it's cooking.

limits are a subtle concept. hell, real numbers are a subtle concept. things in math aren't true because someone wrote them down in a book somewhere. books contain misprints, even, i daresay, integral tables that might be used by an engineer calculating safety parameters on a highway bridge. mathematical theorems are true because they ARE (not in an epistemological sense, but in the contingent sense of: given this structure, it behaves like that).

if one wants to learn to evaluate integrals, i can suggest a number of handy references, and they can skip the calculus class entirely. or just use mathematica, or whatever. if one wants to learn calculus, then skipping over limits, their definition, and what they mean, isn't just "skipping dessert", it's missing the main course.

13. Nov 12, 2011

### homeomorphic

Taking deltas and epsilons out has nothing to do with "cookbook". Cookbook means you learn by copying. I'm not 100% against cookbook at all times. I'm just generally against it. The thing is, I understood limit laws intuitively perfectly well when I first studied calculus. They were just intuitively obvious. It wasn't that I was taking anything on faith. Now that I know how to prove everything rigorously, I haven't just thrown out all the intuition. The rigor is something I learned in addition to it, not as a replacement. When I studied calculus, I didn't see the point of deltas and epsilons. I didn't see the point at all. And this is coming from someone who now eats deltas and epsilons for breakfast every day (actually, I'm a topologist, so it's not every day, but I have studied a fair amount of analysis, too). Of course, it was badly taught, I think.

We are so immersed in our math world, it's easy to forget what it's like to be outside of that world. It seems I have less difficulty doing that than some, perhaps owing to my engineering background. And I was one of the engineering students with the deepest understanding, and that is what led me to math. But understanding for engineers, I think, should have more to do with intuition. Also, I have tried to teach students about epsilons and deltas (in tutoring and recitation--I haven't taught my own class), and that also might prompt me to have more realistic demands on the average calculus student. If someone wants more detail, what's stopping them from taking a course in real analysis? That's exactly what I did.

I'm not so sure it's necessary to have the formal definition in order to understand things like this.

I think we agree here, but I put more value in intuition. We mathematicians are the ones whose job it is to split hairs over whether everything is completely correct and rigorous. Engineers and physicists should be more free to use their intuition (and, actually, I think intuition is just as important for mathematicians, but it has to be backed up by proof). Division of labor is important.

I never said skip limits, just the FORMAL definition for students learning it the first time. I'm not sure how I would teach calculus. You can understand calculus fairly well without doing it 100% rigorously. Just take a look at people like Euler and Lagrange, completely in ignorance of deltas and epsilons. Even Gauss, Riemann. Were they cookbook mathematicians, just because they didn't know about epsilons and deltas? Even Cauchy sort of thought that way, but he didn't say it explicitly like we do now.

14. Nov 12, 2011

### Stephen Tashi

That look isn't relevant in deciding what to teach the average calculus student. If you only look at people like Gauss and Riemann, you could say we should let the students teach the class. If a calculus teacher can take a non-rigorous approach and give students the same capabilities as Gauss, that would be great. But does any teacher know how to systematize the skills he had?

If there is a course in subject X, then the wise thing to do is pick the intellectually challenging material from the topics that are fundamental to understanding the subject. If one could predict each student's future, and if one had the resources to tailor a calculus course for each student, then I agree that some of the custom courses could omit epsilons and deltas, but some could omit infinite series, trigonometric functions, etc.

Whether to continue the current teaching of epsilon-delta definitions and proofs should be discussed in the context of how they actually are taught. Although the concepts are often a problem for students, if you look at the material from a mathematical perspective, most textbooks don't ask them to do proofs that are especially difficult. Furthermore, the type of logical quantification involved isn't that difficult either. I think the students get slammed by having to deal with a number of relatively simple but new ideas all at the same time. They face:

1) understanding logical quantifiers
2) understanding the definition of a phrase ("the limit of f(x) as x approaches a is equal to L") instead of the definition of a single word ("limit")
3) having to deal with the legalistic definition of something rather than their intuitive understanding
4) manipulating inequalities instead of solving equations

I think all these topics should be introduced earlier in students' mathematical education instead of being postponed until calculus.

15. Nov 12, 2011

### mathwonk

epsilon delta methods are needed to give meaning to the word "limit". without them you are just hoping your crude intuition about them is correct and agrees with that of other people. if you look back in history you do find less precision about these words, but you also find that different people meant different things by them. continuity for example used to mean the property now expressed by the intermediate value theorem, but that differs from the meaning of epsilon delta continuity.

the reason the epsilon delta definition of continuity and limits is needed is so you can actually verify that certain limits exist. i.e. it gives you a concrete way to check the truth of what you are being told. you do not seem to value that, but prefer to just believe what you are told. this lack of skepticism and intellectual curiosity disqualifies you from being a mathematician, but i do not know if it does so from being an engineer. i tend to think top engineers also want to understand what they are doing and why the rules given them are correct.

16. Nov 12, 2011

### homeomorphic

I was not saying the average calculus student is the same as Gauss, obviously. I was only pointing out that it is perfectly possible to have a good understanding of calculus and be somewhat non-rigorous.

The material cannot be TOO intellectually challenging. I'm not saying that things need to be dumbed down, but making things any harder than necessary just results in inefficiency in learning. On his website, 't Hooft (the Nobel laureate) gives advice on how to learn physics. He purposely picked material that is not very pedagogical because he thinks that may be a good way to be challenged. There are many disadvantages to this approach. Given the sheer volume of math and physics out there to learn, my favored approach is to make it possible to learn as quickly and easily as humanly possible. You can always give very challenging problems along the way that the students may not even be able to solve. But my intuition is that the sweet spot is material just out of the students' comfort zone. Just a little bit. Not a lot.

We can't tailor a course for each student, but we can offer more than one type of course. If you take honors calculus, you asked for it. But even then, as I said, you have to look at the typical student in the class and it should be only a little out of their comfort zone.

That's a good analysis of the difficulties for students. If you wanted to teach it, maybe it would be better to try to focus on just one of those difficulties at a time.

Actually, if you think about it, the delta-epsilon approach usually IS cookbook, in a way. That's what started this whole thread. They just throw the definition at you and expect you to just accept it. But WHY do we need it? Definitions are not relevant just because they are written in a book, any more than statements are true just because they are written in a book.

17. Nov 12, 2011

### homeomorphic

Well, the epsilon-delta definition does imply the intermediate value theorem. And I'm not sure if people really meant the intermediate value property, historically. I think they had more of an intuitive idea of an unbroken curve or something like that.

Do I really need deltas and epsilons to be sure that the limit of x^2 as x approaches 2 is 4?
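(For the record, the proof in that case is only a line anyway: given $\varepsilon > 0$, choose $\delta = \min(1, \varepsilon/5)$, and then

$$0 < |x - 2| < \delta \implies |x^2 - 4| = |x - 2|\,|x + 2| < 5\,|x - 2| < 5\delta \le \varepsilon,$$

since $|x - 2| < 1$ forces $|x + 2| < 5$.)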

I have nothing against deltas and epsilons, but I don't think that is quite what they are for. Calculus was quite successful for around 200 years working without this level of rigor. Working formally is no guarantee of correctness; in fact, the reverse is true. I heard Mike Freedman say a few months ago that, for him, a proof has to be somewhat intuitive, because that makes it easier to spot errors. Yes, intuition makes it EASIER to spot errors.

If I sound like I am radical in terms of my "lack of rigor", you haven't seen the proofs I write. They are just like everyone else's proofs. And here, I'm speaking, not about how I personally do things, but how an engineer, for example, should do them.

Not sure if you are talking about me or the OP, actually. And in either case, I disagree completely. If anything, I am very very extreme in NOT believing what I am told, to the point where it may even cause me problems because I never like to accept what other people say, and I have to spend endless time trying to come up with my own version of everything that I learn.

I write rigorous proofs, the same as everyone else. When I do posts on here, I am usually very informal. I'm perfectly capable of doing everything rigorously.

If you are referring to me, then I'm afraid you are mistaken. If anything, I am too busy trying to understand everything to be able to learn enough to do good research. Other people may be getting ahead of me because they are more willing to take results that other people proved and run with them. So, if I am disqualified to be a mathematician, it's precisely the reverse.

If you are referring to the OP, then, I also disagree. He may just need to study more math to be able to appreciate deltas and epsilons.

I think so, too. But, it's not always in a mathematical way. I think engineering is often based on non-mathematical practical experience. But, I was only a student of engineering, not a practicing engineer.

18. Nov 12, 2011

### Deveno

if there is a fault in the way calculus is often taught, it's "too much, too soon". it's like the common confusion over the value of a function, and the NAME of a function: the value of a limit isn't what a limit IS.

that type of confusion between the definition of a limit, and the value of a limit, leads people to conclude things like:

0*∞ = 1, because 1/∞ = 0

or:

$$\lim_{n \to \infty} (1+\frac{1}{n})^n = 1$$

since 1+1/n → 1, and 1 "to the anything" is 1.
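a direct computation (easy to reproduce) shows what the limit actually is:

```python
import math

# naive reasoning: 1 + 1/n -> 1, and "1 to the anything" is 1.
# direct computation says otherwise: the limit is e = 2.71828...
for n in [10, 1000, 10**6]:
    print(n, (1 + 1 / n) ** n)
```

the values creep up toward e, not 1: the base tends to 1, but the exponent grows fast enough to matter. that tension is exactly what the limit definition is built to resolve.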

in calculus, certain qualitative facts come to the fore: if a continuous function is positive at one point, and negative at another, we reason it has to cross the x-axis "in-between". so, we get knowledge about, say, a polynomial function, which might be impossible to obtain algebraically.

not only is this useful, for calculating things that involve really ugly expressions, but opens up a whole new approach to solving hard problems. we might not have to figure out what f(x) actually IS, if we find some simple expression that bounds it.

calculating sin(x)/x, for really small x, is an exercise in torture. approximating it by 1, is deeply satisfying, in a way that is not-so-pretty to express explicitly.
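to put numbers on it:

```python
import math

# sin(x)/x for small x: the first-order approximation is just 1,
# with error on the order of x^2 / 6 (from the Taylor expansion of sin)
for x in [0.1, 1e-3, 1e-6]:
    print(x, math.sin(x) / x)
```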

i agree, that more treatment of inequalities should be done FIRST. especially in dealing with absolute value. students struggle with it, because before, everything was: this is that. now it becomes: this might be that, or this might be the other thing. a lot of the trouble with calculus, is "pre-calculus". calculus is like a mountain, it's a harsh mistress, and takes a bit of wooing. people get thrown into it, before they're ready. why do we need sup's and inf's? what is it about the rational numbers that makes them inappropriate for calculus? these things are rarely addressed before-hand, and people have to hit the ground running.

not everyone who takes calculus, will go on to take real analysis, or differential equations, or any of the other higher branches of math that build on what calculus starts. especially for these people, some illumination on the background machinery, is necessary. yes, logic can be hard, with its own quirks and strange language. yes, inequalities are not like equations. yes, computing limits is a path fraught with peril. we need not, nor should we, sugar-coat these things.

19. Nov 12, 2011

### flyingpig

But there is no need. Isn't IVT one of the "existence theorems"? Why would it need something else to imply it exists?

20. Nov 12, 2011

### homeomorphic

I'm not sure what you are trying to say here. In some sense, maybe it's not necessary. After all, it's obvious that if a function is continuous (in the usual, informal sense of the word), then if it attains the value a and the value b, it has to attain every value in between.