How Many Points Are Needed to Apply the Delta-Epsilon Definition of a Limit?

  • Thread starter: mrxtothaz
  • Tags: Proofs
mrxtothaz
I am in the process of learning limits and there are a few things I would like to ask.

1) In order to apply the limit definition, you can't just have one point because there is no notion of 'approaching' a limit.

I would like to play around with the limit concept by understanding some of the boundaries of the definition. What are the minimum conditions to be satisfied in order for the limit definition (epsilon-delta, more specifically) to be applied?

Surely there must be some interval about a point; but can you just have 2 points? 3? Can the points be discrete?

2) Whenever I search the internet for help with delta-epsilon proofs, I come across a lot of material on finding a delta in terms of a given epsilon (for a variety of functions). But is there anything I can find (whether in book form or on the internet) that shows usage of the delta-epsilon definition in a more proof-centric, analytical way?

We are told that delta-epsilon is rigorous and its importance is heavily stressed; yet in all the major textbooks I have looked through, such proofs are confined to the sections on limits (and sometimes continuity), after which the standard limit notation is adopted. This may not be unreasonable: delta-epsilon arguments help to conceptualize the notion of a limit, and once our understanding of limits has advanced, the same thing can often be communicated with simpler notation. But surely there must be something available showing delta-epsilon versions of proofs that don't typically use the notation (since the major proofs seem to be standardized across textbooks), however tedious that may be. This inquiry was prompted by two instances in my book where the author gave a proof and left it to the student to supply a delta-epsilon argument. This is something I'm still unclear about and for which I seek instructive examples. Below are the relevant proofs from my textbook, in case someone can help:

1st Example:
http://imgur.com/OQlT4.jpg
2nd Example:
Statement of Theorem - http://imgur.com/uZNoT.jpg
Proof of Theorem - http://imgur.com/A4h1s.jpg
 
If your domain is just a finite set of points or something, then you can still apply the delta-epsilon definition of a limit. You have what's known as a metric space; basically a set of points with a distance measurement. The real numbers are a special example, and obviously any subset of the real numbers is also a metric space. The definition of a limit can be applied to any metric space, and many (but not all) of the theorems you get for the real numbers generalize to any metric space.
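As a concrete (purely illustrative) companion to the definition, here is a small Python sketch that numerically probes the epsilon-delta condition for f(x) = x² near a = 2, where the limit is L = 4. The function name `check_delta` and the sampling scheme are my own invention for this sketch; sampling finitely many points is of course not a proof, just a sanity check that a candidate delta behaves as expected.

```python
# Sketch: numerically probing the epsilon-delta condition for
# f(x) = x^2 near a = 2 (limit L = 4). Sampling is not a proof --
# it only spot-checks that a candidate delta seems to work.

def check_delta(f, a, L, eps, delta, samples=10_000):
    """Return True if |f(x) - L| < eps for every sampled x with 0 < |x - a| < delta."""
    for i in range(1, samples + 1):
        offset = delta * i / (samples + 1)      # offsets strictly inside (0, delta)
        for x in (a - offset, a + offset):
            if abs(f(x) - L) >= eps:
                return False
    return True

f = lambda x: x * x
# A standard choice near a = 2: delta = min(1, eps / 5), because
# |x - 2| < 1 forces |x + 2| < 5, so |x^2 - 4| = |x - 2| * |x + 2| < 5 * delta <= eps.
eps = 0.01
delta = min(1, eps / 5)
print(check_delta(f, 2, 4, eps, delta))  # True
```

The inequality chain in the comment is the same algebra one would write in a pen-and-paper delta-epsilon proof; the code merely samples it.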

If you have a discrete domain, then you get what's known as the discrete topology (a topology is the collection of open sets of a space). In this case every subset of your domain is open, and every function on it turns out to be continuous. Not a very interesting example.
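The claim that every function on a discrete domain is continuous can be made concrete with a short sketch (the specific domain and function values below are arbitrary, chosen just for illustration): pick delta smaller than the smallest gap between points, and the only point within delta of a is a itself, so the epsilon condition holds vacuously.

```python
# Why every function on a discrete domain is continuous: choose delta
# smaller than the minimum gap between points. Then the only domain
# point x with |x - a| < delta is a itself, so |f(x) - f(a)| = 0 < eps
# for any eps > 0.

domain = [0.0, 1.0, 2.5, 4.0]            # an arbitrary discrete subset of the reals
f = {0.0: 7, 1.0: -3, 2.5: 100, 4.0: 0}  # an arbitrary (wildly jumpy) function on it

min_gap = min(b - a for a, b in zip(domain, domain[1:]))
delta = min_gap / 2

for a in domain:
    near = [x for x in domain if abs(x - a) < delta]
    assert near == [a]                                # only a itself is within delta
    assert all(abs(f[x] - f[a]) == 0 for x in near)   # so any eps > 0 works
print("every function on this discrete domain is continuous")
```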

For your first example the question is why \lim_{h\to 0} f'(\alpha_h) = \lim_{x\to a} f'(x). Let's consider the definition of each.

We know that \lim_{x\to a} f'(x)=L for some L. This means that for all \epsilon > 0, there exists \delta > 0 (depending on \epsilon) such that 0 < |x-a|<\delta implies |f'(x)-L|<\epsilon.

Now if you give me \epsilon, I claim that if |h| < \delta for the \delta we found above, then |f'(\alpha_h)-L|<\epsilon. This is because \alpha_h\in (a,a+h), so 0 < |\alpha_h-a|<|h|<\delta; and we know from the limit of f'(x) above that if |\alpha_h-a|<\delta, then |f'(\alpha_h)-L|<\epsilon (taking x = \alpha_h above).

The objective of the delta-epsilon definition is really to stop using it as soon as possible. It's arduous even for some fairly basic arguments, and once the basic theorems about limits and continuity are established, you want to avoid it whenever you can. This is true in a lot of fields of mathematics: the starting point in terms of definitions is very small, and you build up to better machinery that's easier to use. Then you fall back on the original definitions only when absolutely necessary.
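To make the squeeze on \alpha_h concrete, here is a small numerical sketch. I assume f(x) = x^2 and a = 3 (my own illustrative choices, not from the textbook proof); for this f the mean value theorem point can be solved exactly, \alpha_h = a + h/2, since f(a+h) - f(a) = (2a + h)h = f'(a + h/2)\,h.

```python
# Concrete instance of the argument above, assuming f(x) = x^2 and a = 3.
# The mean value theorem gives f(a+h) - f(a) = f'(alpha_h) * h with
# alpha_h in (a, a+h); for x^2 this solves exactly to alpha_h = a + h/2.
# Since |alpha_h - a| < |h|, alpha_h is squeezed to a as h -> 0, so
# f'(alpha_h) -> f'(a) = L.

a, L = 3.0, 6.0          # f'(x) = 2x, so L = lim_{x->a} f'(x) = 2a = 6
fprime = lambda x: 2 * x

for h in (0.1, 0.01, 0.001):
    alpha_h = a + h / 2                  # exact for f(x) = x^2
    assert abs(alpha_h - a) < abs(h)     # alpha_h is trapped inside (a, a+h)
    print(h, abs(fprime(alpha_h) - L))   # the error shrinks along with h

# Given eps, the delta that works for lim_{x->a} f'(x) also works here,
# because |alpha_h - a| < |h| < delta forces |f'(alpha_h) - L| < eps.
```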
 