Are Delta and Epsilon Formulas Universally Applicable for Polynomial Limits?

  • Thread starter: Orion1
  • Tags: Limits

I have made some observations regarding the Precise Limit Definition.

For any given polynomial:
ax^n

The solution for delta is:
\boxed{\delta = \frac{\epsilon^{\frac{1}{n}}}{|a|}}

The solution for epsilon is:
\boxed{\epsilon = (|a| \delta)^n}

My Calculus textbook determines the values for delta and epsilon experimentally, based on the particular function in the numerator; however, these equations have worked for every problem that was assigned to me.

Are these solutions correct?

Is it possible that there is a theorem that determines the solutions for ALL deltas and epsilons? :rolleyes:
 
Filling in the gaps, you mean that if |x|<d then |ax^n|<e (d and e being delta and epsilon)? Well, as written, what you wrote is incorrect; just rearrange your second formula: you need to take the nth root of |a| as well, not just of e.
 

Here is a typical problem given in class and already corrected by my Calculus professor:

\lim_{x \rightarrow 3} \frac{x}{5} = \frac{3}{5}
\left| \frac{x}{5} - \frac{3}{5} \right| < \epsilon \; \text{when} \; 0 < |x - 3| < \delta
\frac{1}{5} |x - 3| < \epsilon \Rightarrow |x - 3| < 5 \epsilon
\delta = 5 \epsilon
\text{Let} \; \epsilon > 0 \; \text{when} \; \delta = 5 \epsilon
\text{If} \; 0 < |x - 3| < \delta \; \text{then} \; \left| \frac{x}{5} - \frac{3}{5} \right| = \frac{1}{5} |x - 3| < \frac{\delta}{5}
\boxed{ \frac{\delta}{5} = \frac{1}{5} (5 \epsilon) = \epsilon}
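The worked example can be spot-checked numerically; this is a small Python sketch of my own, not part of the professor's solution:

```python
# Spot-check of the class example: for f(x) = x/5 and x -> 3, delta = 5*eps
# guarantees |x/5 - 3/5| < eps on the punctured interval 0 < |x - 3| < delta.
eps = 0.02
delta = 5 * eps

# sample points strictly inside the punctured interval
xs = [3 + delta * t / 1000 for t in range(-999, 1000) if t != 0]
assert all(abs(x / 5 - 3 / 5) < eps for x in xs)
print("delta = 5*eps works for eps =", eps)
```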

Here is my polynomial theorem:
\lim_{x \rightarrow a} f(x) = L
|f(x) - L| < \epsilon \; \text{when} \; 0 < |x - a| < \delta
|f(x) - L| = |a_1(x - a)^n|
|a_1 x^n - L| < \epsilon \; \text{when} \; 0 < |x - a| < \delta
\delta = \left( \frac{\epsilon}{|a_1|} \right)^{\frac{1}{n}}
\text{Let} \; \epsilon > 0 \; \text{when} \; \delta = \left( \frac{\epsilon}{|a_1|} \right)^{\frac{1}{n}}
\text{If} \; 0 < |x - a| < \delta \; \text{then} \; |a_1 x^n - L| < |a_1| \delta^n
\boxed{|a_1| \delta^n = |a_1| \left[ \left( \frac{\epsilon}{|a_1|} \right)^{\frac{1}{n}} \right]^n = \epsilon}

 
Consider f(x)=x^2/2 and the limit as x tends to 0.

Given epsilon, *you* want to let d=2sqrt(e) to conclude that

when |x|<d then |x^2/2|<e

but that isn't true. What is true is that |x^2/2|<2e
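A numeric check of this counterexample (a Python sketch of my own, not from the thread) makes the factor of 2 visible:

```python
import math

def sup_near_delta(f, delta, samples=1000):
    """Largest value of |f(x)| sampled on 0 < |x| < delta."""
    return max(abs(f(delta * k / samples)) for k in range(1, samples))

f = lambda x: x**2 / 2
eps = 0.01

# Orion1's formula delta = eps^(1/n)/|a| gives 2*sqrt(eps) here (a = 1/2, n = 2):
d_bad = 2 * math.sqrt(eps)
# the correction: delta = sqrt(2*eps) subsumes the constant inside the root.
d_good = math.sqrt(2 * eps)

print(sup_near_delta(f, d_bad))   # approaches 2*eps, which exceeds eps
print(sup_near_delta(f, d_good))  # stays below eps
```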

Here are some more details:

Given e>0 you can by some algorithmic method work out d in terms of e for *some* polys as I will explain below.

Now, what your professor is doing (hopefully with more words on the board) is something for a specific type of polynomial (one where f(x)-L factors nicely), and you've picked the wrong generalization.

Firstly apply it to the case x^2/2 ie where a_1 fails to be 1.

If |x|<d then |x^2/2| <d^2/2.

if d^2/2<e we've got the result, ie d^2<2e (or d<sqrt(2e) if you like).

This shows that in the particularly nice case when f(x)-L = k(x-m)^n, if |x-m|<d then |f(x)-L|<kd^n, ie we need to let d=(e/k)^{1/n}

Notice how the constant is subsumed inside the n'th root not outside as you had it?

Now, in general what happens? Well, we can't be so specific.

Consider f(x)-L. Now, to prove this converges to zero as x tends to m, we are assuming x-m is a root of f(x)-L, ie that f(x)-L=(x-m)g(x) for some polynomial g. It so happens the cases you know are nice.

So how would we prove this actually converges to 0? The key is that g(x) is bounded near x=m. This bound *is* realizable in terms of the coefficients of g(x), which are realizable in terms of the coefficients of f and the number m. If you want a rigorous bound for this then you can use various tools such as:

|b_0 + b_1x +...+ b_rx^r|< |b_0| + |b_1||x| +...+ |b_r||x|^r

which in turn is less than (r+1)max{|b_i|}max{1,|x|^r}

The first max is in terms of the coefficients, and the second can be worked out since we can assume we are looking for a d<1, so that |x-m|<1, ie m-1<x<m+1. Let us assume this bound is M; then choose d less than e/M and we're done.

But it isn't as nice in general as what you want.
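The general recipe just described (factor f(x)-L as (x-m)g(x), bound g near m, then shrink d) can be sketched in Python. This is my own illustration, not code from the thread; `eval_poly`, `divide_by_root`, and `delta_for` are hypothetical helper names.

```python
def eval_poly(coeffs, x):
    """Evaluate f(x) = sum a_q x^q for coefficients [a_0, ..., a_n]."""
    return sum(c * x**q for q, c in enumerate(coeffs))

def divide_by_root(coeffs, m):
    """Synthetic division: since f(m) = L, f(x) - L = (x - m) g(x);
    return g's coefficients [b_0, ..., b_{n-1}]."""
    n = len(coeffs) - 1
    q = [coeffs[n]]
    for i in range(n - 1, 0, -1):
        q.append(coeffs[i] + m * q[-1])
    q.reverse()
    return q

def delta_for(coeffs, m, eps):
    """Pick delta as in the post: bound |g(x)| <= M on |x - m| < 1
    using |x| <= |m| + 1, then take d = min(1, eps / M)."""
    g = divide_by_root(coeffs, m)
    M = sum(abs(b) * (abs(m) + 1)**r for r, b in enumerate(g))
    return min(1.0, eps / M)

# Orion1's example f(x) = x^3/4 at m = 2 (which does NOT factor as (x-2)^3/4):
coeffs = [0, 0, 0, 0.25]
d = delta_for(coeffs, 2, 0.01)
# every x with 0 < |x - 2| < d now satisfies |f(x) - 2| < 0.01
```

For this example the division gives g(x) = 1 + x/2 + x^2/4, so M = 4.75 and d = 0.01/4.75. The bound is crude, but it is computed entirely from the coefficients and m, as the post says.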
 


Interesting; my Calculus textbook does not list a single example of a polynomial with a fractional coefficient raised to an integer power in this specific section, which explains a lot.

I will try again with a better random example using the new theorem.

\lim_{x \rightarrow 2} \frac{x^3}{4} = 2
a = 2 \; \; \; a_1 = \frac{1}{4} \; \; \; n = 3 \; \; \; L = 2
\left| \frac{x^3}{4} - 2 \right| < \epsilon \; \text{when} \; 0 < |x - 2| < \delta
\delta = \left( |4| \epsilon \right)^{\frac{1}{3}}
\text{Given} \; \epsilon > 0 \; \text{let} \; \delta = \left( |4| \epsilon \right)^{\frac{1}{3}}
\text{If} \; 0 < |x - 2| < \delta \; \text{then} \; \left| \frac{x^3}{4} - 2 \right| < \frac{\delta^3}{|4|}
\boxed{\frac{\delta^3}{|4|} = \left| \frac{1}{4} \right| \left[ \left( |4| \epsilon \right)^{\frac{1}{3}} \right]^3 = \epsilon}

Is this solution correct?
 
x^3-8 is not (x-2)^3, so what you've written is not correct. If you were actually looking at the polynomial \frac{(x-2)^3}{4} then what is there is 'the right idea', but I dislike the presentation. That is just personal: I prefer words to unmotivated symbols. And it should read

given e>0 let d=...

and last time I checked, 4 was a positive number.

Of course, the idea of carefully choosing delta such that something is less than *exactly* epsilon is flawed and should be discouraged in my opinion: it is distracting from what analysis is really saying. If you state: suppose X is less than d(>0), and then show Y is less than 2d^2, then that is more than adequate, since obviously d can be chosen so that 2d^2<e whatever e(>0) is.
 

It is a very good theorem; in fact, it predicts the solutions to every problem assigned in this section of my Calculus textbook, so I cannot dismiss it that easily.

Polynomial theorem:
\lim_{x \rightarrow a} f(x) = L
|f(x) - L| < \epsilon \; \text{when} \; 0 < |x - a| < \delta
|a_1 x^n - L| < \epsilon \; \text{when} \; 0 < |x - a| < \delta
\delta = \left( \frac{\epsilon}{|a_1|} \right)^{\frac{1}{n}}
\text{Given} \; \epsilon > 0 \; \text{let} \; \delta = \left( \frac{\epsilon}{|a_1|} \right)^{\frac{1}{n}}
\text{If} \; 0 < |x - a| < \delta \; \text{then} \; |a_1 x^n - L| < |a_1| \delta^n
\boxed{|a_1| \delta^n = |a_1| \left[ \left( \frac{\epsilon}{|a_1|} \right)^{\frac{1}{n}} \right]^n = \epsilon}

Any Calculus I students interested in disproving this polynomial theorem?
 
It isn't a polynomial theorem, is it? It only appears that you're saying 'it' 'works' if f(x)=ax^n, and you still seem to believe that x^n-k^n=(x-k)^n, which is a major flaw in the argument.

You could at least try rewriting it so that it makes more sense. What is f(x)? What is a_1? Is it that f(x)=a_1x^n? Usually a_1 would be the coefficient of x in f(x), but even that is a guess, since we don't know what f(x) is. What is the relationship between L, a, and a_1? Who knows?

It is very unclear what you're even trying to prove, and what you're assuming. As written, your statement applies to f(x)=x, a=0 and L=1, and you really can't be saying that as x tends to zero, f(x), which is just x, tends to 1. So there must be some more restrictions, such as requiring that a be a root of f(x)-L. Or are you assuming that f(x) tends to L as x tends to a? If so, what are you proving, and what are you proving it about? It's just completely impossible to decide what you're doing. Indeed, I've no idea what the statement of the 'polynomial theorem' is.

Try writing:

Theorem: STATEMENT OF THEOREM

Proof: STATEMENT OF PROOF OF SAID THEOREM

The best we can do is say: if f(x)=k(x-a)^n then we can prove from first principles that it tends to zero as x tends to a, since, given e>0, let d=(e/k)^{1/n}; then |f(x)|<e when |x-a|<d. But that is truly not a very hard theorem, is it?
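The 'nice case' in the last paragraph is easy to spot-check numerically; a small Python sketch of my own (not from the thread), using Orion1's numbers k = 1/4, a = 2, n = 3:

```python
# f(x) = k*(x - a)^n with d = (e/k)^(1/n): then |f(x)| = k*|x - a|^n < k*d^n = e
k, a, n, eps = 0.25, 2.0, 3, 1e-3
d = (eps / k) ** (1.0 / n)

# sample points strictly inside the punctured interval 0 < |x - a| < d
xs = [a + d * t / 1000 for t in range(-999, 1000) if t != 0]
assert all(abs(k * (x - a)**n) < eps for x in xs)
print("d =", d, "works for eps =", eps)
```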
 
This is inviting, Matt :biggrin:

matt grime said:
Now, in general what happens? Well, we can't be so specific.

Consider f(x)-L. Now, to prove this converges to zero as x tends to m, we are assuming x-m is a root of f(x)-L, ie that f(x)-L=(x-m)g(x) for some polynomial g. It so happens the cases you know are nice.

So how would we prove this actually converges to 0? The key is that g(x) is bounded near x=m. This bound *is* realizable in terms of the coefficients of g(x), which are realizable in terms of the coefficients of f and the number m. If you want a rigorous bound for this then you can use various tools such as:

|b_0 + b_1x +...+ b_rx^r|< |b_0| + |b_1||x| +...+ |b_r||x|^r

which in turn is less than (r+1)max{|b_i|}max{1,|x|^r}

The first max is in terms of the coefficients, and the second can be worked out since we can assume we are looking for a d<1, so that |x-m|<1, ie m-1<x<m+1. Let us assume this bound is M; then choose d less than e/M and we're done.

But it isn't as nice in general as what you want.

Theorem: All univariate polynomials with real coefficients are continuous.

Proof: Let f(x)=\sum_{q=0}^{n}a_qx^q

Then \lim_{x\rightarrow k}f(x)=f(k) since

\lim_{x\rightarrow k}f(x)=f(k)\Leftrightarrow\forall \epsilon >0,\exists \delta >0 \mbox{ such that }|x-k|<\delta \Rightarrow \left| f(x) - f(k)\right| <\epsilon

\left| f(x) - f(k)\right| = \left| \sum_{q=0}^{n}a_qx^q - \sum_{q=0}^{n}a_qk^q\right| = \left| \sum_{q=1}^{n}a_q\left( x^q - k^q\right) \right| = \left| x - k\right| \left| \sum_{q=1}^{n}a_q\sum_{r=0}^{q-1} x^{q-r-1}k^{r} \right| \leq \left| x - k\right| \sum_{q=1}^{n} |a_q| \sum_{r=0}^{q-1} \left| x^{q-r-1} k^{r} \right|

to be continued...
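The key identity in the chain above, x^q - k^q = (x-k)\sum_{r=0}^{q-1} x^{q-r-1}k^r, can be sanity-checked numerically; a Python sketch of my own, not part of the proof:

```python
# Check the telescoping factorization x^q - k^q = (x - k) * sum_{r=0}^{q-1} x^(q-r-1) * k^r
def telescoped(x, k, q):
    return (x - k) * sum(x**(q - r - 1) * k**r for r in range(q))

for q in range(1, 6):
    for x, k in [(1.5, 0.5), (-2.0, 3.0), (0.0, 1.0)]:
        assert abs((x**q - k**q) - telescoped(x, k, q)) < 1e-9
print("factorization verified for q = 1..5")
```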
 
Why would you want to prove it like that? It suffices to show that x^n converges and we are done, and that is easy, if messy, to do rigorously. I've no idea what univariate means, and to be honest, why would you specify over the reals? The same proof works over the complex numbers.
 
Univariate, as opposed to multivariate: of one variable, as opposed to many.
Over the reals, to appeal to our present audience.
 