Find limit L. Then find δ > 0 such that |f(x) - L| < 0.01 (f(x) = x^2 -3)

  • Thread starter: ialink
  • Tags: Limit
ialink

Homework Statement


Find the limit L. Then find δ > 0 such that |f(x) - L| < 0.01 whenever 0 < |x - 2| < δ.

The limit as x approaches 2 of x^2 - 3

Homework Equations


0 < |x - c| < δ  ⇒  |f(x) - L| < ε

The Attempt at a Solution


The function is continuous on D = ℝ, so the limit is f(2) = 2^2 - 3 = 1.

0 < |x - 2| < δ and

0 < |x^2 - 3 - 1| < 0.01 ⇔ 0 < |x^2 - 4| < 0.01 ⇔ 0 < |x + 2|·|x - 2| < 0.01, which is good because then:
0 < |x - 2| < 0.01 / |x + 2|

I've seen the solution, and I see that I'm supposed to assume a range for x (like (1, 3)). I can imagine that, because the function isn't linear, a range has to be assumed.

They say that assuming this range gives δ = 0.01 / 5 = 0.002, which is the smaller of 0.01 / 3 and 0.01 / 5. That makes sense to me.

But why this chosen range? Why does this range apply to ε = 0.01? Choosing a different range gives a different δ.

Who can help me with this? I'm really trying to understand it.
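(A quick numeric check, my own sketch rather than part of the thread: it assumes L = 1, ε = 0.01, and the book's δ = 0.002, and samples points with 0 < |x - 2| < δ to confirm the ε-condition holds.)

```python
def f(x):
    return x**2 - 3

L, eps, delta = 1, 0.01, 0.002

# Sample x strictly inside (2 - delta, 2 + delta), excluding x = 2,
# and record the worst-case |f(x) - L|.
worst = 0.0
for i in range(1, 10001):
    for sign in (-1, 1):
        x = 2 + sign * delta * i / 10001  # 0 < |x - 2| < delta
        worst = max(worst, abs(f(x) - L))

print(worst < eps)  # True: every sampled x satisfies |f(x) - L| < 0.01
```

The worst case sits near the endpoints, where |x + 2|·|x - 2| ≈ 4.002 · 0.002 ≈ 0.008 < 0.01, so δ = 0.002 is comfortably small enough.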
 
hi ialink! :smile:
ialink said:
Find the limit L. Then find δ > 0 such that 0 < |x - 2| < δ ⇒ |f(x) - L| < 0.01

They say that assuming this range gives δ = 0.01 / 5 = 0.002, which is the smaller of 0.01 / 3 and 0.01 / 5. That makes sense to me.

But why this chosen range? Why does this range apply to ε = 0.01? Choosing a different range gives a different δ.

exactly! :smile:

δ depends on ε

(we could write it δ(ε) )

these proofs all involve showing that whatever ε we choose, we can always find a δ :wink:

(usually δ gets smaller and smaller, just like ε)
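(To illustrate that δ shrinks with ε, here's a sketch of my own: it assumes the range x ∈ (1, 3), where |x + 2| < 5, so δ(ε) = ε / 5, and checks the implication for a few shrinking values of ε.)

```python
def f(x):
    return x**2 - 3

# delta(eps) = eps / 5 comes from pre-restricting x to (1, 3), where |x + 2| < 5
for eps in (0.1, 0.01, 0.001):
    delta = eps / 5
    xs = [2 + s * delta * i / 1000 for i in range(1, 1000) for s in (-1, 1)]
    ok = all(abs(f(x) - 1) < eps for x in xs)
    print(eps, delta, ok)  # ok is True for each eps
```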
 
Hey tiny-tim,

If I were to express δ(ε), I'd say δ(ε) = ε / |x + 2|. Calculating δ with this expression is possible on D = ℝ (for every ε and x), so a logical conclusion is that the limit exists. That's what you mean, right?

But ε = 0.01 is given to find the appropriate δ, so something has to be assumed for x in δ(ε) = ε / |x + 2|. They assume x ∈ (1, 3), find δ = (1/5)ε or δ = (1/3)ε, and therefore conclude that δ = (1/5)·0.01 = 0.002. That I don't get. What's the relation between the chosen range (1, 3) and ε = 0.01? Why is that valid?
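(A sketch of my own to illustrate the point in question: the range is just a preliminary restriction on δ, and any such restriction yields a valid, if different, δ. Below I compare δ from assuming x ∈ (1, 3), i.e. δ ≤ 1 so |x + 2| < 5, with δ from assuming x ∈ (0, 4), i.e. δ ≤ 2 so |x + 2| < 6; both candidate deltas are assumptions of mine, and both pass the check. δ need not be unique or maximal.)

```python
def f(x):
    return x**2 - 3

def works(delta, eps, n=5000):
    # check |f(x) - 1| < eps on sample points with 0 < |x - 2| < delta
    return all(abs(f(2 + s * delta * i / n) - 1) < eps
               for i in range(1, n) for s in (-1, 1))

eps = 0.01
d1 = min(1, eps / 5)   # range (1, 3): restrict delta <= 1, so |x + 2| < 5
d2 = min(2, eps / 6)   # range (0, 4): restrict delta <= 2, so |x + 2| < 6
print(d1, works(d1, eps))  # 0.002 True
print(d2, works(d2, eps))  # a smaller but equally valid delta, also True
```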
 