Let I be an interval in R, let f : I --> R, and let c belong to I. Suppose there exist constants K and L such that |f(x) - L| <= K|x - c| for all x in I. Show that the limit as x --> c of f(x) = L.

I'm looking at this, but I don't know how to start. From what I know, this is very familiar. The sentence "suppose there exist constants K and L such that |f(x) - L| <= K|x - c|" -- doesn't this pretty much imply the statement?

While I am doing practice problems, I always see this form, and as soon as I find a delta and K, it pretty much gives the answer. I feel that I am close, but I don't know how to finish it.

Maybe: let epsilon > 0. The hypothesis already gives the bound |f(x) - L| <= K|x - c| for all x in I, so K is handed to us. Then we choose delta := epsilon/K (or delta := min(1, epsilon/K) if we also want |x - c| bounded by 1),

then, if 0 < |x - c| < delta, we get |f(x) - L| <= K|x - c| < K*delta <= epsilon, and it is proved.
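For what it's worth, here is how I'd write that epsilon-delta argument out cleanly (a sketch; it assumes K > 0, and if K = 0 the hypothesis forces f(x) = L everywhere, so the limit is immediate):

```latex
\begin{proof}
Let $\varepsilon > 0$ and set $\delta := \varepsilon / K$ (assuming $K > 0$;
if $K = 0$ then $f \equiv L$ on $I$ and the claim is trivial).
If $x \in I$ and $0 < |x - c| < \delta$, then
\[
  |f(x) - L| \le K\,|x - c| < K\delta = \varepsilon .
\]
Since $\varepsilon > 0$ was arbitrary, $\lim_{x \to c} f(x) = L$.
\end{proof}
```

Note that no preliminary bound on |x - c| is needed here, because the hypothesis |f(x) - L| <= K|x - c| holds on all of I.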

close? way off?

How about this one:

the limit as x --> c of sqrt(x) = sqrt(c), for c > 0.

How would I do that?

Can I square both sides and then work with |x - c| < epsilon?
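Squaring both sides changes the quantity you need to control, so a cleaner route (a sketch, not necessarily the intended one) is to multiply by the conjugate and reduce to the form of the first problem:

```latex
For $x \ge 0$ and $c > 0$,
\[
  |\sqrt{x} - \sqrt{c}|
  = \frac{|x - c|}{\sqrt{x} + \sqrt{c}}
  \le \frac{1}{\sqrt{c}}\,|x - c| ,
\]
since $\sqrt{x} + \sqrt{c} \ge \sqrt{c}$.
```

So the hypothesis of the first problem holds with f(x) = sqrt(x), L = sqrt(c), and K = 1/sqrt(c), and the limit follows from what was already proved.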

**Physics Forums | Science Articles, Homework Help, Discussion**


# 3 questions prove them
