- #1
semidevil
Let I be an interval in R, let f: I --> R, and let c belong to I. Suppose there exist constants K and L such that |f(x) - L| <= K|x - c| for all x in I. Show that the limit as x --> c of f(x) is L.
I'm looking at this, but I don't know how to start. This looks very familiar. The hypothesis "suppose there exist constants K and L such that |f(x) - L| <= K|x - c|" -- doesn't this pretty much imply the statement?
While doing practice problems, I always see this form, and as soon as I find a delta and a K, it pretty much gives the answer. I feel that I am close, but I don't know how to finish.
Maybe: let epsilon > 0. Since |f(x) - L| <= K|x - c| holds for all x in I, choose delta := epsilon/K.
Then, if 0 < |x - c| < delta, we get |f(x) - L| <= K|x - c| < K*delta = epsilon, which proves it.
close? way off?
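You're close. One way to write it up, assuming K > 0 (if K = 0, the hypothesis forces f(x) = L for all x in I and the limit is immediate):

```latex
Let $\varepsilon > 0$. Since $K > 0$, set $\delta := \varepsilon / K$.
Then for every $x \in I$ with $0 < |x - c| < \delta$,
\[
  |f(x) - L| \le K|x - c| < K\delta = K \cdot \frac{\varepsilon}{K} = \varepsilon .
\]
Since $\varepsilon > 0$ was arbitrary, $\lim_{x \to c} f(x) = L$.
```

Note that no min/inf with 1 is needed here, because the bound |f(x) - L| <= K|x - c| is assumed on all of I, not just near c.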
How about this one:
lim as x --> c of sqrt(x) = sqrt(c), for c > 0.
How would I do that?
Can I square both sides and work with |x - c| < epsilon?
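Squaring both sides runs the implication in the wrong direction; a cleaner route is to rationalize, which reduces this to the first problem:

```latex
For $x \ge 0$ and $c > 0$,
\[
  |\sqrt{x} - \sqrt{c}|
  = \frac{|x - c|}{\sqrt{x} + \sqrt{c}}
  \le \frac{1}{\sqrt{c}}\,|x - c| ,
\]
so the first result applies with $K = 1/\sqrt{c}$ and $L = \sqrt{c}$:
given $\varepsilon > 0$, take $\delta := \sqrt{c}\,\varepsilon$.
```

The key step is multiplying and dividing by the conjugate $\sqrt{x} + \sqrt{c}$, then dropping $\sqrt{x} \ge 0$ from the denominator.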