Composition goes to 0? If f(x)/x -> 0 and g(x) -> 0, prove f(g(x))/x -> 0

1. Nov 25, 2009

brian44

I've been looking at different math books with analysis problems to get more perspective on how to approach them. I was following along in the "Derivatives" section of Mathematical Thinking by D'Angelo and West, second edition, and arrived at a lemma with a corresponding proof that doesn't make sense to me.

16.5 Lemma: Suppose e is an error function (any function such that $$\lim_{h \rightarrow 0}\frac{e(h)}{h} = 0$$). If $$s(h) \rightarrow 0$$ as $$h \rightarrow 0$$, then the composition $$e \circ s$$ is an error function.

The proof given goes like this: for every $$\epsilon > 0$$ we can choose $$\delta > 0$$ such that $$|t| < \delta \Rightarrow |e(t)| \le |t| \epsilon$$. Therefore $$|e(s(h))| \le |s(h)| \epsilon$$ whenever $$|s(h)| < \delta$$. Since $$s(h) \rightarrow 0$$ we can choose $$\delta '$$ such that $$|h| < \delta ' \Rightarrow |s(h)| < \delta$$, and hence $$|h| < \delta ' \Rightarrow |e(s(h))/h| < \epsilon$$. This proves that $$(e \circ s)(h) / h \rightarrow 0. \blacksquare$$

QUESTION:
However, I don't follow the last step. It seems to me that we only have $$|h| < \delta ' \Rightarrow |e(s(h))/s(h)| \le \epsilon$$; how can we say this implies $$|e(s(h))/h| < \epsilon$$? The only way that would follow is if $$|s(h)| \le |h|$$, but since s(h) is some function of h, I don't see how that could hold universally.

I would greatly appreciate help; it's bugging me that I can't figure it out.

Maybe the proof is wrong and the error was overlooked; if so, is there another way to prove it?

Thanks,
Brian

2. Nov 25, 2009

l'Hôpital

$$|h| < \delta \Rightarrow |e(s(h))/h| < \epsilon$$

Note that,

$$|h| < \delta \Rightarrow |e(s(h))/h | = | e (\delta) / \delta | < \epsilon$$

Which of course implies what was to be proven.

3. Nov 25, 2009

brian44

I don't see why this is so...

$$|h| < \delta \Rightarrow |e(h)/h | < \epsilon$$

for that specific h; it does not imply $$|e(h)/d | < \epsilon$$ for every $$d < \delta$$. Obviously that is false, because we could make d arbitrarily small while fixing e(h). Similarly, $$|h| < \delta$$ and $$|s(h)| < \delta$$ in no way imply they are equal; e.g. define $$s(h) = h/2$$, so $$|s(h)| < \delta$$ whenever $$|h| < \delta$$. In fact, even if f(x) goes to 0, this does not imply $$f(x) \le x$$ for small x: consider sqrt(x) on [0, infinity). sqrt(x) goes to 0 as x goes to 0, but sqrt(x) > x for x < 1!
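A quick numeric check of that last claim (my own sketch, not from the book): sqrt(x) tends to 0 as x -> 0+, yet it exceeds x everywhere on (0, 1), so a function tending to 0 need not be bounded above by x near 0.

```python
import math

# Sketch (my own check, not from the book): sqrt(x) -> 0 as x -> 0+,
# yet sqrt(x) > x for every x in (0, 1).
for x in [0.5, 0.1, 0.01, 1e-4, 1e-8]:
    s = math.sqrt(x)
    assert s > x  # sqrt(x) exceeds x on (0, 1)
    print(f"x = {x:g}: sqrt(x) = {s:g}, sqrt(x)/x = {s/x:g}")
```

In fact the ratio sqrt(x)/x = 1/sqrt(x) blows up as x -> 0+, so sqrt is about as far from an error function as a function tending to 0 can be.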

I might be misunderstanding what you are trying to say; in any case, I fail to see your point.

Note that the problem I am having with the proof is this implication:

$$|h| < \delta ' \Rightarrow |e(s(h))/h| < \epsilon$$

I don't see why it holds; obviously, if it does hold, the proof is finished.

4. Nov 25, 2009

brian44

An idea that may work:

for a particular h (with absolute value less than both delta and delta'): if $$|s(h)| \le |h|$$, then obviously $$|e(s(h))/h| \le |e(s(h))/s(h)| < \epsilon$$. On the other hand, if $$|s(h)| > |h|$$, then since $$e(x) \rightarrow 0$$ as $$x \rightarrow 0$$, if we could say $$|e(h)| \ge |e(s(h))|$$ we would get $$|e(s(h))/h| \le |e(h)/h| < \epsilon$$. However, I am not sure that holds.

5. Nov 26, 2009

brian44

I realized my previous example (using sqrt(x)) gives a counter-example which disproves the given Lemma, i.e. the book is WRONG. I wasted all that time on a stupid error in a book... They probably meant to require that s(h) also be an error function; then it would hold. Just going to 0 isn't enough.
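A sketch of why the repaired statement would hold (my own reasoning, not the book's; assuming s is also an error function and, as in the derivative setting, e(0) = 0): for h with $$s(h) \ne 0$$, factor the ratio

$$\left|\frac{e(s(h))}{h}\right| = \left|\frac{e(s(h))}{s(h)}\right| \cdot \left|\frac{s(h)}{h}\right| \rightarrow 0 \cdot 0 = 0,$$

since $$s(h) \rightarrow 0$$ makes the first factor small (e is an error function) and the second factor tends to 0 because s is one too; and if $$s(h) = 0$$ then $$e(s(h)) = e(0) = 0$$, so those h cause no trouble.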

Counter-example: take $$e(h) = h^2$$ and $$s(h) = \sqrt{h}$$ (or $$\sqrt{|h|}$$ on all of the reals). Then $$e(h)/h = h \rightarrow 0$$ as $$h \rightarrow 0$$, and $$\sqrt{h} \rightarrow 0$$ as $$h \rightarrow 0$$. But $$e(s(h)) = h$$ (or $$|h|$$ on the reals), so $$e(s(h))/h \rightarrow 1$$ (or the limit does not exist on the reals) as $$h \rightarrow 0$$; hence $$e \circ s$$ is not an error function.
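A numeric check of the counter-example (my own sketch): e(h)/h shrinks with h, while e(s(h))/h sticks at +1 for positive h and -1 for negative h.

```python
import math

# Sketch (my own check): e(h) = h^2 is an error function, s(h) = sqrt(|h|)
# tends to 0, yet e(s(h)) = |h|, so e(s(h))/h = |h|/h = sign(h).
e = lambda t: t * t
s = lambda h: math.sqrt(abs(h))

for h in [0.1, 1e-3, 1e-6, -1e-6]:
    print(f"h = {h:g}: e(h)/h = {e(h)/h:g}, e(s(h))/h = {e(s(h))/h:g}")
# e(h)/h -> 0, but e(s(h))/h stays at +1 for h > 0 and -1 for h < 0.
```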