## Homework Statement

Suppose the functions *f* and *g* have the following property: for all *E > 0* and all *x*,

if *0 < |x - 2| < sin(E^2/9)^2 + E*, then *|f(x) - 2| < E*;

if *0 < |x - 2| < E^2*, then *|g(x) - 4| < E*.

For each *E > 0*, find a *d > 0* such that, for all *x*,

i) if *0 < |x - 2| < d*, then *|f(x) + g(x) - 6| < E*.

## Homework Equations

N/A, I think.

## The Attempt at a Solution

Well, what I did was look at *|f(x) + g(x) - 6| < E*. Since I was given *|f(x) - 2| < E* and *|g(x) - 4| < E*, the best strategy seemed to be to choose *d* so that each of the expressions involving *f(x)* and *g(x)* would be less than *E/2*. However, since I don't actually know what *f(x)* and *g(x)* **are**, I'm at a loss as to how to do that.

Spivak's solution (since this problem comes from there, ch. 5 #6) says the same thing ("we need... *< E/2*") but then says that this means I need:

*0 < |x - 2| < min(sin(E^2/36)^2 + E/2, E^2/4) = d*

...the logic of which escapes me.
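For what it's worth, here is one way to unpack that step (my own reconstruction, not Spivak's wording, so worth double-checking): apply each given implication with *E/2* substituted for *E*, then take *d* small enough that both hypotheses hold at once:

```latex
\begin{align*}
% f-hypothesis with E/2 in place of E:
0 < |x-2| < \sin^2\!\left(\tfrac{(E/2)^2}{9}\right) + \tfrac{E}{2}
  = \sin^2\!\left(\tfrac{E^2}{36}\right) + \tfrac{E}{2}
  &\;\Longrightarrow\; |f(x)-2| < \tfrac{E}{2},\\
% g-hypothesis with E/2 in place of E:
0 < |x-2| < \left(\tfrac{E}{2}\right)^2 = \tfrac{E^2}{4}
  &\;\Longrightarrow\; |g(x)-4| < \tfrac{E}{2}.
\end{align*}
% Taking d = min of the two bounds makes both hypotheses hold,
% and the triangle inequality finishes the argument:
\[
|f(x)+g(x)-6| \le |f(x)-2| + |g(x)-4| < \tfrac{E}{2} + \tfrac{E}{2} = E.
\]
```

So the *min* is just there to guarantee that *0 < |x - 2| < d* triggers both implications simultaneously.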
