Homework Help: Three Problems on Continuity

1. Oct 13, 2012

mtayab1994

1. The problem statement, all variables and given/known data

1- Let f be a continuous function for all real numbers such that :

$$\lim_{x\rightarrow+\infty}f(x)=L$$ and $$\lim_{x\rightarrow-\infty}f(x)=L'$$

and that LL'≤0. Prove that f equals 0 at some point c in ℝ.

2- Let f be a continuous function on [a,b] such that for every (x,x') in ([a,b])^2 with x≠x':

$$|f(x)-f(x')|<k|x-x'|$$

Prove that the equation f(x)=x has only one solution on [a,b].

3-Let f and g be continuous functions on [0,1] such that for every x in [0,1]: f(x)<g(x).

Prove that there exists a number m>0 such that for every x in [0,1]: f(x)+m<g(x).

3. The attempt at a solution

1- I know that since LL'<0 that means that L>L' or L<L'. So the intermediate value theorem states that there exists a number c such that f(c)=0, but I don't know how I'm going to show that.

2- I think I'm supposed to use the definition of a limit to solve it, but I don't know where to start.

3- I have no idea how to start this one. Any help would be much appreciated.

2. Oct 13, 2012

HallsofIvy

That is only saying "L is not equal to L'" which says nothing about their relation to 0. What you meant to say, I believe, was "either L> 0> L' or L'> 0> L".

IF the intermediate value theorem stated that, then that would "show" it. But it doesn't. The intermediate value theorem says that if f(a)>0 and f(b)<0, then there exists an x between a and b such that f(x)=0. What you are given are limits, not values of the function at specific points.

What you can say is that, because $\lim_{x\to\infty} f(x)= L$, if L>0 there exists $x_0$ such that if $x>x_0$ then f(x)>L-1. Do the same with L' and x going to negative infinity.

Take a guess and give it a try. See what you learn from trying.
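The hint above can be sanity-checked numerically. This is just a sketch with an example function of my own choosing (f(x) = −arctan x, which has L = −π/2 and L' = π/2, so LL' < 0): far enough out, f takes the sign of its limit, and bisection then locates the zero the intermediate value theorem promises.

```python
import math

# Example function (my choice, not from the problem): f(x) = -atan(x),
# so L = lim_{x->+inf} f(x) = -pi/2 and L' = lim_{x->-inf} f(x) = pi/2,
# giving L*L' < 0.
def f(x):
    return -math.atan(x)

# Per the hint: far enough out, f has the same sign as its limit.
a, b = -100.0, 100.0
assert f(a) > 0 and f(b) < 0

# The intermediate value theorem guarantees a zero in [a, b];
# bisection narrows the interval while keeping a sign change on it.
for _ in range(200):
    c = (a + b) / 2
    if f(a) * f(c) <= 0:
        b = c
    else:
        a = c

c = (a + b) / 2
print(c, f(c))  # c is (numerically) a point where f vanishes
```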

3. Oct 13, 2012

mtayab1994

Ok, for negative infinity I can say that if L'<0 then there exists $x_1$ such that if $x<x_1$ then f(x)<L'+1. Right?

4. Oct 14, 2012

mtayab1994

Okay for number 3 I did a proof by contradiction and I got:

Let h(x)=f(x)-g(x). We know that since g(x)>f(x) for every x in [0,1], h(x)=f(x)-g(x)<0 on all of [0,1]. Suppose there is a point c in [0,1] such that f(c)>g(c), implying that h(c)=f(c)-g(c)>0, and that's a contradiction because h(x)<0 for every x in [0,1]. But does this imply that f(x)+m<g(x) for every x in [0,1]?

Last edited: Oct 14, 2012
5. Oct 14, 2012

HallsofIvy

For 3 use the fact that every continuous function attains both maximum and minimum values on a closed and bounded interval.
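A numerical sketch of this hint, with example functions f and g of my own (anything continuous with g > f on [0,1] would do): h = g − f attains its minimum m0 on [0,1] by the extreme value theorem, m0 > 0 because h > 0 everywhere, and any m with 0 < m < m0 does the job.

```python
# Example functions (my choice): continuous on [0,1] with g(x) > f(x) everywhere.
def f(x):
    return x * x

def g(x):
    return x * x + 0.3 + 0.1 * x

# Stand-in for the extreme value theorem: approximate the minimum of
# h = g - f on a fine grid over [0,1].
n = 10_001
xs = [i / (n - 1) for i in range(n)]
m0 = min(g(x) - f(x) for x in xs)

m = m0 / 2          # any m with 0 < m < m0 works
assert m > 0
assert all(f(x) + m < g(x) for x in xs)
print(m0, m)
```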

6. Oct 14, 2012

mtayab1994

For number 2 I let g(x)=f(x)-x, which is continuous on the interval I=[a,b] as a difference of two continuous functions, and for every x in I, f(x) is also in I.

Since f(I)⊆I, we have a≤f(a)≤b and a≤f(b)≤b.

That implies g(a)=f(a)-a≥0 and g(b)=f(b)-b≤0, and that implies that g(a)*g(b)≤0.

Hence the IVT states that there exists an x in I such that g(x)=0, i.e. f(x)=x. But I think that I have to show that f is monotone increasing or monotone decreasing for this proof to hold. Am I correct??
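The existence half of this attempt can be sketched numerically; the example f below is my own, chosen so that f maps [0,1] into itself with Lipschitz constant 1/2. The closing comment records why uniqueness comes from the Lipschitz bound rather than from monotonicity (assuming, as presumably intended, that k ≤ 1).

```python
import math

# Example (my choice): f(x) = cos(x)/2 + 0.3 maps [0,1] into [0,1]
# and satisfies |f(x) - f(x')| <= (1/2)|x - x'|  (since |f'| <= 1/2).
def f(x):
    return math.cos(x) / 2 + 0.3

def g(x):            # g(x) = f(x) - x, continuous on [a, b]
    return f(x) - x

a, b = 0.0, 1.0
# Since a <= f(a) and f(b) <= b, we get g(a) >= 0 >= g(b),
# so the IVT gives a point where g vanishes, i.e. a fixed point of f.
assert g(a) >= 0 and g(b) <= 0

# g is strictly decreasing here, so plain bisection on its sign works.
lo, hi = a, b
for _ in range(100):
    mid = (lo + hi) / 2
    if g(mid) >= 0:
        lo = mid
    else:
        hi = mid
x_star = (lo + hi) / 2
print(x_star, f(x_star))  # f(x_star) is (numerically) equal to x_star

# Uniqueness needs no monotonicity: if f(x) = x and f(y) = y with x != y,
# then |x - y| = |f(x) - f(y)| < k|x - y|, which is impossible when k <= 1.
```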