Homework Statement
Suppose a function satisfies the conditions
1. f(x+y) = (f(x) + f(y)) / (1 + f(x)f(y)) for all real x and y
2. f'(0) = 1
3. -1 < f(x) < 1 for all real x
Show that the function is increasing throughout its domain. Then find the value:
$$\lim_{x \to \infty} \left[f(x)\right]^x$$
The Attempt at a Solution
I proceed by putting x = y = 0 in equation 1.
This gives the possible values f(0) ∈ {-1, 0, 1}.
But if f(0) = 1 (or f(0) = -1), then equation 1 with y = 0 forces f(x) to be the constant function +1 (or -1), violating condition 3.
So f(0) = 0.
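Written out, the algebra for this step is
$$f(0) = \frac{2f(0)}{1 + f(0)^2} \;\Rightarrow\; f(0)\left(f(0)^2 - 1\right) = 0 \;\Rightarrow\; f(0) \in \{-1, 0, 1\},$$
and if f(0) = 1, setting y = 0 in equation 1 gives f(x) = (f(x) + 1)/(1 + f(x)) = 1 for every x; the case f(0) = -1 similarly forces f(x) = -1. Both constant functions are ruled out by condition 3.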
From equation 1, I treat y as a constant and differentiate with respect to x:
$$f'(x+y) = \frac{f'(x)\left(1 - f^2(y)\right)}{\left(1 + f(x)f(y)\right)^2}$$
I put x = 0 and use f(0) = 0 and f'(0) = 1.
I get f'(y) = 1 - f^2(y). By condition 3, f^2(y) < 1, so the derivative is always positive and f is increasing.
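Spelled out, the quotient rule applied to equation 1 (with y held fixed) gives
$$f'(x+y) = \frac{f'(x)\left(1 + f(x)f(y)\right) - \left(f(x) + f(y)\right)f'(x)f(y)}{\left(1 + f(x)f(y)\right)^2} = \frac{f'(x)\left(1 - f^2(y)\right)}{\left(1 + f(x)f(y)\right)^2},$$
and setting x = 0:
$$f'(y) = \frac{f'(0)\left(1 - f^2(y)\right)}{\left(1 + f(0)f(y)\right)^2} = 1 - f^2(y) > 0.$$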
I have been able to solve the first part of the question, but I couldn't evaluate the limit
$$\lim_{x \to \infty} \left[f(x)\right]^x.$$
Please help me with the limit part.