# Functional Equation

## Homework Statement

Suppose a function satisfies the conditions
1. f(x+y) = (f(x) + f(y)) / (1 + f(x)f(y)) for all real x and y
2. f'(0) = 1
3. -1 < f(x) < 1 for all real x
Show that the function is increasing throughout its domain. Then evaluate the limit:
lim_{x -> infinity} f(x)^x

## The Attempt at a Solution

I proceed by putting x = y = 0 in equation 1.
This gives f(0) = 2f(0)/(1 + f(0)^2), whose roots are f(0) ∈ {-1, 0, 1}.
But if f(0) = 1 or f(0) = -1, then setting y = 0 in equation 1 forces f(x) to be the constant function +1 or -1 respectively, violating condition 3.
So f(0) = 0.
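For completeness, the substitution step above can be written out as a short derivation:

```latex
f(0) = \frac{f(0)+f(0)}{1+f(0)^2}
\;\Longrightarrow\; f(0)\bigl(1+f(0)^2\bigr) = 2f(0)
\;\Longrightarrow\; f(0)^3 - f(0) = 0
\;\Longrightarrow\; f(0)\in\{-1,\,0,\,1\}.
```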

From equation 1: I assume 'y' as a constant and differentiate wrt x
f'(x+y) = f'(x)(1 - f(y)^2) / (1 + f(x)f(y))^2
Setting x = 0 and using f'(0) = 1 and f(0) = 0:
f'(y) = 1 - f(y)^2
By condition 3, -1 < f(y) < 1, so f'(y) > 0 for every y; the derivative is always positive and f is increasing.
I have been able to solve the first part of the question, but I couldn't evaluate the limit.


It may be helpful to "cheat" and use the fact that f(x) is really tanh x, to figure out what to do. Then go back and do it without using that fact.

First show lim f(y)=1 as y approaches infinity.

After that, then you find your limit, which has indeterminate form 1^infty, by using natural log and l'Hopital, just like you would do if you knew f was tanh. Unfortunately with f, you don't have all the trig identities at your disposal. Take a stab at it and ask again if you get stuck.
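To see where that log/l'Hopital route should land, here is a numerical sketch, again assuming f(x) = tanh(x) as the hint suggests. Since f(x)^x = exp(x ln f(x)) and ln tanh x behaves like -2e^{-2x} for large x, the exponent x ln f(x) tends to 0, so the limit ought to be 1.

```python
import math

# f(x)^x for the model f = tanh, at increasingly large x
def power(x):
    return math.tanh(x) ** x

vals = [power(x) for x in (5, 10, 15)]

# The values climb monotonically toward 1 from below,
# consistent with lim f(x)^x = 1.
assert vals[0] < vals[1] < vals[2] < 1.0
assert abs(vals[2] - 1.0) < 1e-10
```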

Dick