# Prove: Aka, need help

1. Mar 22, 2008

### Math_Geek

1. The problem statement, all variables and given/known data
Prove: If f is defined on the reals and continuous at x=0, and if f(x1+x2)=f(x1)+f(x2) for all x1,x2 in the reals, then f is continuous at all x in the reals.

2. Relevant equations

Using defn of limits and continuity

3. The attempt at a solution
Is this like proving that the sum of two continuous functions is continuous? I am a bit confused; this is the last one of the homework, then I can enjoy my Easter.

2. Mar 22, 2008

### morphism

No. f is a function that satisfies the functional equation f(x_1 + x_2) = f(x_1) + f(x_2). You're told that f is continuous at 0, and you're supposed to use this to conclude that f is continuous everywhere.

3. Mar 22, 2008

### Math_Geek

So, since f is continuous at 0, for every epsilon > 0 there exists a delta > 0 such that... and then deal with the two functions?

4. Mar 22, 2008

### morphism

There's just one function here: f. You want to prove that it's continuous (on all of R).

5. Mar 23, 2008

### Math_Geek

ok so how does the f(x1)+f(x2) play into it?

6. Mar 23, 2008

### morphism

You also know something else: f is continuous at 0.

Now use the definition of continuity and these two facts. You might also find it helpful to prove that f(-x) = -f(x) [hint: f(0)=0].
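A short derivation of the hinted facts, using nothing but the functional equation (notation as in the thread):

```latex
Setting $x_1 = x_2 = 0$ in the functional equation gives
\[
f(0) = f(0+0) = f(0) + f(0),
\]
so $f(0) = 0$. Then taking $x_1 = x$ and $x_2 = -x$,
\[
0 = f(0) = f(x + (-x)) = f(x) + f(-x),
\]
hence $f(-x) = -f(x)$ for all $x$.
```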

7. Mar 23, 2008

### tiny-tim

Hint: what is f(x + epsilon) - f(x)?

8. Mar 23, 2008

### HallsofIvy

Staff Emeritus
$$\lim_{x\rightarrow a} f(x)= \lim_{h\rightarrow 0}f(a+ h)$$
where h = x - a. Then use the fact that f(a + h) = f(a) + f(h).
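Putting the hints together, a sketch of the full $\epsilon$-$\delta$ argument (this is one way to assemble it, not necessarily the only one):

```latex
Fix $a \in \mathbb{R}$ and $\epsilon > 0$. Since $f$ is continuous at $0$
and $f(0) = 0$, there is a $\delta > 0$ with
\[
|h| < \delta \implies |f(h) - f(0)| = |f(h)| < \epsilon.
\]
Now suppose $|x - a| < \delta$ and write $x = a + h$ with $h = x - a$. Then
\[
|f(x) - f(a)| = |f(a+h) - f(a)| = |f(a) + f(h) - f(a)| = |f(h)| < \epsilon.
\]
Hence $f$ is continuous at $a$, and since $a$ was arbitrary, $f$ is
continuous on all of $\mathbb{R}$.
```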