# Exp(x) is its own derivative (proven axiomatically)

1. Dec 16, 2006

### de1irious

How do you show that the exponential function is its own derivative by using the fact that E(x)E(y)=E(x+y)? Don't assume the derivative exists either. You can use any other property of E(x) that you can think of, but you are supposed to rely primarily on the fact above
(that is, without using the obvious power series expansion argument).

I think it must be something so obvious that I am missing it altogether! Thanks for the help.

Oh, and while you're at it, how would I then show that E(x) is convex?

Last edited: Dec 16, 2006
2. Dec 16, 2006

Start with the definition of a derivative; that should give you an idea.

3. Dec 16, 2006

### StatusX

Note that for any positive number a, a^(x+y)=(a^x)(a^y), but only for a=e does this function equal its derivative. So you'll need to use one of the definitions of e. The definition as the limit of (1+1/n)^n should work nicely for this problem.
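As a quick numerical sanity check (a Python sketch of my own, purely illustrative), the limit definition of e that StatusX mentions does settle down on the right value:

```python
# Watch (1 + 1/n)^n approach e = 2.71828... as n grows.
import math

for n in (10, 1000, 100000):
    print(n, (1 + 1/n) ** n)

approx = (1 + 1/10**7) ** 10**7
print(approx, math.e)  # agree to about 6 decimal places
```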

4. Dec 16, 2006

### quasar987

Regarding your second question, there's a characterisation that says "f differentiable is convex <==> f ' is monotonically increasing"

5. Dec 17, 2006

### dextercioby

If $E(x)E(y)=E(x+y)$, then, by taking x=y, it follows that $[E(x)]^{2}=E(2x)$. Differentiate wrt "x" (chain rule on the right-hand side) to get

$$2 E(x) E'(x) = 2\frac{dE(2x)}{d(2x)}$$

which means

$$E(x) E'(x) = \frac{dE(2x)}{d(2x)}$$
$$E(x)E(x) =E(2x)$$

Assume E(x) is not equal to 0 for any x in R. Divide the two equations and you get that

$$\frac{1}{E(x)} \frac{dE(x)}{dx} = \frac{1}{E(2x)} \frac{dE(2x)}{d(2x)} = C$$

, where C is some real constant. From the double equality above it follows that

$$\frac{dE(x)}{dx} =C E(x)$$

qed.

Daniel.
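A finite-difference sketch (Python, my own illustration for the concrete case E = exp) of the two quantities that get equated above: E'(x)/E(x) and (dE(2x)/d(2x))/E(2x) agree at every sampled x, and both sit at the same constant C = 1.

```python
# Check numerically, for E = exp, that E'(x)/E(x) equals
# (dE(2x)/d(2x)) / E(2x) and is the same constant at every x.
import math

def deriv(f, x, h=1e-6):
    # central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

E = math.exp
for x in (-1.0, 0.0, 0.5, 2.0):
    lhs = deriv(E, x) / E(x)
    rhs = deriv(E, 2 * x) / E(2 * x)   # dE(u)/du evaluated at u = 2x
    print(x, lhs, rhs)  # both stay near C = 1
```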

6. Dec 17, 2006

From the definition of the integral (as a Riemann sum) and the property:

$$f(x)f(y)=f(x+y)$$ you get, for the integral of f(x)=exp(x),

$$\sum_{n=0}^{\infty}f(x+nh)h=f(x)\sum_{n=0}^{\infty}f(nh)h=f(x)\sum_{n=0}^{\infty}[f(h)]^{n} h=f(x)\frac{h}{1-f(h)}$$

(note the geometric series converges only when f(h)<1, i.e. for h<0). So the "integral" of the exponential would be the exponential itself multiplied by a constant of value $$\frac{h}{1-f(h)}$$ where h is a very small quantity. Then, if a function and its integral agree up to a constant factor, we conclude that the derivative is also the function multiplied by another constant.
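This Riemann-sum heuristic can be checked numerically (a Python sketch of my own; the step h is taken negative so the geometric ratio f(h) = e^h is below 1 and $\sum r^n = 1/(1-r)$ applies):

```python
# Riemann-type sum  S(x) = sum_{n>=0} f(x + n*h) * h  for f = exp,
# with a small NEGATIVE step h so that the ratio f(h) = e^h < 1.
import math

h = -1e-4
# closed form of the geometric series: sum_n [f(h)]^n * h = h / (1 - f(h))
const = h / (1 - math.exp(h))

for x in (0.0, 1.0, 2.0):
    s = sum(math.exp(x + n * h) * h for n in range(200000))
    print(x, s, const * math.exp(x))  # S(x) matches const * f(x)
```

The truncation at 200000 terms is harmless: the discarded tail involves e^(x-20), which is negligible.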

7. Dec 17, 2006

### Fredrik

Staff Emeritus
I'm sure I'm missing something really simple here, but it doesn't seem trivial that a result of the form "g(x)=g(2x) for all x" implies that g is constant. Of course if I try a specific form of g, like a second degree polynomial, I will find that all coefficients are zero except the constant term, but that doesn't prove that the conclusion holds if g is allowed to be something other than a polynomial. So how do you show that this g must be constant?

8. Dec 18, 2006

### dextercioby

Take the derivative wrt "x" of g(x)=g(2x).

Daniel.

9. Dec 18, 2006

### t!m

Why the 2 on the RHS if you're differentiating wrt '2x'? ...assuming I'm following your notation correctly?

10. Dec 18, 2006

### cristo

Staff Emeritus
Chain rule:
$$\frac{dE(2x)}{dx}=\frac{dE(2x)}{d(2x)}\frac{d(2x)}{dx}=2\frac{dE(2x)}{d(2x)}$$

(He states he's differentiating wrt "x" not "2x." You may have missed that.)

11. Dec 18, 2006

### mathwonk

here is one approach i like:

Theorem: If f(x) is a continuous function defined on all reals such that
1) f(0) = 1,
2) f(x+y) = f(x) f(y), for all reals x,y,
then f(1) = a > 0, and f(x) = a^x for all real x.

Then define g(x) as the integral from t=1 to t=x of dt/t. It follows that g is differentiable with derivative 1/x, that g(1)=0, and, with a little argument using the MVT, that g(xy) = g(x)+g(y). Thus the inverse of g is a differentiable exponential function f with derivative f' = f.

If one defines e as the number such that g(e) = 1, then one gets that f(1)= e, so f(x) = e^x.
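This construction is easy to probe numerically (a Python sketch of my own, with a plain trapezoidal rule standing in for the integral):

```python
# mathwonk's construction: g(x) = integral from 1 to x of dt/t,
# checked against the properties g(1)=0, g(xy)=g(x)+g(y), g(e)=1.
import math

def g(x, steps=100000):
    # trapezoidal rule for the integral of 1/t from 1 to x (x > 0);
    # for x < 1 the step is negative, giving the signed integral.
    step = (x - 1) / steps
    total = 0.5 * (1.0 + 1.0 / x)
    total += sum(1.0 / (1 + k * step) for k in range(1, steps))
    return total * step

print(g(1.0))                     # 0
print(g(6.0), g(2.0) + g(3.0))    # equal: g(xy) = g(x) + g(y)
print(g(math.e))                  # 1, so the inverse function sends 1 to e
```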

12. Dec 18, 2006

### arildno

Apart from mathwonk's reply, I feel that most of the posters ASSUME continuity, and in some cases differentiability, of the exponential.

Personally, I would like to show that there indeed EXISTS a continuous function satisfying the functional equation given by mathwonk, and then work out from there.

However, the simplest way to do this is perhaps to use the power series representation disallowed at the outset..

13. Dec 18, 2006

### mathwonk

My approach of course is to show there exists an invertible differentiable function satisfying the inverse functional equation g(xy) = g(x)+g(y). It then follows from the inverse function theorem that there also exists a differentiable function satisfying the exponential functional equation. So my post is at least logically self-contained, and does not assume either the existence of the exponential or differentiability.

Dieudonné proves there is a unique continuous function g such that g(xy) = g(x)+g(y) and g(a) = 1, if a>0 and a is not 1. Then he takes the inverse function of g as the exponential function. This gives the functional equation but not the differentiability.

From the functional equation he then derives differentiability using an integral representation of a^x: i.e. if h(x) = the integral from 0 to x of a^t dt, he uses the functional equation and a change of variables to deduce that h(x), which is differentiable, is (up to an additive constant) a constant multiple of a^x, which is thus also differentiable.

14. Dec 18, 2006

### mathwonk

Given a positive number a>1 say, one can define a^n for all integers n, and then define a^(n/m) as the mth root of a^n. Then for arbitrary x one can take a^x as the least upper bound of all numbers of the form a^(n/m) where
n/m < x.

This definition makes a^x continuous from below, and increasing. It should follow, with some work, that it is actually continuous, and that it satisfies the functional equation by continuity, since it does on rational arguments.

Then Dieudonné's trick gives differentiability, I guess. So this probably does it as requested.

15. Dec 18, 2006

### Fredrik

Staff Emeritus
It still looks very non-trivial to me. I defined g(x)=E'(x)/E(x). You showed that g(x)=g(2x) for all x. OK, let's take the derivative with respect to x. Now we know that g'(x)=2g'(2x) for all x, but how does that help us? I don't see it.

However, I realized something when I was thinking about it. It seems plausible that a function g that isn't continuous can satisfy these conditions, so we probably have to use that g is continuous everywhere to show that g is a constant.

If I'm right, we have to assume that E' exists everywhere and is continuous everywhere. But even with those assumptions, I don't see how to proceed.
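For instance, here is a concrete nonconstant function satisfying g(x)=g(2x) for every x (my own example, not from the thread). It is continuous everywhere except at x=0, which is exactly where the "g must be constant" argument needs continuity, since g(x) = g(x/2^n) → g(0) for every x when g is continuous at 0:

```python
# A nonconstant g with g(2x) = g(x) for all x: any 1-periodic function
# of log2|x| works (here sin), patched with an arbitrary value at 0.
# g(2x) = sin(2*pi*(log2|x| + 1)) = sin(2*pi*log2|x|) = g(x).
import math

def g(x):
    if x == 0:
        return 0.0
    return math.sin(2 * math.pi * math.log2(abs(x)))

for x in (0.3, 1.0, -2.5, 7.0):
    print(g(x), g(2 * x))  # equal at every x, yet g is not constant
```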

16. Dec 21, 2006

### HallsofIvy

For any positive number a, $a^{x+h}= a^x a^h$, so

$$a^{x+h}- a^x= a^x a^h- a^x= a^x(a^h- 1)$$ so that
$$(a^x)'= \lim_{h\rightarrow 0}\frac{a^{x+h}-a^x}{h}= a^x\left(\lim_{h\rightarrow 0}\frac{a^h- 1}{h}\right)$$

If you prove that
$$\lim_{h\rightarrow 0}\frac{a^h- 1}{h}$$
exists, then you have proved that the derivative of $a^x$ is a constant times $a^x$. Define "e" to be the value of a such that that constant is 1.
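Numerically (a Python sketch of my own), that constant comes out as ln(a), and it is 1 precisely at a = e, matching the definition above:

```python
# Estimate  lim_{h->0} (a^h - 1)/h  for several bases a.
# The values land on ln(a), and equal 1 exactly when a = e.
import math

def slope_at_zero(a, h=1e-8):
    return (a ** h - 1) / h

for a in (2.0, math.e, 3.0):
    print(a, slope_at_zero(a), math.log(a))
```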