# I'm trying to understand why e works this way

1. Apr 16, 2012

### Nile3

Hello, I just learned that the derivative of e^x is itself, and I was trying a few things to get my head around this. I mean, it's easy to memorize since there's only one such function, but if you look at the amount it takes for the function to reach the value of the tangent, you will discover that it is about 0.69314718055994530942 above the number itself. At 0 the value is 1; when will the value of y be 2? At 0.69315... OK, so the question is, why?

Also, do you know any good books or references to learn a little bit more about e and why it is such a peculiar constant? Could it be that it is our system of rules in mathematics that produces these interesting results, or could they exist in any system of logic?

Thank you very much.

-Francis

2. Apr 16, 2012

e is a number originally defined by the series $1+\frac{1}{1!}+\frac{1}{2!}+\frac{1}{3!}+\frac{1}{4!}+\cdots$. When you find the derivative of $a^x$, you get $a^x\ln a$. If you compute the derivative of $a^x$ from first principles, you arrive at $a^x\cdot\lim_{h\to 0}\frac{a^h-1}{h}$. Evaluating this limit (for instance by expanding $a^h$) gives $\ln a$, the logarithm to the base e, where e is the number defined by the series above.
When you see that the derivative of e^x is e^x, it means that the slope of the tangent at any point of the curve y = e^x equals the y-coordinate of that point.
For books you may try Thomas & Finney or I. A. Maron for one-variable calculus, and Introduction to Calculus and Analysis by Richard Courant.
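Both claims in this post are easy to check numerically; here is a minimal Python sketch (illustrative only) comparing the factorial series for e and the first-principles limit $(a^h-1)/h$ against `math.e` and `math.log`:

```python
import math

# e as the factorial series 1 + 1/1! + 1/2! + 1/3! + ...
e_series = sum(1 / math.factorial(n) for n in range(20))

# First-principles derivative of a^x at x = 0: lim_{h->0} (a^h - 1)/h = ln(a)
a = 2.0
h = 1e-8
limit_estimate = (a ** h - 1) / h

print(e_series)        # close to math.e
print(limit_estimate)  # close to math.log(2) = 0.6931...
```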

3. Apr 16, 2012

### lavinia

Not sure what you mean by this.

4. Apr 16, 2012

### Nile3

I think I phrased it improperly. If you look at the amount it takes on the x axis for the value to double, it is consistently 0.69314... I was wondering what was special about this number.

5. Apr 16, 2012

### SteveL27

The Google knows all. ln(2) = 0.693…

In other words, when does e^x = 2? Taking the natural log of both sides gives x = ln(2), which is 0.693…
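The doubling behaviour the OP noticed holds at every starting point, not just at 0; a small Python sketch (for illustration):

```python
import math

step = math.log(2)  # 0.69314718..., the spacing the OP measured

# e^x doubles whenever x advances by ln(2), wherever you start:
for x in [0.0, 1.0, -3.5, 10.0]:
    assert math.isclose(math.exp(x + step), 2 * math.exp(x))

print(step)
```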

Last edited: Apr 16, 2012
6. Apr 16, 2012

### victor.raum

Nile3,

What book on calculus are you reading that fails to explain why $(e^x)' = e^x$? Any decent book really ought to give at least some basic reasoning.

7. Apr 17, 2012

### Fredrik

Staff Emeritus
I like to think of it this way: Is there a differentiable function f such that f(x+y)=f(x)f(y) for all x,y? If we try to answer this question, we will find that there are infinitely many such functions, and that exactly one of them is such that f'(0)=1. We call this function the exponential function.

We denote the exponential function by exp, and its inverse by log (or ln). Then we use these functions to define $a^x$ for arbitrary a, x. (When x is a positive integer, we can just define $a^x=a\cdot a\cdots a$, but when x is not an integer, we need another definition to make sense of $a^x$.) We define $a^x=\exp(x\log a)$. And finally, we define the number e by e=exp(1), and note that $e^x=\exp(x\log e)=\exp(x\log(\exp(1)))=\exp(x)$.

Note that a function f that satisfies the conditions mentioned above also satisfies f'(x)=f(x)f'(0)=f(x) for all x. (Differentiate f(x+y)=f(x)f(y) with respect to y, and then set y=0 and use the assumption f'(0)=1).
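The identity f'(x) = f(x)f'(0) already holds for any base a, with f'(0) = ln a; only the base e gives f'(0) = 1. A quick numerical sketch in Python, using a central difference as the derivative:

```python
import math

def f(x, a=2.0):
    """f(x) = a^x satisfies f(x+y) = f(x) * f(y)."""
    return a ** x

h = 1e-6
def deriv(x):
    # central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

# f'(x) = f(x) * f'(0); here f'(0) = ln(2), not 1, because the base is 2
for x in [0.0, 1.0, 2.5]:
    assert math.isclose(deriv(x), f(x) * deriv(0.0), rel_tol=1e-5)

print(deriv(0.0))  # close to math.log(2)
```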

Last edited: Apr 17, 2012
8. Apr 17, 2012

### Robert1986

That was a WONDERFUL explanation.

This should be included in EVERY calculus book, though sadly it isn't. It wasn't until analysis that I really understood what was going on.

9. Apr 17, 2012

### HallsofIvy

Many modern Calculus texts start by defining ln(x) by
$$ln(x)= \int_1^x \frac{1}{t}dt$$

You can then derive the various properties of the logarithm, ln(ab)= ln(a)+ ln(b), ln(a^b)= bln(a), etc. from that. In particular, one can show that ln(x) maps the positive real numbers one-to-one onto the set of all real numbers and so has an inverse function that maps the set of all real numbers onto the set of all positive numbers. Define exp(x) to be that inverse function.

One can then use the properties of the ln function to derive the corresponding properties of the exponential function. In particular, since ln(exp(x))= x we have, by the chain rule, that
$$\frac{1}{exp(x)}\frac{d exp(x)}{dx}= 1$$
so that
$$\frac{d exp(x)}{dx}= exp(x)$$.

Another important property is this: since ln(x) is continuous and differentiable for all positive x, we can apply the mean value theorem to the interval [1, 2]. That says that
$$\frac{ln(2)- ln(1)}{2- 1}= (ln(c))'= 1/c$$
for some c between 1 and 2. Since ln(1)= 0 and 2- 1= 1, that says that ln(2)= 1/c for some c between 1 and 2, so that 1/2 < ln(2) < 1.

If y= exp(x), then x= ln(y) so that, if x is not 0, $1= (1/x)ln(y)= ln(y^{1/x})$ and, going back to the exponential, $y^{1/x}= exp(1)$ so that $y= (exp(1))^x$. That is, the inverse function to ln really is a specific number to the x power. We can then define e to be exp(1) (the number y such that ln(y)= 1) and have $y= e^x$ as that inverse function.
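The integral definition above can be tested directly; the sketch below (illustrative, using a crude midpoint rule) builds ln from ∫ dt/t and checks the logarithm law ln(ab) = ln(a) + ln(b):

```python
import math

def ln_by_integral(x, steps=100_000):
    """Approximate ln(x) = integral from 1 to x of dt/t (midpoint rule)."""
    dt = (x - 1) / steps
    return sum(dt / (1 + (i + 0.5) * dt) for i in range(steps))

a, b = 2.0, 3.0
lhs = ln_by_integral(a * b)
rhs = ln_by_integral(a) + ln_by_integral(b)

print(lhs, rhs)  # both close to math.log(6)
```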

10. Apr 17, 2012

### Fredrik

Staff Emeritus
Thank you. I'm glad you liked it. I will provide the details in this post.

Suppose that f(x+y)=f(x)f(y) for all x,y. This assumption implies that for all x, f(x)=f(x)f(0). If f≠0, this implies that there's a y such that f(y)≠0 and f(y)=f(y)f(0). So if f≠0, then f(0)=1. (It might seem that f(0)=0 is also possible, but this would imply that for all x, f(x)=f(x)f(0)=0, and this contradicts f≠0).

Suppose further that f≠0. The above implies that f(0)=1. If there's a y such that f(y)=0, then for all such y, and all x, f(x+y)=f(x)f(y)=0. This would imply that f(x)=0 for all x, which contradicts the assumption that f≠0. So there is no y such that f(y)=0. In other words, for all x, f(x)≠0.

Suppose further that f is differentiable. Our assumptions imply that f'(x)=f(x)f'(0) for all x. (Differentiate f(x+y)=f(x)f(y) with respect to y, and then set y=0).

Let t be an arbitrary real number and define a function g by g(x)=f(tx) for all x. Clearly, g≠0. By the chain rule, g is differentiable, and g'(x)=f'(tx)t for all x. For all x,y, g(x+y)=f(t(x+y))=f(tx+ty)=f(tx)f(ty)=g(x)g(y). So g satisfies all the conditions we have imposed on f. Since t is an arbitrary real number, this means that if there's one function that satisfies those conditions, there are infinitely many. However, if t≠1, then f'(0)≠g'(0). This means that we can narrow our search by only considering functions with a specified value of the first derivative at 0.

Suppose further that f'(0)=1. Then f'(x)=f(x)f'(0)=f(x) for all x. This implies that for all x,
$$\lim_{h\to 0}\frac{f'(x+h)-f'(x)}{h}=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}.$$ So f' is differentiable, and f''=f'. The latter result implies that f''(0)=f'(0)=1. Similarly, for all positive integers n, if the nth derivative of f exists at x, then so does the (n+1)th, and $f^{(n+1)}=f^{(n)}$. By induction, this implies that f is smooth (i.e. that the derivatives of f up to arbitrary order all exist).

This implies that f can be expressed as a Taylor series around 0: For all x,
$$f(x)=\sum_{n=0}^\infty \frac{f^{(n)}(0)}{n!}x^n =\sum_{n=0}^\infty \frac{x^n}{n!}.$$ We can also obtain this result without using Taylor's formula. We just need to know that f can be expressed as a power series (this takes a separate argument; smoothness alone does not guarantee it in general), and that the coefficients in a power series are unique. For all x,
\begin{align} f(x) &=\sum_{n=0}^\infty a_n x^n\\ f'(x) &=\sum_{n=1}^\infty a_n n x^{n-1} =\sum_{n=0}^\infty a_{n+1} (n+1)x^n. \end{align} Since f'=f, matching coefficients gives $a_{n+1}=a_n/(n+1)$ for all non-negative integers n. Since $f(0)=a_0=1$, this formula gives us $a_1=a_0/1=1$, $a_2=a_1/2=1/2!$, and so on, so $a_n=1/n!$.

Note that we have only proved that if there's a non-zero differentiable f such that f(x+y)=f(x)f(y) for all x,y, and f'(0)=1, then that f is given by the power series above. This result implies that there's at most one f with those properties. We still need to prove that such an f exists. A good way to do that is to first prove that $\sum_{n=0}^\infty\frac{x^n}{n!}$ is convergent for all x. Then we define f by $f(x)=\sum_{n=0}^\infty\frac{x^n}{n!}$ for all x, and prove that the f defined this way is non-zero, differentiable, and such that f(x+y)=f(x)f(y) for all x,y.
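The uniqueness argument pins f down to the power series above; a short Python sketch that sums the series via the coefficient recursion $a_{n+1}=a_n/(n+1)$ and checks the functional equation numerically:

```python
import math

def exp_series(x, terms=30):
    """Partial sum of sum_n x^n / n!."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # next term: (x^n/n!) * x/(n+1) = x^{n+1}/(n+1)!
    return total

# The series reproduces f(x+y) = f(x)*f(y) and f(1) = e:
x, y = 0.7, -1.3
assert math.isclose(exp_series(x + y), exp_series(x) * exp_series(y))
assert math.isclose(exp_series(1.0), math.e)

print(exp_series(1.0))
```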

I won't do that right now, but I might return to it later if I'm not too busy.

Edit: What HallsofIvy suggested is another good way to prove existence. It seems to avoid having to deal with convergence of series, but it requires that you understand Riemann integrals.

Last edited: Apr 17, 2012
11. Apr 17, 2012

### Whovian

One more quick thing, using the definition of $e$ found in most algebra textbooks: $\displaystyle e=\lim_{h\to0}\left(1+h\right)^{1/h}$. Substituting this into the definition of the derivative, we get $\displaystyle e^x\cdot \lim_{h\to0}\left(\dfrac{\left(1+h\right)^{h\cdot 1/h}-1}{h}\right)$, and since the limit inside is quite easy to prove equal to 1, the derivative is $e^x$.
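Both limits in this post converge quite visibly as h shrinks; a small Python illustration:

```python
import math

for h in [1e-2, 1e-4, 1e-6]:
    compound = (1 + h) ** (1 / h)   # tends to e
    slope = (math.e ** h - 1) / h   # tends to 1
    print(h, compound, slope)
```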

12. Apr 17, 2012

### lavinia

I found it instructive to define the exponential function as the inverse of the log.

from the equation

x = $\int^{f(x)}_{1}\frac{dt}{t}$ and the Chain Rule one gets

f'(x)/f(x) = 1, so f'(x) = f(x), and by construction, f(0) = 1.

f(1) is defined to be the number e. It is the number that makes the area under 1/t between 1 and e equal to 1.

The same method can be used to show that the exponential map is a homomorphism from the additive real numbers to the multiplicative positive real numbers, that is

e$^{x + y}$ = e$^{x}$e$^{y}$

So .... from above

x + y = $\int^{f(x + y)}_{1}\frac{dt}{t}$

But the derivative with respect to x of

$\int^{f(x)f(y)}_{1}\frac{dt}{t}$ equals 1, so it is equal to x plus a constant. Inspection (setting x = 0) tells you that this constant is

$\int^{f(y)}_{1}\frac{dt}{t}$, which is y.

BTW: The same technique can be used to derive the properties of the natural logarithm using the integral

log(x) = $\int^{x}_{1}\frac{dt}{t}$

For instance, the derivative of log(xy) with respect to x is 1/x, so log(xy) = log(x) + a constant ... and so on
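The last step ("log(xy) = log(x) + a constant") can be sanity-checked numerically; a small Python sketch using a central difference:

```python
import math

y = 5.0
h = 1e-6
for x in [0.5, 2.0, 10.0]:
    # d/dx log(x*y) = 1/x, independent of y...
    d = (math.log((x + h) * y) - math.log((x - h) * y)) / (2 * h)
    assert math.isclose(d, 1 / x, rel_tol=1e-6)
    # ...so log(xy) - log(x) is the constant log(y):
    assert math.isclose(math.log(x * y) - math.log(x), math.log(y))

print("ok")
```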

Last edited: Apr 17, 2012
13. Apr 17, 2012

### hotvette

14. Apr 18, 2012

### lavinia

I apologize for a notational error. f(xy) should read f(x)f(y)

15. Apr 19, 2012

### binomial

Let y = ln($e^x$)

Therefore, $\frac{dy}{dx}$ = $\frac{d}{dx}\left[e^x\right] / e^x$

And since y = ln($e^x$) = x, dy/dx = 1, and multiplying both sides by $e^x$ gives you $e^x = \frac{d}{dx}\left[e^x\right]$

Hope this simple proof helps.

16. Apr 20, 2012

### Whovian

Binomial, one thing. That proof relies on $\displaystyle\log'\left(x\right)=\dfrac1x$. How would you get that? (Most calculus textbooks would rely on that being the definition of log, but still, do you rely on a definition or do you derive that?)

17. Apr 20, 2012

### binomial

There are proofs in virtually every math textbook that derive that derivative. I just assumed he was okay with the derivative of ln(x) being 1/x. Although you are right. The proof I gave does rely on an understanding of the derivative of ln(x). :)

18. Apr 20, 2012

### Fredrik

Staff Emeritus
If you choose an approach like the one suggested by HallsofIvy, which defines the logarithms first, then it makes sense to do it that way. If you define the exponential function first, as in my approach, then it doesn't. I would find the derivative of log as
$$\log'(x)=\left(\exp^{-1}\right)'(x)=\frac{1}{\exp'(\exp^{-1}(x))} =\frac{1}{\exp(\log x)} =\frac{1}{x}.$$ This requires that you already know that exp'=exp.
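The inverse-function rule used here is easy to verify numerically; a minimal Python sketch, approximating exp' by a central difference:

```python
import math

h = 1e-6
def exp_prime(x):
    # central-difference approximation of exp'(x)
    return (math.exp(x + h) - math.exp(x - h)) / (2 * h)

# (exp^{-1})'(x) = 1 / exp'(exp^{-1}(x)) should reproduce log'(x) = 1/x
for x in [0.5, 1.0, math.e, 10.0]:
    assert math.isclose(1 / exp_prime(math.log(x)), 1 / x, rel_tol=1e-6)

print("ok")
```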

19. Apr 20, 2012

### lavinia

The definition is just for convenience. Just find the inverse of the function that is the integral from 1 to x of 1/t. The chain rule shows that the derivative of this inverse function is itself, just as binomial wrote.

20. Apr 20, 2012

### mathwonk

Everyone loves this topic and I am no exception. Thus I try to supplement the excellent answers already given by another similar one. These are notes from my honors freshman calculus course. They cover the integral approach, axiomatic approach, and Taylor series approaches, and equate them all.

Apparently I have already attached these notes to my who wants to be a... thread. I'll see if I can find them there.