Why is the Derivative of e^x e^x?

  • Thread starter: Aki
  • Tags: Derivative, E^x

Summary
The derivative of e^x is e^x itself because it satisfies the differential equation f' = f, which uniquely characterizes exponential functions. This property arises from the definition of the derivative and the exponential function's power series representation. Additionally, the only functions whose derivatives equal themselves are of the form ce^x, where c is a constant, making e^x the unique solution under the initial condition f(0) = 1. The discussion also highlights the relationship between the areas under curves and the behavior of e^x in calculus. Understanding these concepts is fundamental in calculus and differential equations.
  • #31
the only functions which equal their own derivatives are the functions of form ce^x, with c constant. this is proved above in several ways, all ultimately relying on the mean value theorem.

If one also assumes that f(0) = 1, then the only such function is e^x, as also observed above.

If one asks where the constant function f = 0 fits in here, it is ce^x with c = 0.
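For completeness, here is a sketch of the standard argument (it uses only that e^x is its own derivative and never zero): suppose f^\prime = f, and set g(x) = f(x)e^{-x}. Then

g^\prime(x) = f^\prime(x)e^{-x} - f(x)e^{-x} = (f^\prime(x) - f(x))e^{-x} = 0

so by the mean value theorem g is constant, say g = c, and hence f(x) = ce^x.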
 
  • #32
Aki said:
I don't understand why the derivative of e^x is e^x itself.

Hello Aki,

It's been a while since I did this myself, so I am going to do it again, for you as well as for me. The goal is to find a power series in one unknown x, if there is one, whose derivative is equal to itself.

If there is such a power series, we can then reserve the special symbol e^x for the series. So in other words, if you can find a power series whose derivative with respect to x is equal to the original undifferentiated power series, you have the reason that:

\frac{d}{dx} e^x = e^x

The answer would then be verbalized as saying, "There happens to be an analytic function whose derivative is equal to itself, and there is only one such function. Mankind is denoting this function using the symbolism e^x, but in order to find the function we must look for its power series expansion."

Discussion

Suppose that some function f(x) can be expressed as a power series.

f(x) = C_0 + C_1x + C_2x^2 + C_3x^3 + ...

We introduce summation notation to simplify the RHS of the equation above.

f(x) = \sum_{n=0}^{\infty} C_n x^n

Now, take the derivative with respect to x of both sides of the equation above to obtain:

f^\prime (x) = \sum_{n=0}^{\infty} n C_n x^{n-1}

Which is none other than

df/dx = C_1 + 2C_2x + 3C_3x^2 + ... = \sum_{n=1}^{\infty} n C_n x^{n-1} = \sum_{n=0}^{\infty} (n+1) C_{n+1} x^n

Now, take the derivative with respect to x again, to obtain:

f^{\prime\prime}(x) = \sum_{n=0}^{\infty} n(n-1) C_n x^{n-2}

Which is none other than

(d^2f/dx^2) = f^{\prime\prime}(x) = \sum_{n=2}^{\infty} n(n-1) C_n x^{n-2} = \sum_{n=0}^{\infty} (n+2)(n+1) C_{n+2} x^{n}

Now, take the derivative with respect to x again, to obtain

f^{\prime\prime\prime}(x) = \sum_{n=0}^{\infty} n(n-1)(n-2) C_n x^{n-3}

Which is equivalent to:

f^{\prime\prime\prime}(x) = \sum_{n=3}^{\infty} n(n-1)(n-2) C_n x^{n-3} = \sum_{n=0}^{\infty} (n+3)(n+2)(n+1) C_{n+3} x^{n}

At this point, you have enough information to find a simple formula for the unknown constants in terms of n. Look at the formula for the third derivative of f(x). Suppose that x =0 in that formula. You thus have:

f^{\prime\prime\prime}(0) = \sum_{n=0}^{\infty} (n+3)(n+2)(n+1) C_{n+3} 0^{n}

Keeping in mind that 0^0 = 1, 0^1 = 0, 0^2 = 0, etc., you should now see that you can deduce that:

f^{\prime\prime\prime}(0) = 3*2*1*C_3

And if you check the second derivative of f(x), evaluated at x=0, you will find that:

f^{\prime\prime}(0) = 2*1*C_2

And if you check the first derivative of f(x), evaluated at x=0, you will find that:

f^\prime(0) = 1*C_1

And of course

f(0) = C_0

Thus, we have the following formula for C_n:

C_n = \frac{f^{(n)}(0)}{n!}

Where

n! = n(n-1)(n-2)(n-3)...*2*1

We can now re-write the formula for f(x), using the formula for C_n. We have:

f(x) = \sum_{n=0}^{\infty} C_n x^n = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n
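As an independent check of this coefficient formula (an illustration, not part of the original derivation): for f(x) = 1/(1-x) one has f^{(n)}(0) = n!, so every C_n is 1 and the series is the geometric series. A minimal Python sketch comparing partial sums built from the formula against the closed form:

```python
import math

def taylor_partial_sum(deriv_at_0, x, terms):
    # Evaluate the partial sum of  sum_n f^(n)(0)/n! * x^n.
    return sum(deriv_at_0(n) / math.factorial(n) * x ** n for n in range(terms))

# For f(x) = 1/(1-x), the nth derivative at 0 is n!, so every C_n is 1.
geom_deriv = lambda n: math.factorial(n)

x = 0.5
approx = taylor_partial_sum(geom_deriv, x, 40)
print(abs(approx - 1 / (1 - x)) < 1e-9)  # True: the series recovers 1/(1-x) for |x| < 1
```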

We now have only to answer the question, "is there or isn't there a power series whose derivative is equal to itself?" Suppose that there is. Let f(x) denote the series. Therefore we must have:

f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n = f^\prime (x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} n x^{n-1}

So, if there is a function f(x), whose derivative with respect to x is equal to f(x), then the power series expansion of f(x) must be such that:

\sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} n x^{n-1} = \sum_{n=1}^{\infty} \frac{f^{(n)}(0)}{n!} n x^{n-1} = \sum_{n=0}^{\infty} \frac{f^{(n+1)}(0)}{n!} x^n

Thus, we can see that the statement above is true, provided that:

f^{(n)}(0) = f^{(n+1)}(0)

Thus, the constraint for there to be a function f(x) that is equal to its derivative is that its nth derivative evaluated at x=0 must equal its (n+1)th derivative evaluated at x=0. This isn't enlightening enough, so let us write things more explicitly. We must have:

\frac{d^nf}{dx^n}\bigg|_{x=0} = \frac{d^{n+1}f}{dx^{n+1}}\bigg|_{x=0}

Now, since we are looking for a power series expression for f(x), we have:

f(x) = C_0 + C_1x + C_2x^2 + C_3x^3 + C_4x^4+ ...
f^\prime(x) = C_1 + 2C_2x + 3C_3x^2 + 4C_4x^3+ ...
f^{\prime\prime}(x) = 2C_2 + 3*2C_3x + 4*3C_4x^2+ ...
f^{\prime\prime\prime}(x) = 3*2C_3 + 4*3*2C_4x+ ...
f^{\prime\prime\prime\prime}(x) = 4*3*2C_4+ ...

and so on.

And when the expressions above are evaluated at x=0, only the first term in each of them remains, and that first term is a constant. And we know that the constraint dictates that the (n+1)th derivative evaluated at x=0 must equal the nth derivative evaluated at x=0, hence the constraint is:

C_0 = C_1 = 2C_2 = 3*2C_3 = 4*3*2*1C_4 and so on.

Suppose that C_0 = 1. In that case, we must have:

1 = C_1 = 2C_2 = 3*2C_3 = 4*3*2*1C_4

So the formula we need for C_n, in order to have a power series which is equal to its derivative, is:

C_n = \frac{1}{n!}

Because it will now follow that:

1 = C_1 = 2/2! = 3*2*1/3! = 4*3*2*1/4! =5*4*3*2*1/5!

And the above equations are all true.
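The chain of equalities above amounts to the recurrence n C_n = C_{n-1}; a quick numeric sketch (illustrative only) confirms that C_n = 1/n! satisfies it:

```python
import math

# The constraint C_0 = C_1 = 2*C_2 = 3*2*C_3 = ... says n * C_n = C_(n-1).
coeffs = [1 / math.factorial(n) for n in range(12)]

ok = all(abs(n * coeffs[n] - coeffs[n - 1]) < 1e-12 for n in range(1, 12))
print(ok)  # True: C_n = 1/n! satisfies the recurrence
```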

Thus, a power series which is equal to its own derivative is:

\sum_{n=0}^{\infty} \frac{x^n}{n!} = 1+x+x^2/2!+x^3/3!+x^4/4!+ \cdots

Which we can check directly:

\frac{d}{dx} \sum_{n=0}^{\infty} \frac{x^n}{n!} = \sum_{n=0}^{\infty} n \frac{x^{n-1}}{n!} = \sum_{n=1}^{\infty} n \frac{x^{n-1}}{n!} = \sum_{n=1}^{\infty} \frac{x^{n-1}}{(n-1)!} = \sum_{n=0}^{\infty} \frac{x^n}{n!}

QED

And this is not the only function which is equal to its own derivative, since multiplying it by any constant yields another function equal to its own derivative.

Let us thus define e^x as follows:

e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}

It will follow that e^x = \frac{de^x}{dx}

It will also follow that for any constant B we have:

Be^x = \frac{dBe^x}{dx}

And hence there is a class of functions (not just one, as I said in the beginning) that are equal to their own derivative. The class is given by:

f(x) = B\sum_{n=0}^{\infty} \frac{x^n}{n!}

Where B is an arbitrary constant.
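A numeric sketch of this conclusion (illustrative only; it uses Python's math.exp as the reference value for e^x):

```python
import math

def exp_series(x, terms=40):
    # Partial sum of the series  sum_n x^n / n!  defined above.
    return sum(x ** n / math.factorial(n) for n in range(terms))

# The series matches e^x, and B times it is still equal to its own
# derivative, since differentiation is linear.
B = 3.7  # an arbitrary constant, as in the text
for x in (0.0, 1.0, 2.5, -1.0):
    assert abs(B * exp_series(x) - B * math.exp(x)) < 1e-9
print(round(exp_series(1.0), 9))  # 2.718281828
```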

Regards,

Guru
 
  • #33
this is very beautiful, and no doubt guided the early fathers of calculus. as such it is likely helpful here as well.

however, it does not quite respond to the question by modern standards of proof. for one thing it assumes your power series defines a differentiable function, to get existence of a solution of f' = f, this way.

one can of course prove this, e.g. by showing the local uniform convergence of both the series and the derived series. you still have then a little way to go to relate this series to the exponential function you somewhat cavalierly redefine.

i.e. then you should prove that any solution of the equation f' = f, and f(0) = 1, does satisfy the relation f(x+y) = f(x)f(y), which then does make it an exponential function.


this is not hard as follows: let a be given and consider the function g(x) = f(a+x)/f(x).

then g'(x) = [f'(a+x)f(x) - f(a+x)f'(x)]/f^2(x) = [f(a+x)f(x) - f(a+x)f(x)]/f^2(x) = 0 (using f' = f), so f(a+x)/f(x) is a constant, which for x=0 equals f(a), so f(a+x) = f(x)f(a), at least whenever f(x) ≠ 0.

Aha! But both sides of the equation are continuous, and hence it suffices to show that they are equal on a dense set. But since by your definition f is an analytic function, its set of zeroes is isolated, hence nowhere dense. Thus the equation holds everywhere, and then it follows that f(x) is never zero, or else it would always be zero, and yet we have assumed that f(0) = 1.
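The functional equation f(a+x) = f(a)f(x) is easy to sanity-check numerically; the sketch below uses Python's math.exp as the solution of f' = f, f(0) = 1 (purely as an illustration):

```python
import math

# Any solution of f' = f with f(0) = 1 satisfies f(a + x) = f(a) * f(x);
# math.exp is such a solution, so the identity holds to rounding error.
checks = [
    math.isclose(math.exp(a + x), math.exp(a) * math.exp(x), rel_tol=1e-12)
    for a in (0.5, 1.0, -2.0)
    for x in (0.25, 3.0, -1.5)
]
print(all(checks))  # True
```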
 
  • #34
wow Physicsguru, thank you, I'm sure you put lots of time in answering my question. Thanks
 
  • #35
remember aki also to ask yourself, when does a series actually define a function? when does it define a differentiable function? if so, can the derivative be obtained by formal differentiation of the series?

but i admit these questions about series maybe did not much concern the early and great mathematicians like euler, who pioneered their use.
 
  • #36
I'd like to point out the domain of this problem is assumed to be the field of real or complex numbers. It need not be so in a more general case.
 
