# Derivatives of Standard Functions

I can't help feeling these days that I don't actually understand where most of the maths I use comes from. Unfortunately, I can't tell whether this is because I didn't take my studies seriously until the end of undergrad, or because these things were never actually taught to me.

One example is the derivatives of the elementary functions I use regularly (trig functions, exponentials, etc.). Let's take the exponential function. It's easy to show from the definition of a derivative that $$\frac{d a^x}{dx} = a^x \lim_{h\rightarrow 0}\frac{a^h -1}{h}$$ (at least for ##a\ne 0##). However, I don't know how to take that limit to complete the proof. So my concrete question is: can one actually complete this proof in a straightforward way, that is, without other specialised knowledge of the function (e.g., its power series definition, its relationship to the logarithm, etc.)?

I would be in the same situation with all the common functions, and the situation extends beyond calculus to other fields such as linear algebra. In linear algebra, for example, it struck me from reading another thread here that to prove that matrices have at least one eigenvalue, one must use the fundamental theorem of algebra, whose proof I didn't see until I took algebraic topology. So my more general question is: do we (most of us?) typically learn mathematics procedurally/operationally and never really know why the things we do are actually valid? Or is it just me?
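For what it's worth, the limit can be probed numerically (a sanity check, not a proof; the helper function and its name below are just illustrative):

```python
import math

def difference_quotient(a: float, h: float) -> float:
    """The quotient (a^h - 1) / h whose limit as h -> 0 appears above."""
    return (a ** h - 1.0) / h

# For shrinking h the quotient settles near math.log(a), hinting that
# the mystery limit is the natural logarithm of a.
for h in (1e-2, 1e-4, 1e-6):
    print(h, difference_quotient(3.0, h))
```

The printed values creep toward ##\ln 3 \approx 1.0986##, which is consistent with the closed form worked out in the replies below.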

PS: I'm aware that I've interspersed a number of different questions and points here; I hope that doesn't cause anyone umbrage!

## Answers and Replies

fresh_42
Mentor
You listed a lot of things which cannot be used. The result, however, is ##\log a##, so how are we allowed to use the logarithm?

> You listed a lot of things which cannot be used. The result, however, is ##\log a##, so how are we allowed to use the logarithm?
Ok, so we might admit (or prove?) that the exponential function has an inverse and choose to call that the logarithm. What would we do next to complete the proof?

Infrared
Gold Member
The usual definition of ##e## is ##e=\lim_{t\to\infty}\left(1+\frac{1}{t}\right)^t.## So the limit you want to compute with ##a=e## is

##\lim_{h\to 0}\lim_{t\to\infty}\frac{\left(1+\frac{1}{t}\right)^{th}-1}{h}.## You can expand with the binomial theorem and take limits to get ##1.##

And once you have that ##\frac{d}{dx}e^x=e^x##, then in general ##\frac{d}{dx}a^x=\frac{d}{dx}e^{x\ln(a)}=\ln(a)a^x.##
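Both steps are easy to sanity-check numerically (an illustrative sketch using a central-difference helper of my own naming, not part of the proof):

```python
import math

def numerical_derivative(f, x: float, h: float = 1e-6) -> float:
    """Central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# d/dx e^x = e^x, checked at x = 1:
print(numerical_derivative(math.exp, 1.0), math.e)

# d/dx a^x = ln(a) * a^x, checked for a = 2 at x = 1.5:
a, x = 2.0, 1.5
print(numerical_derivative(lambda t: a ** t, x), math.log(a) * a ** x)
```

Each printed pair agrees to several decimal places.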

fresh_42
Mentor
I would first prove ##\dfrac{d}{dx}e^x = e^x## and from that ##\dfrac{d}{dx}\log x =\dfrac{1}{x}## by the chain rule. Finally, we get
$$\dfrac{d}{dx} \log y = \dfrac{d}{dx}(x\log a)=\log a= \dfrac{\dfrac{d}{dx} y}{y} \Longrightarrow y'= y\log a = a^x\log a$$
This uses the arithmetic rules of the logarithm and that the exponential function solves ##y'=y\, , \,y(0)=1.##

> The usual definition of ##e## is ##e=\lim_{t\to\infty}\left(1+\frac{1}{t}\right)^t.## So the limit you want to compute with ##a=e## is
>
> ##\lim_{h\to 0}\lim_{t\to\infty}\frac{\left(1+\frac{1}{t}\right)^{th}-1}{h}.## You can expand with the binomial theorem and take limits to get ##1.##
>
> And then once you have that ##\frac{d}{dx}e^x=e^x##, then in general ##\frac{d}{dx}a^x=\frac{d}{dx}e^{x\ln(a)}=\ln(a)a^x.##
And where does that definition of ##e## come from? If I wanted to do the proof from first principles, I would need to first prove that ##\lim_{t\to\infty}\left(1+\frac{1}{t}\right)^t## exists. I would then be happy to call it "e". Following that, I'm not sure what you mean by expanding with the binomial theorem. Do you expand for integer or non-integer exponents ##th##?

> I would first prove ##\dfrac{d}{dx}e^x = e^x##
I don't know how to prove that without invoking theorems I don't know the proof of. See my above reply to Infrared.

fresh_42
Mentor
Mathematics is a discipline where results build on earlier results. If you do not allow earlier results, you will have a long way to go from Zermelo-Fraenkel and the axioms of arithmetic to any kind of differentiation. Otherwise, we will have to use something substantial.

Infrared
Gold Member
There is a binomial theorem valid for non-integer exponents: see https://proofwiki.org/wiki/Binomial_Theorem/General_Binomial_Theorem

It takes a little work to prove it, but it certainly doesn't rely on properties of the exponential function. Alternatively, you could pick integers ##n\leq th\leq n+1##, apply the standard binomial theorem there, and then bound. I'm sure this would work, but I guess it would be a little tedious.
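As an illustration (taking the proofwiki statement on trust, and using a helper name of my own), the general binomial series for ##(1+x)^\alpha## can be summed numerically for ##|x|<1##:

```python
def general_binomial(alpha: float, x: float, terms: int) -> float:
    """Partial sum of sum_{k>=0} C(alpha, k) x^k, the general binomial series."""
    total, coeff = 0.0, 1.0  # coeff starts at C(alpha, 0) = 1
    for k in range(terms):
        total += coeff * x ** k
        coeff *= (alpha - k) / (k + 1)  # C(alpha, k+1) from C(alpha, k)
    return total

# For |x| < 1 the partial sums approach (1 + x)^alpha, even for
# non-integer alpha:
print(general_binomial(0.5, 0.3, 30), (1.0 + 0.3) ** 0.5)
```

For an integer exponent the series terminates and reproduces the ordinary binomial theorem exactly.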

To show that the limit exists, you want to show that ##(1+1/t)^t## is increasing in ##t## and bounded; this is a standard exercise.

Also, if you would be happy using a different definition of ##e##, choosing ##e:=\sum_{n=0}^\infty 1/n!## would probably make your life a bit easier (although it's not too hard to show these definitions are equivalent).
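The two definitions are easy to compare numerically (an illustrative sketch; the function names are mine):

```python
import math

def e_limit(t: float) -> float:
    """Compound-interest definition: (1 + 1/t)^t."""
    return (1.0 + 1.0 / t) ** t

def e_series(n_terms: int) -> float:
    """Partial sum of the series definition: 1/0! + 1/1! + ..."""
    total, term = 0.0, 1.0
    for n in range(n_terms):
        total += term
        term /= n + 1  # next term is 1/(n+1)!
    return total

# (1 + 1/t)^t increases slowly toward e, while the factorial series
# converges after a handful of terms:
for t in (10, 1000, 100000):
    print(t, e_limit(t))
print(e_series(20), math.e)
```

This also illustrates the point about making life easier: twenty terms of the series already match ##e## to machine precision, while ##t=10^5## in the limit definition gives only about five correct digits.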

As far as I can tell from these responses, I was probably never actually taught, via mathematical proof, where these things come from. I imagine it would get even more difficult if I asked how to prove the derivatives of sine and cosine. But it's reassuring to see that, at least for the exponential function, the derivative can be derived using only high-school-level algebra and the definition of a derivative (the proof that the limit defining e exists probably requires somewhat more advanced analysis, but that's OK).

fresh_42
Mentor
Same problem: What are ##\cos## and ##\sin##? There are so many ways to define them. Some are easier to differentiate than others.

> To show that the limit exists, you want to show that ##(1+1/t)^t## is increasing in ##t## and bounded; this is a standard exercise.

Out of curiosity, how would you show that it is bounded? Would you use the generalised binomial theorem again?

> Same problem: What are ##\cos## and ##\sin##? There are so many ways to define them. Some are easier to differentiate than others.
Just to be difficult, I'd define them as the ratio of the relevant two sides of a triangle. That's certainly how they get taught before we learn calculus.

fresh_42
Mentor
> Just to be difficult, I'd define them as the ratio of the relevant two sides of a triangle. That's certainly how they get taught before we learn calculus.
The shortest way in this case is probably to use the unit circle in the complex plane and write them by Euler's formula in terms of the exponential function.

> The shortest way in this case is probably to use the unit circle in the complex plane and write them by Euler's formula in terms of the exponential function.
Sure, although most proofs of Euler's formula use either the power series definition of the exponential function or something similar (https://en.wikipedia.org/wiki/Euler's_formula#Proofs). The proof using polar coordinates (https://en.wikipedia.org/wiki/Euler's_formula#Using_polar_coordinates) could be used if we allow knowledge of how to differentiate the exponential function in the complex plane, which, according to the above posts, would require extending the binomial theorem to the complex plane. Otherwise, we could perhaps derive the power series definition of the exponential function as a corollary of the proof sketches in this thread.

vela
Staff Emeritus
Homework Helper
> I imagine it would get even more difficult if I asked how to prove the derivatives of sine and cosine.
This isn't too bad, actually. You can prove using the squeeze theorem that
$$\lim_{h \to 0}\frac{\sin h}{h} = 1.$$ Then it's straightforward to show that
$$\lim_{h \to 0}\frac{\cos h-1}{h} = 0.$$ Then it's just a matter of using the angle-addition formulas for sine and cosine and the definition of the derivative.
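Numerically, both limits and the resulting derivative behave as described (a sanity check, not a substitute for the squeeze-theorem argument):

```python
import math

# sin(h)/h -> 1 and (cos(h) - 1)/h -> 0 as h -> 0:
for h in (1e-2, 1e-4, 1e-6):
    print(h, math.sin(h) / h, (math.cos(h) - 1.0) / h)

# Combining these with the angle-addition formula gives (sin x)' = cos x;
# the raw difference quotient agrees:
x, h = 0.7, 1e-7
print((math.sin(x + h) - math.sin(x)) / h, math.cos(x))
```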

vela
Staff Emeritus
Homework Helper
> Let's take the exponential function. It's easy to show from the definition of a derivative that $$\frac{d a^x}{dx} = a^x \lim_{h\rightarrow 0}\frac{a^h -1}{h}$$ (at least for ##a\ne 0##). However, I don't know how to take that limit to complete the proof.
My old calculus book defined
$$\log x = \int_1^x \frac{du}{u}.$$ Starting with this definition, you can prove ##\log ab = \log a + \log b## and ##\log a^b = b \log a## where ##b## is rational. From the fundamental theorem of calculus, it follows that ##(\log x)'= \frac 1x##.

Because the log function is one-to-one, there exists an inverse function ##\exp##, and it satisfies ##\exp \log x = x## and ##\log \exp x = x##. By differentiating the latter, it follows that ##(\exp x)' = \exp x##.

Then for ##a>0## and rational ##b##, we have ##a^b = \exp(\log a^b) = \exp(b \log a)##. Up to this point, the book had defined ##a^b## only for rational values of ##b##. Since the right-hand side is defined for all values of ##b##, this relationship gives us an obvious way to define ##a^b## for all values of ##b##. The derivative of ##a^x## then follows from the established properties of ##\exp## and the chain rule.

Finally, defining ##e## to be the value such that ##\log e = 1##, it follows that ##\exp x = e^x##.
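This construction can even be carried out numerically: define log by quadrature, invert it by bisection, and ##e## drops out as the solution of ##\log x = 1## (an illustrative sketch; the function names, grid sizes, and search interval are my own choices):

```python
import math

def log_via_integral(x: float, n: int = 100000) -> float:
    """Approximate log x = integral from 1 to x of du/u with the midpoint rule."""
    width = (x - 1.0) / n
    return sum(width / (1.0 + (i + 0.5) * width) for i in range(n))

def exp_via_bisection(y: float, lo: float = 0.0, hi: float = 100.0) -> float:
    """Invert the integral-defined log: find x with log x = y."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if log_via_integral(mid, 2000) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(log_via_integral(2.0), math.log(2.0))
print(exp_via_bisection(1.0), math.e)  # the x with log x = 1 recovers e
```

Nothing about the exponential function is assumed here; it emerges purely as the inverse of the integral, exactly as in the book's development.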
