# Introduction / Summary of Differentiation

In summary, the conversation discusses the concept of differentiation and its applications in finding derivatives, gradients, and tangents of functions. It also introduces various rules for differentiating functions, such as the power rule, constant coefficient rule, and sum rule. The conversation assumes that the reader has some knowledge of limits and how to implement them.
Hootenanny
As the title suggests, this thread is intended as a summary of differentiation; it is by no means an attempt at an exhaustive discussion of the topic. A detailed knowledge of limits is not required but is useful; it is assumed that the reader has some knowledge of limits and how to implement them. As always, I welcome comments and corrections, either here or via PM.

The Definition of a Derivative

The derivative ($f'(x)$) of a function $f(x)$ with respect to $x$ is given by the limit;

$$f'(x)=\lim_{h\to0}\;\frac{f(x+h)-f(x)}{h}$$

Provided the limit exists. If the limit exists at a point $x_0$ we say that the function is differentiable at this point.

As can be seen from the plots below, as $h$ approaches zero, the secants of the curve tend to the tangent at a point $x$; this gives the gradient of the curve at the point $x$ (provided the function is differentiable at $x$). Note, however, that in the above expression we cannot simply set $h$ to zero directly, as both the numerator and denominator would then vanish, leaving the undefined form $0/0$; this is precisely why a limit is needed. We must therefore manipulate the limit into a form where this problem does not occur.

Images taken from Wikipedia

Example
Find the derivative of $f(x)= x^2+2$
Solution;
$$f'(x)=\lim_{h\to0}\;\frac{f(x+h)-f(x)}{h}\;\;\rightarrow \lim_{h\to0}\;\frac{(x+h)^2+2-(x^2+2)}{h}$$
$$=\lim_{h\to0}\;\frac{\not{x^2}+2hx+h^2 +\not{2}-\not{x^2}-\not{2}}{h}= \lim_{h\to0}\;\frac{\not{h}(2x+h)}{\not{h}}$$
$$\therefore f'(x)=2x$$​
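As a quick sanity check, the difference quotient can be evaluated numerically for shrinking $h$. This is just a sketch in plain Python (the helper names are my own, not part of the tutorial):

```python
# Numerically evaluate the difference quotient (f(x+h) - f(x)) / h
# for f(x) = x^2 + 2 and shrinking h; it should approach f'(x) = 2x.

def f(x):
    return x**2 + 2

def difference_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

x = 3.0
for h in (1e-1, 1e-3, 1e-6):
    print(h, difference_quotient(f, x, h))  # tends towards 2 * 3.0 = 6
```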

Finding the Gradient and Tangent to a Curve at a Point
We already know that the derivative of a function gives the gradient of a curve at any point where it is differentiable. This allows us to find the gradient of the function at any point $x$ where the curve is differentiable, and hence the equation of the tangent to the curve at any such point.

Example
Find the equation of the tangent to the curve with equation $y=f(x)=x^2+2$ at x=3.
Solution;
We know that the tangent is a straight line and therefore has an equation of the form $y=mx+c$. We have already found the derivative above, which gives the gradient ($m$) at any point $x$ on the curve $y=f(x)$ (provided $f(x)$ is differentiable at $x$). Therefore we can say that;

$$m=\left. \frac{dy}{dx}\;\;\right|_{x=3}=2\cdot3=6$$

We can find the corresponding y value to x=3 using the original equation $y=x^2+2$;
$$y=3^2+2=11$$
We can then use this information to find the equation of the tangent at point $(x,y)=(3,11)$;
$$y=mx+c \rightarrow 11=6\cdot 3 + c \Rightarrow c = -7$$
$\therefore$ the equation of the tangent to the curve $y=x^2+2$ at the point $(x,y)=(3,11)$ is $y=6x-7$.​
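The tangent construction above can be sketched in a few lines of Python; `tangent_line` is a hypothetical helper of my own, shown only to mirror the working:

```python
def f(x):
    return x**2 + 2

def fprime(x):
    return 2 * x  # derivative found earlier

def tangent_line(x0):
    """Return (m, c) for the tangent y = m*x + c to y = f(x) at x0."""
    m = fprime(x0)
    c = f(x0) - m * x0  # since the tangent passes through (x0, f(x0))
    return m, c

m, c = tangent_line(3.0)  # m = 6.0, c = -7.0, i.e. y = 6x - 7
```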

Rules of Differentiation

Using the definition of a derivative every time we wish to differentiate a function becomes tedious after a while. It is therefore desirable to derive a series of rules which allows us to differentiate a variety of functions without having to refer to the above definition.

1. Derivative of a Constant Function
If $f$ has a constant value $f(x)=c$, then;

$$f'(x)=\lim_{h\to0}\frac{f(x+h)-f(x)}{h}=\lim_{h\to0}\frac{c-c}{h}=0$$

$$\therefore {\color{red}\boxed{{\color{black}\frac{d}{dx}\;(c) = 0}}}$$

2. Power Rule for Positive Integers
If $f(x)=x^{n}$ and $n\in\mathbb{Z},\;n>0$ then;

$$f'(x)=\lim_{h\to0}\frac{f(x+h)-f(x)}{h}=\lim_{h\to0}\frac{(x+h)^{n}-x^{n}}{h}$$
Since n is a positive integer we can use the binomial theorem to expand the parentheses;

$$=\lim_{h\to0}\frac{\left\{\not{x^n} + nx^{n-1}h + \frac{n(n-1)}{2!}\cdot x^{n-2}h^{2} + \ldots + nxh^{n-1}+h^{n}\right\}-\not{x^{n}}}{h}$$

$$=\lim_{h\to0}\frac{nx^{n-1}h + \frac{n(n-1)}{2!}\cdot x^{n-2}h^{2} + \ldots + nxh^{n-1}+h^{n}}{h}$$

One $h$ from each term in the numerator will cancel with the $h$ in the denominator, leaving;

$$=\lim_{h\to0}\; nx^{n-1} + \frac{n(n-1)}{2!}\cdot x^{n-2}h + \ldots + nxh^{n-2}+h^{n-1}$$

As $h\to0$ all terms except the first drop out leaving;

$$f'(x)=nx^{n-1}$$

$$\therefore {\color{red}\boxed{{\color{black}\frac{d}{dx}\;x^{n} = nx^{n-1}\;\;\; n\in\mathbb{Z},\;n>0}}}$$

Note that this can be shown to be true for all integers (which will be done later).
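A numeric spot-check of the power rule, under the assumption that a forward difference quotient with small $h$ approximates the derivative well:

```python
# Check d/dx x^n = n * x^(n-1) for a positive integer n at a sample point.

def diff_quotient(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

n, x = 5, 2.0
approx = diff_quotient(lambda t: t**n, x)
exact = n * x**(n - 1)  # 5 * 2^4 = 80
```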

3. Constant Coefficient Rule
If $f$ is a differentiable function of $x$ and $c$ is constant then;

$$f'(x)=\lim_{h\to0}\frac{cf(x+h)-cf(x)}{h} \stackrel{Limit\;Property}{=} c\cdot\lim_{h\to0}\frac{f(x+h)-f(x)}{h}$$

Since the limit on the RHS of the equality is the definition of the derivative, we can say that;

$${\color{red}\boxed{{\color{black}\frac{d}{dx}\;(cf(x)) = c\cdot\frac{df}{dx}}}}$$

4. The Derivative of a Sum
If we take two functions of $x$; $u$ and $v$, both of which are differentiable then;

$$\frac{d}{dx} \left(u(x)+v(x)\right) = \lim_{h\to0}\frac{(u(x+h)+v(x+h))-(u(x)+v(x))}{h}$$

We can then split this into two fractions;

$$=\lim_{h\to0}\left\{ \frac{u(x+h)-u(x)}{h}+\frac{v(x+h)-v(x)}{h}\right\}$$

$$=\lim_{h\to0}\frac{u(x+h)-u(x)}{h} + \lim_{h\to0}\frac{v(x+h)-v(x)}{h} = \frac{du}{dx}+\frac{dv}{dx}$$

$$\therefore {\color{red}\boxed{{\color{black}\frac{d}{dx}\;(u+v)=\frac{du}{dx}+\frac{dv}{dx}}}}$$

The sum rule can be shown to hold for differences (i.e. $(u-v)$) by combining the sum rule with the constant coefficient rule and setting $c=-1$.
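The sum rule can likewise be checked numerically; note that the difference quotient of $u+v$ splits exactly into the two separate quotients, even before the limit is taken. A plain-Python sketch (helper names my own):

```python
# The difference quotient of u + v equals the sum of the individual
# difference quotients of u and v, mirroring the sum rule above.

def u(x):
    return x**3

def v(x):
    return 5 * x

def diff_quotient(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

x = 2.0
lhs = diff_quotient(lambda t: u(t) + v(t), x)
rhs = diff_quotient(u, x) + diff_quotient(v, x)
# Both approximate d/dx (x^3 + 5x) = 3x^2 + 5 = 17 at x = 2
```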

5. Product Rule
Suppose we have two functions of $x$ - $u(x)$ and $v(x)$ both differentiable at $x$ - and we wish to find the derivative of their product, then from our definition of the derivative we can write;

$$\frac{d}{dx}\;(u\cdot v) = \lim_{h\to0}\frac{u(x+h)\cdot v(x+h)-u(x)\cdot v(x)}{h}$$

The next step may appear nonsensical initially, but it is necessary so that we can evaluate the limits of $u$ and $v$ separately: we subtract and then add $u(x+h)\cdot v(x)$ in the numerator (the inserted terms are highlighted in blue);

$$=\lim_{h\to0}\frac{u(x+h)\cdot v(x+h) {\color{blue}-u(x+h)\cdot v(x) +u(x+h)\cdot v(x)} -u(x)\cdot v(x)}{h}$$
$$=\lim_{h\to0}\left\{ \frac{u(x+h)\cdot v(x+h) {\color{blue}-u(x+h)\cdot v(x)}}{h}+\frac{{\color{blue}u(x+h)\cdot v(x)} - u(x)\cdot v(x)}{h}\right\}$$

We can now take a factor of $u(x+h)$ from the first ratio and a factor of $v(x)$ from the second ratio, giving;

$$=\lim_{h\to0}\left\{ u(x+h)\cdot\frac{v(x+h)-v(x)}{h}+v(x)\cdot\frac{u(x+h)-u(x)}{h}\right\} = \lim_{h\to0}\; u(x+h)\cdot \lim_{h\to0}\frac{v(x+h)-v(x)}{h}+\lim_{h\to0}\; v(x)\cdot \lim_{h\to0}\frac{u(x+h)-u(x)}{h}$$

Since we stated that both $u$ and $v$ are differentiable at $x$ this implies that both are continuous* at $x$, therefore as $h$ approaches zero, $u(x+h)$ approaches $u(x)$. Therefore, the above limits become what is known as the product rule;

$${\color{red}\boxed{{\color{black}\frac{d}{dx}\;(u\cdot v)= u\cdot\frac{dv}{dx}+v\cdot\frac{du}{dx}}}}$$

Example
Find;
$$\frac{d}{dx}\left\{ (x^4+4)\cdot (x^2-1)\right\}$$

Solution;
Let $u(x):=x^4+4$ and $v(x):=x^2-1$. First we must find $u'(x)$ and $v'(x)$; for this we can use the power rule and the constant coefficient rule (I will not show all the steps here since the application of the rules is simple), which gives;

$$\frac{du}{dx} = 4x^3 \;\;\;\;\;\;\;\; \frac{dv}{dx} = 2x$$

Therefore, using the product rule we obtain;

$$\frac{d}{dx}\left\{ (x^4+4)\cdot (x^2-1)\right\} =u\frac{dv}{dx}+v\frac{du}{dx}= (x^4+4)\cdot 2x + (x^2-1)\cdot 4x^3 = 2x(3x^4 - 2x^2 + 4)$$
$$\hline$$
*To prove this would require a discussion of limits, which I have already said I will not do here. If you wish to see a proof of this, then I will be happy to oblige via PM but a knowledge of limits is required beforehand.
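Returning to the product rule example, the result can be checked against a difference quotient of the product itself. A hedged sketch in plain Python (function names are my own):

```python
# Verify the product rule output for u = x^4 + 4, v = x^2 - 1 against a
# difference quotient taken directly on the product u*v.

def u(x): return x**4 + 4
def du(x): return 4 * x**3
def v(x): return x**2 - 1
def dv(x): return 2 * x

def product_rule(x):
    return u(x) * dv(x) + v(x) * du(x)

def numeric(x, h=1e-6):
    return (u(x + h) * v(x + h) - u(x) * v(x)) / h

x = 1.5
# product_rule(1.5) and numeric(1.5) should agree closely
```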

6. The Quotient Rule
Suppose we again have two functions of $x$ ($u(x)$ and $v(x)$), both of which are differentiable at $x$ and $v(x)\neq0$. What if we wish to find the derivative of the ratio of the two functions?

$$\frac{d}{dx}\left(\frac{u(x)}{v(x)}\right) = \lim_{h\to0}\frac{\frac{u(x+h)}{v(x+h)}-\frac{u(x)}{v(x)}}{h}$$

$$=\lim_{h\to0}\frac{v(x)\cdot u(x+h) - u(x)\cdot v(x+h)}{h\cdot v(x+h)\cdot v(x)}$$

Again, here we must take the seemingly nonsensical step of adding and subtracting $u(x)\cdot v(x)$ from the numerator. However, this will allow us to write two difference ratios for the derivatives of $u(x)$ and $v(x)$ (again this step is highlighted in blue);

$$=\lim_{h\to0}\frac{v(x)\cdot u(x+h) {\color{blue}-u(x)\cdot v(x) + u(x)\cdot v(x)} -u(x)\cdot v(x+h)}{h\cdot v(x+h)\cdot v(x)}$$

$$=\lim_{h\to0}\frac{v(x)\cdot{\color{red}\frac{u(x+h)-u(x)}{h}}-u(x)\cdot{\color{red}\frac{v(x+h)-v(x)}{h}}}{v(x+h)\cdot v(x)}$$

Notice that, as $h\to0$, the two ratios highlighted in red tend to the derivatives of the functions $u(x)$ and $v(x)$ respectively. Notice also that as $h\to0$, $v(x+h)\cdot v(x) \to (v(x))^2$. Therefore when we evaluate our limit we obtain;

$${\color{red}\boxed{{\color{black}\frac{d}{dx}\left(\frac{u}{v}\right)=\frac{v\frac{du}{dx}-u\frac{dv}{dx}}{v^2}}}}$$

Example
Find;
$$\frac{d}{dx}\left(\frac{x^2+1}{x^2-1}\right)$$

Solution;
Let $u(x)=x^2+1$ and $v(x)=x^2-1$, then,

$$\frac{du}{dx}=2x \;\;\;\;\;\; \frac{dv}{dx}=2x\;\;\;\;\;\; (v(x))^2 = (x^2-1)^2$$

$$\Rightarrow \frac{d}{dx}\frac{x^2+1}{x^2-1} = \frac{2x\cdot(x^2-1) - 2x\cdot(x^2+1)}{(x^2-1)^2} = -\frac{4x}{(x^2-1)^2}$$
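The thread later settles this example's answer as $-4x/(x^2-1)^2$ (see the correction post further down); as a sketch, that result can be spot-checked numerically, staying away from the poles at $x=\pm1$:

```python
# Compare the quotient-rule answer -4x / (x^2 - 1)^2 with a difference
# quotient taken directly on the ratio u/v.

def u(x): return x**2 + 1
def v(x): return x**2 - 1

def formula(x):
    return -4 * x / (x**2 - 1)**2

def numeric(x, h=1e-6):
    return (u(x + h) / v(x + h) - u(x) / v(x)) / h

x = 2.0  # safely away from the poles at x = 1 and x = -1
```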

7. Power Rule for Negative Integers
Suppose we have $x^n$, where $n \in \mathbb{Z},\; n<0$ and $n=-m$, where $m \in \mathbb{Z},\; m>0$. Hence we can say that;

$$x^n=x^{-m}=\frac{1}{x^m}$$

Therefore, using the quotient rule (and positive power rule) we can say that;

$$\frac{d}{dx}\;x^n = \frac{d}{dx}\;\frac{1}{x^m} = \frac{x^m\cdot\frac{d}{dx}(1) - 1\cdot\frac{d}{dx}\;x^m}{(x^m)^2}$$

$$=\frac{0-mx^{m-1}}{x^{2m}} = -mx^{-m-1} \stackrel{n=-m}{=}nx^{n-1}$$

This result is identical to the power rule for positive integers; we can therefore generalise the power rule thus;

$$\therefore {\color{red}\boxed{{\color{black}\frac{d}{dx}\;x^{n} = nx^{n-1}\;\;\; n\in\mathbb{Z}}}}$$

I do not think an example is necessary here.
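Even so, a quick numeric spot-check costs little; the same difference-quotient sketch as before, applied to a negative power:

```python
# d/dx x^(-3) at x = 2 should be -3 * 2^(-4) = -0.1875.

def diff_quotient(f, x, h=1e-7):
    return (f(x + h) - f(x)) / h

approx = diff_quotient(lambda t: t**-3, 2.0)
```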

Can you continue proving the Power rule for all real numbers using logarithmic differentiation.

minase said:
Can you continue proving the Power rule for all real numbers using logarithmic differentiation.
Hi Minase,
Thank you for your comments. I am planning to discuss the definition and differentiation of logarithmic and exponential functions later in the tutorial. Prior to this I will consider the chain rule, implicit differentiation and differentiation of trigonometric functions.

8. The Chain Rule

Note; This proof requires knowledge of linear approximations and differentials. For those who have not met linearization or differentials before, I will provide an external link to a 'rough proof' later, and you may skip the following section.

$$\hrule$$​

The chain rule is used to find the derivative of a composite function (in other words a function of a function).

Provided $f$ is differentiable at $\delta$, we can show that the respective change in $f$ as $\delta$ changes to $\delta+\Delta x$ is given by;

$$\underbrace{f(\delta+\Delta x)-f(\delta)}_{\text{Change in }f}=\underbrace{f'(\delta)\Delta x}_{\text{linear approximation}}+\underbrace{\epsilon\Delta x}_{\text{error}}$$
Where $\epsilon\to0$ as $\Delta x \to 0$.

Let $u=g(x)$ be differentiable at $x$ and $f(u)$ be differentiable at $g(x)$. Also, let $\Delta x, \Delta y, \Delta u$ be the respective changes in $x, y, u$. Then from the above equation we form the following system;

$$\Delta u = g'(x)\Delta x + \epsilon_{0}\Delta x$$
$$\Delta y = f'(u)\Delta u + \epsilon_{1}\Delta u$$

Substituting $\Delta u$ into $\Delta y$ gives;

$$\Delta y = f'(u)\left\{ g'(x)\Delta x + \epsilon_{0}\Delta x\right\} + \epsilon_{1}\left\{ g'(x)\Delta x + \epsilon_{0}\Delta x\right\}$$

Expanding the parentheses;

$$\Delta y = f'(u)g'(x)\Delta x + f'(u)\epsilon_{0}\Delta x + \epsilon_{1}g'(x)\Delta x + \epsilon_{1}\epsilon_{0}\Delta x$$

Dividing throughout by $\Delta x$;

$$\frac{\Delta y}{\Delta x} = f'(u)g'(x) + f'(u)\epsilon_{0} + \epsilon_{1}g'(x) + \epsilon_{1}\epsilon_{0}$$

Since we want to find the derivative at a point we let $\Delta x\to0$ (recall that $\epsilon\to0$ as $\Delta x\to0$), thus obtaining;

$$\lim_{\Delta x\to0}\;\left( f'(u)g'(x) + f'(u)\epsilon_{0} + \epsilon_{1}g'(x) + \epsilon_{1}\epsilon_{0}\right)$$

$$=f'(u)g'(x) \stackrel{u=g(x)}{=} f'(g(x))\cdot g'(x)$$

$$\therefore{\color{red}\boxed{{\color{black}\frac{d}{dx}\;(f\circ g)(x) = \frac{d}{dx}\; f(g(x)) = f'(g(x))\cdot g'(x) = \frac{dy}{du}\cdot\frac{du}{dx} }}}$$

$${\color{blue}\hrule}$$

http://web.mit.edu/wwmath/calculus/differentiation/chain-proof.html
from World Web Math, MIT
This 'proof' is intended for those who have not met differentials and linearization previously, so that they can see that the chain rule has some grounding. This should by no means be considered a correct formal proof.

$${\color{blue}\hrule}$$​

Example;
Find;
$$\frac{d}{dx}\;(2x^2+x)^3$$

Solution;
Using Leibniz's notation, let $u=2x^2+x$, therefore we have;

$$\frac{d}{dx}\;(2x^2+x)^3 = \left\{ \frac{d}{du} u^3 \right\} \cdot \left\{ \frac{d}{dx} (2x^2+x) \right\}$$

$$=3\cdot u^2 \cdot (4x+1) = 3\cdot(2x^2+x)^2\cdot(4x+1)$$
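The chain rule result above can be spot-checked numerically; a plain-Python sketch with helper names of my own:

```python
# Compare 3 * u^2 * u' (with u = 2x^2 + x) against a difference quotient
# taken directly on (2x^2 + x)^3.

def inner(x):
    return 2 * x**2 + x  # u(x)

def chain_formula(x):
    return 3 * inner(x)**2 * (4 * x + 1)  # 3u^2 * du/dx

def numeric(x, h=1e-6):
    return (inner(x + h)**3 - inner(x)**3) / h

x = 1.0  # chain_formula(1.0) = 3 * 9 * 5 = 135
```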
$$\hrule$$

Note that the chain rule can be used multiple times to find a derivative.

A correction in post number four. The formula should read;

$${\color{red}\boxed{{\color{black}\frac{d}{dx}\left (\frac{u}{v}\right)=\frac{v\frac{du}{dx}-u\frac{dv}{dx}}{v^2}}}}$$

As an aside it can also be expressed, perhaps more succinctly, thus;

$${\color{red}\boxed{{\color{black}\frac{d}{dx}\left (\frac{u}{v}\right)=\frac{v\cdot u' - u\cdot v'}{v^2}}}}$$

Implicit Differentiation
Up until this point we have always considered functions which can be written explicitly in the form $y=ax^n + bx^{n-1}+\ldots$; these are known as explicit functions. However, some functions or equations cannot be written explicitly; instead they must be defined in terms of a relationship between two variables, and these are known as implicit functions. An example of an implicit function would be that of a unit circle; $x^2 + y^2 =1$. On occasion it may be possible to write such equations in an explicit form; however, it is often preferable to differentiate implicit functions implicitly, as they are.

When we differentiate implicitly we differentiate the implicit function with respect to the desired variable (x for example); we then treat other variables (y for example) as unknown functions of x and differentiate them accordingly using the chain rule. There is no formula to learn here (except the chain rule obviously), it is simply a technique you must practise.

Example;
Find an expression for the gradient of the unit circle at any given point (x,y)

Solution;

The equation for the unit circle in Cartesian coordinates is given by $x^2 + y^2 =1$ . Now, to find the gradient we must differentiate both sides of the equation with respect to x;

$$\frac{d}{dx}\left( x^2 + y^2 \right) = \frac{d}{dx}\;1$$

Now, the derivative of $x^2$ is a simple application of the power rule; the derivative of 1 is again a simple application of the constant rule, so that leaves us with;

$$2x + \frac{d}{dx}\; y^2 = 0$$

Now, since y is some (unknown) function of x; we can apply the chain rule here. (Refresh your knowledge of the chain rule if required) We therefore obtain;

$$2x + 2y\cdot\frac{dy}{dx} = 0$$

$$\therefore \;\;\; \frac{dy}{dx} = -\frac{x}{y}$$

A typo in my previous post. The penultimate line in the example should read as follows;

$$2x + {\color{red}2}\cdot y\cdot\frac{dy}{dx} = 0$$

And therefore the final line will read;

$$\therefore \;\;\; m = \frac{dy}{dx} = -\frac{x}{y}$$

Apologies for any confusion.
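With the corrected factor of 2, the gradient $-x/y$ can be checked against the explicit upper branch $y=\sqrt{1-x^2}$ of the circle; a sketch (helper names my own):

```python
import math

def gradient_implicit(x, y):
    # dy/dx = -x / y from the (corrected) implicit differentiation
    return -x / y

def gradient_explicit(x, h=1e-7):
    # Differentiate the explicit upper branch y = sqrt(1 - x^2) numerically
    f = lambda t: math.sqrt(1 - t * t)
    return (f(x + h) - f(x)) / h

x = 0.6
y = math.sqrt(1 - x * x)  # 0.8, a point on the upper half of the circle
```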

In the example given for the quotient rule the answer should be, I hope:-

$$\frac{d}{dx}\left(\frac{x^2+1}{x^2-1}\right)$$

$$\Rightarrow \frac{d}{dx}\frac{x^2+1}{x^2-1} = \frac{2x\cdot(x^2-1) - 2x\cdot(x^2+1)}{(x^2-1)^2} = -\frac{4x}{(x^2-1)^2}$$

You have no idea how long I puzzled over your answer before I realized that it must be wrong.

Hootenanny said:
Note however, that in the above limit we cannot simply set $h$ as zero directly as our limit would $\to\pm\infty$ and thus, the limit would not exist.
this bit is bugging me for a few reasons.

firstly as setting h = 0 would make numerator as well as denominator zero, we've no need for the intuitive idea that the expression becomes infinity.

secondly the bolded limit might be better replaced with expression and the arrow removed, as setting to zero, rather than limiting, makes it no longer a limit and means ideas like +/- infinity are out of place

an explanation of why the differential is undefined at h = 0 and the need for a limit as a consequence of this might be better

i saw a reply to my point immediately above, but then it was deleted.

?

kesh said:
this bit is bugging me for a few reasons.

firstly as setting h = 0 would make numerator as well as denominator zero, we've no need for the intuitive idea that the expression becomes infinity.

secondly the bolded limit might be better replaced with expression and the arrow removed, as setting to zero, rather than limiting, makes it no longer a limit and means ideas like +/- infinity are out of place

an explanation of why the differential is undefined at h = 0 and the need for a limit as a consequence of this might be better
Thank you for your incisive comments; you make a valid point. I intend, perhaps sometime after Christmas, to release a 'final' or corrected version of the tutorial in pdf format with additional examples. My intention in posting the 'initial' version on the forums was that members would, as you have done, comment on the work, suggest improvements and identify flaws. I thank you again for your useful comments.

P.S. I removed my previous post as I felt that I didn't give your post sufficient attention.

Schrodinger's Dog said:
In the example given for the quotient rule the answer should be, I hope:-

$$\frac{d}{dx}\left(\frac{x^2+1}{x^2-1}\right)$$

$$\Rightarrow \frac{d}{dx}\frac{x^2+1}{x^2-1} = \frac{2x\cdot(x^2-1) - 2x\cdot(x^2+1)}{(x^2-1)^2} = -\frac{4x}{(x^2-1)^2}$$

You have no idea how long I puzzled over your answer before I realized that it must be wrong.
You are of course correct Schrodinger, thank you for pointing the error out. I should stop composing posts so late in the evening and start using Maple to check my solutions.

Hootenanny said:
Thank you for your incisive comments; you make a valid point. I intend, perhaps sometime after Christmas, to release a 'final' or corrected version of the tutorial in pdf format with additional examples. My intention in posting the 'initial' version on the forums was that members would, as you have done, comment on the work, suggest improvements and identify flaws. I thank you again for your useful comments.

P.S. I removed my previous post as I felt that I didn't give your post sufficient attention.
i thought it might be something like that. with all the fuss about 0/0 in the blogsphere i thought it might be nice to show off an area where it's dealt with usefully and rigorously, namely differentiation :)

Differentiation of Trigonometric Functions
Trigonometric functions have many important applications in science, especially physics. Therefore, it is important that we are able to differentiate equations involving trigonometric functions. Here I intend to illustrate how to differentiate trigonometric functions from first principles and present a series of 'rules' which I would recommend committing to memory if you are going to be dealing with trigonometric functions on a regular basis.

1. Derivative of Sine
For no particular reason I will begin with the derivative of the sine function. From our definition of the derivative we have;

$$f'(x) = \lim_{h\to0}\frac{f(x+h)-f(x)}{h}$$

$$f(x):=\sin(x) \Rightarrow f'(x) = \lim_{h\to0}\frac{\sin(x+h)-\sin(x)}{h}$$

Applying the angle sum identity ${\color{blue}\sin(A+B) = \sin(A)\cos(B)+\cos(A)\sin(B)}$, we obtain;

$$f'(x) = \lim_{h\to0} \frac{\left( \sin(x)\cos(h)+\cos(x)\sin(h) \right)-\sin(x)}{h}$$

Taking a factor of $\sin(x)$ out we have;

$$f'(x) = \lim_{h\to0} \frac{\sin(x)\left( \cos(h) - 1 \right) + \cos(x)\sin(h)}{h}$$

Splitting the limit into two separate limits and rewriting the fractions;

$$f'(x) = \lim_{h\to0} \left( \sin(x)\cdot\frac{\cos(h) - 1}{h} \right) + \lim_{h\to0} \left( \cos(x)\cdot\frac{\sin(h)}{h} \right)$$

$$f'(x) = \sin(x)\cdot\lim_{h\to0} \left(\frac{\cos(h) - 1}{h} \right) + \cos(x) \cdot\lim_{h\to0} \left(\frac{\sin(h)}{h} \right)$$

I state without proof that the following is true. I may prove them at a later time, but for the moment they should be accepted as truths. ($\theta$ in radians)

$${\color{blue}\boxed{\hspace{1cm}(1)\;\;\lim_{\theta\to0}\frac{\sin\theta}{\theta}= 1 \hspace{1cm}(2)\;\;\lim_{h\to0}\frac{\cos(h)-1}{h}=0\hspace{1cm}}}$$
Note that (2) follows directly from (1)

$$\therefore f'(x) = \sin(x)\cdot0 + \cos(x)\cdot1$$

$${\color{red}\boxed{{\color{black}f'(x) = \cos(x)}}}$$

As has been alluded to previously, the derivative of a function gives the gradient of that function at any point where the function is differentiable. Therefore, the cosine function represents the rate of change of the sine function. This animation illustrates the relationship nicely: http://faraday.physics.utoronto.ca/PVB/Harrison/Flash/TrigDiff/TrigDiff.html (my thanks for bringing this resource to my attention).
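A small numerical sketch of the result that the derivative of sine is cosine (plain Python, my own helper name):

```python
import math

def numeric_derivative(f, x, h=1e-7):
    return (f(x + h) - f(x)) / h

# The derivative of sin should match cos at arbitrary sample points
for x in (0.0, 0.5, 1.2, 2.0):
    print(x, numeric_derivative(math.sin, x), math.cos(x))
```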

2. Derivative of Cosine
From the definition of the derivative we have;

$$f(x):=\cos(x)$$

$$f'(x) = \lim_{h\to0}\frac{\cos(x+h) - \cos(x)}{h}$$

Invoking the angle sum formula; $\color{blue}\cos(A+B) = \cos(A)\cos(B) - \sin(A)\sin(B)$; (and splitting the two limits) we obtain;

$$f'(x) = \lim_{h\to0}\left(\frac{\cos(x)\cos(h)-\cos(x)}{h}\right)-\lim_{h\to0}\left(\frac{\sin(x)\sin(h)}{h}\right)$$

$$= \cos(x)\cdot\lim_{h\to0}\left(\frac{\cos(h)-1}{h}\right)-\sin(x)\cdot\lim_{h\to0}\left(\frac{\sin(h)}{h}\right)$$

As stated above*;

$${\color{blue}\boxed{\hspace{1cm}(1)\;\;\lim_{\theta\to0}\frac{\sin\theta}{\theta}= 1 \hspace{1cm}(2)\;\;\lim_{h\to0}\frac{\cos(h)-1}{h}=0\hspace{1cm}}}$$

Thus;

$$f'(x) = \cos(x)\cdot0 - \sin(x)\cdot1$$

$$\therefore {\color{red}\boxed{{\color{black}f'(x) = -\sin(x)}}}$$

$$\hline$$
* The blue box has been corrected in this post to include the omitted value of the second limit, and corrected such that the first limit occurs as $\theta\to0$, NOT $a\to0$

Can't wait for you to release the final version of this. It's nice how you took the steps to illustrate how you got the results.

But when you are deriving sine [and cosine]:

$${\color{blue}\boxed{\hspace{1cm}(1)\;\;\lim_{\theta\to0}\frac{\sin\theta}{\theta}= 1 \hspace{1cm}(2)\;\;\lim_{h\to0}\frac{\cos(h)-1}{h}\hspace{1cm}}}$$

I wish you would indicate why this is so.

btw, the 4th line of latex text [for cosine] needs reformatting.

Umm, speaking of Maple, I've gotten an account on the website, but where do I get the actual program? It only has supplementary tutorials. I always had the impression it was one program that you open up and you could do calculations in...

Oh, btw ranger, he corrected the blue box in the post just before yours.

And, have you seen $\sin x \approx x$ for small $x$ before? I've seen multiple geometric proofs for this, which would answer your question. Though not actually a proof, it can be seen that the tangent to $\sin x$ at $x=0$ has a gradient of 1, meaning the tangent has the equation $y=x$. Near the tangent point, and therefore near zero, the values of $\sin x$ are approximately those of the tangent, leading us to $\sin x \approx x$ for small $x$, in radians. Dividing both sides by $x$ achieves what you want.

A correction in post number two, involving the derivation of the power rule for positive integers. The derivation is correct, however, there is a typo in the final result.

The following;

$$\therefore {\color{red}\boxed{{\color{black}\frac{d}{dx}\;n^{ n} = nx^{n-1}\;\;\; n\in\mathbb{Z},\;n>0}}}$$

Should instead read;

$$\therefore {\color{red}\boxed{{\color{black}\frac{d}{dx}\;x^{ n} = nx^{n-1}\;\;\; n\in\mathbb{Z},\;n>0}}}$$

My thanks to theperthvan for bringing it to my attention.

Nice tutorial

I thought I should let you know though, that unless I am more tired than I think right now, you have forgotten a factor right before the punchline in deducing the derivative of a product.

Ace1013 said:
Nice tutorial

I thought I should let you know though, that unless I am more tired than I think right now, you have forgotten a factor right before the punchline in deducing the derivative of a product.
No, you're not too tired, but I must have been asleep while typing it up; you are of course correct. The penultimate line of the proof for the product rule should read;

$$=\lim_{h\to0}\left\{ u(x+h)\cdot\frac{v(x+h)-v(x)}{h}+v(x)\cdot\frac{u(x+h)-u(x)}{h}\right\} = \lim_{h\to0}\; u(x+h)\cdot \lim_{h\to0}\frac{v(x+h)-v(x)}{h}+\lim_{h\to0}\; v(x)\cdot \lim_{h\to0}\frac{u(x+h)-u(x)}{h}$$

Thank you Ace1013

$$\hline$$
CONTRIBUTION
$$\hline$$
Apologies to those who have been following this thread; as you may have noticed, I have not posted here for a considerable period of time. Unfortunately, my commitments have prevented me from adding more to this tutorial. I will still be posting here at PF (most days), but I simply do not have the time to write extended posts. I would therefore like to open up this thread to contributions from others.

However, could I request that, in order to maintain some form of logical order and integrity, those wishing to contribute to this thread contact me via PM before contributing.

Contributed by https://www.physicsforums.com/member.php?u=15685

With respect to:

$$\hspace{1cm}(1)\;\;\lim_{\theta\to0}\frac{\sin \theta}{\theta}= 1 \hspace{1cm}(2)\;\;\lim_{h\to0}\frac{\cos(h)-1}{h}=0$$

Taylor Series
One can consider the Taylor series expansion of $\sin(x)$ about $x=0$;

$$\sin(x)\,=\,x\,-\,\frac{x^3}{3!}\,+\,\frac{x^5}{5!}\,-\,\cdots$$

then

$$\frac{\sin(x)}{x}\,=\,1\,-\,\frac{x^2}{3!}\,+\,\frac{x^4}{5!}\,-\,\cdots$$

As $x \to 0$, all terms containing powers of $x$ tend to $0$, leaving the limit equal to $1$. Similarly,

$$\cos(x)\,=\,1\,-\,\frac{x^2}{2!}\,+\,\frac{x^4}{4!}\,-\,\cdots$$

One can treat $(\cos(x)-1)/x$ similarly.
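Both limits can be watched numerically, consistent with the truncated series above; a small sketch:

```python
import math

# Watch sin(x)/x -> 1 and (cos(x) - 1)/x -> 0 as x -> 0.
for x in (1e-1, 1e-3, 1e-5):
    print(x, math.sin(x) / x, (math.cos(x) - 1) / x)
```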

Squeeze Theorem
Alternatively, if you are familiar with limits (in particular the squeeze theorem), it may be interesting to consider the following diagram;

http://www.davesbrain.ca

Noting that;

area of triangle OAP < area of sector OAP< area of triangle OAT

or

$$\frac{1}{2}\sin(\theta) < \frac{1}{2}\theta < \frac{1}{2}\tan(\theta)$$

where $\theta$ is in radians. Next we divide through by $\frac{1}{2}\sin(\theta)$;

$$1<\frac{\theta}{\sin(\theta)}<\frac{1}{\cos(\theta)}$$

which, on taking reciprocals (which reverses the inequalities), can be written as;

$$\cos(\theta)<\frac{\sin(\theta)}{\theta}<1$$

[note that $\sin(\theta)> 0$ for $\theta\in\left(0,\pi/2\right)$];

By the squeeze theorem, since $\lim_{\theta\to0^+}\cos(\theta) = 1$ then we have;

$$\lim_{\theta\to0^+}\frac{\sin(\theta)}{\theta} = 1$$

Since $\sin(\theta)$ and $\theta$ are both odd functions, $(\sin\theta)/\theta$ is an even function and hence symmetric about the y-axis. This implies that the limit from above (right) is equal to the limit from below (left); hence,

$$\lim_{\theta\to0^-}\frac{\sin(\theta)}{\theta} = 1 = \lim_{\theta\to0^+}\frac{\sin(\theta)}{\theta}$$

$$\lim_{\theta\to0}\frac{\sin(\theta)}{\theta} = 1$$

One can also obtain the same result from l'Hopital's rule.

My sincere thanks to both Astronuc and Dave for their contributions to this thread

$${\color{grey}\boxed{\text{Contributed by Cristo and arildno}}}$$

Power Rule for Rational Powers
Suppose we have $x^{\frac{m}{n}},$ with $m,n \in \mathbb{Z}.$ In order to take the derivative of this function, we introduce the new variable $y=x^{\frac{m}{n}} \hspace{5mm}(1)$ and so we seek to find $\frac{dy}{dx}$. Let's raise each side of equation (1) to the power n, so that we are now dealing with only integer powers. This enables us to utilise the power rule for integer powers that we derived earlier. So, we have $y^n=(x^{\frac{m}{n}})^n=x^m \hspace{5mm}(2)$​

Now, we can take the derivative of the left hand side using the chain rule and the power rule for integer powers:​

$$\frac{d}{dx}y^n=ny^{n-1}\frac{dy}{dx}$$​

and the right hand side using just the power rule​

$$\frac{d}{dx}x^m=mx^{m-1}.$$​

Thus, equation (2) becomes​

$$ny^{n-1}\frac{dy}{dx}=mx^{m-1}$$​

$$\Rightarrow\frac{dy}{dx}=\frac{mx^{m-1}}{ny^{n-1}}$$​

We recall from above that $y=x^{\frac{m}{n}}$ and so we obtain

$$\frac{d}{dx}x^{\frac{m}{n}}=\frac{mx^{m-1}}{n(x^{\frac{m}{n}})^{n-1}}=\frac{mx^{m-1}}{nx^{m-\frac{m}{n}}}=\frac{m}{n}x^{m-1-(m-\frac{m}{n})}=\frac{m}{n}x^{\frac{m}{n}-1}$$

And so we have obtained the result that
$${\color{red}\boxed{{\color{black}\frac{d}{dx}x^{\frac{m}{n}}=\frac{m}{n}x^{\frac{m}{n}-1}}}}$$​
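A numeric spot-check of the rational power rule at a sample point (a sketch; the helper is my own):

```python
# d/dx x^(3/2) at x = 4 should be (3/2) * 4^(1/2) = 3.0.

def diff_quotient(f, x, h=1e-7):
    return (f(x + h) - f(x)) / h

approx = diff_quotient(lambda t: t**1.5, 4.0)
```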

$$\hline$$

As a side note, there is no need to use the chain rule here.

Assuming continuity of the power functions, I'll show the case for positive unit fractions. The general case is easy to derive.

1. Let $f(x)=x^{\frac{1}{n}},\; n\in\mathbb{N}$

2. We are to find, if possible, the limit:

$$\lim_{h\to{0}}\frac{(x+h)^{\frac{1}{n}}-x^{\frac{1}{n}}}{h}$$

3. We multiply the above fraction by:

$$1=\frac{\sum_{i=0}^{n-1}((x+h)^{\frac{1}{n}})^{(n-1-i)}(x^{\frac{1}{n}})^{i}}{\sum_{i=0}^{n-1}((x+h)^{\frac{1}{n}})^{(n-1-i)}(x^{\frac{1}{n}})^{i}}$$

4. This reduces the fraction to:

$$\frac{((x+h)^{\frac{1}{n}})^{n}-(x^{\frac{1}{n}})^{n}}{h\sum_{i=0}^{n-1}((x+h)^{\frac{1}{n}})^{(n-1-i)}(x^{\frac{1}{n}})^{i}}=\frac{1}{\sum_{i=0}^{n-1}((x+h)^{\frac{1}{n}})^{(n-1-i)}(x^{\frac{1}{n}})^{i}}$$

5. Since the power functions were assumed continuous, we see that as $h$ tends to 0, the expression above tends to:

$$\frac{1}{nx^{\frac{n-1}{n}}}=\frac{1}{n}x^{(\frac{1}{n}-1)}$$

as we should have.

My sincere thanks to both cristo and arildno for their contributions to this thread

Derivative of Arcsine
I didn't read anything, here is something to add to the collection :)

For a right triangle with hypotenuse $c = 1$:

http://www.clarku.edu/~djoyce/trig/right.gif
[I got this picture off the internet; hopefully it is always available :)]

$$\sin(\sin^{-1}(a))= a$$

We take the derivative of both sides:

$$\cos(\sin^{-1}(a)) \frac {d}{da} [\sin^{-1}(a)] = 1$$

$$\frac {d}{da} [\sin^{-1}(a)] = \frac {1} {\cos(\sin^{-1}(a))}$$

$$\cos(\sin^{-1}(a)) = b = \sqrt{1-a^2}$$

hence

$$\frac {d}{da} [\sin^{-1}(a)] = \frac {1} {\sqrt{1-a^2}}$$

Or perhaps more concisely,

$$y = \arcsin x$$
$$x=\sin y$$
$$\frac{dx}{dy}=\cos y = \sqrt{1-\sin^2 y} = \sqrt{1-x^2}$$
$$\frac{dy}{dx}=\frac{1}{\sqrt{1-x^2}}$$

[Back to the derivatives of trigonometric functions]

3. Derivative of Tangent
Finding the derivative of $f(x) = \tan(x)$ is a fairly trivial application of the quotient rule (or product rule if you prefer). However, I will still present the derivation in a fair amount of detail, if only to provide a further example of using the quotient rule.

From the definition of the tangent;

$$f(x):=\tan(x) = \frac{\sin(x)}{\cos(x)}$$

By the quotient rule we can write;

$$f^\prime(x) = \frac{\left[\cos(x)\right]\cdot\left[\sin(x)\right]^\prime - \left[\sin(x)\right]\cdot\left[\cos(x)\right]^\prime}{\cos^2(x)}$$

$$f^\prime(x) = \frac{\left[\cos(x)\right]\cdot\left[\cos(x)\right]- \left[\sin(x)\right]\cdot\left[-\sin(x)\right]}{\cos^2(x)}$$

$$f^\prime(x) = \frac{\cos^2(x) + \sin^2(x)}{\cos^2(x)} = \frac{1}{\cos^2(x)} = \sec^2(x)$$

Hence, we arrive at our result;

$${\color{red}\boxed{{\color{black}\frac{d}{dx}\tan(x)=\sec^2(x)}}}\qed$$
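A quick numeric check that the derivative of tangent matches $\sec^2(x)$ at a sample point (a sketch; the helper name is my own):

```python
import math

def numeric_derivative(f, x, h=1e-7):
    return (f(x + h) - f(x)) / h

x = 0.7
approx = numeric_derivative(math.tan, x)
exact = 1 / math.cos(x)**2  # sec^2(x)
```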

Example
Find

$$\frac{d}{dx}\sin\left(kx+\phi\right)$$

Where $k$ and $\phi$ are constants.​

Solution
This is a trivial example of the chain rule. Let $u(x) = kx+\phi$, then;

$$\frac{d}{dx}\sin\left(kx+\phi\right) = \frac{d}{du}\sin\left(u\right)\cdot\frac{d}{dx}u(x)$$

$$=\cos\left(kx+\phi\right)\cdot\frac{d}{dx}\left(kx+\phi\right)$$

Hence,

$$\frac{d}{dx}\sin\left(kx+\phi\right) = k\cos\left(kx+\phi\right)$$​
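The chain-rule result above can be checked numerically with a short sketch (the constants $k$ and $\phi$ below are arbitrary sample values of my own choosing):

```python
import math

# Symmetric difference quotient as a stand-in for the limit h -> 0.
def derivative(f, x, h=1e-7):
    return (f(x + h) - f(x - h)) / (2 * h)

k, phi = 3.0, 0.25          # arbitrary constants for illustration
f = lambda x: math.sin(k * x + phi)

x = 1.2
numeric = derivative(f, x)
analytic = k * math.cos(k * x + phi)  # the chain-rule answer
print(abs(numeric - analytic) < 1e-5)
```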

$$\hrule$$
$$\hrule$$
Just a quick reminder that this thread is still open to contribution from anyone and I would be more than happy to receive contributions and/or comments and suggestions for improvements/modifications.

However, could I request that in order to maintain some form of logical order and integrity that those wishing to contribute to this thread contact me via PM before contributing.

Thank you to those who have contributed already.

Last edited:
The Derivative of Inverse Functions
Suppose we have a function $f$ such that $f^{-1}\left(f(x)\right) = x\;\;\;, \forall x \in\mathbb{R}$. That is, $f^{-1}(x)$ is the inverse of $f(x)$ for all real x. Note that the superscript '-1' is used here to denote the inverse of f and is not an exponent: $f^{-1}(x)\neq 1/f(x)$. How would we proceed if we wished to find the derivative of the inverse function?

From our statement we have,

$$f\left(f^{-1}(y)\right) = y\,,\qquad\text{where } y := f(x)$$

Now, assuming that $f$ is continuously differentiable, let us take the derivative with respect to y,

$$\frac{d}{dy}\left\{ f\left(f^{-1}(y)\right) \right\} = \frac{d}{dy}y$$

Applying the chain rule to the LHS we obtain,

$$\frac{df}{d\left(f^{-1}\right)} \cdot \frac{d}{dy}\left\{ f^{-1}(y)\right \} = 1$$

If we assume that $df/d\left(f^{-1}\right) \neq 0$ then we can divide through by $df/d\left(f^{-1}\right)$,

$$\frac{d}{dy}\left\{ f^{-1}(y)\right \} = \frac{1}{df/d\left(f^{-1}\right)}$$

Given that $f^{-1}(y) = x$, we have $df/d\left(f^{-1}\right) = df/dx$. Thus we arrive at our result,

$${\color{red}\boxed{{\color{black}\frac{d}{dy}\left\{ f^{-1}(y)\right \} = \frac{1}{df/dx}}}}\qed$$

We will use this result later when discussing the derivatives of transcendental functions.

Last edited:
Contributed by Gib Z.
Alternative Derivation of the Derivative of Inverse Functions
From the previous post: noting that since $y= f(x)$, we have $f^{-1}(y) = x$.

Therefore, we can state the final result of the previous post in the following form:

$$\frac{dx}{dy} = \frac{1}{ \frac{dy}{dx}}$$

An alternative proof for this fact is from the definition of the derivative:

If $y=f(x)$ then,

$$\frac{dy}{dx} = \lim_{x\to x_1} \frac{y-y_1}{x-x_1}$$.

Note this definition is the same as the original one in the first post of this thread
by substituting $h =x-x_1$.

From this;

$$\frac{dy}{dx} = \lim_{x\to x_1} \frac{y-y_1}{x-x_1}$$

$$= \lim_{x\to x_1} \frac{1}{ \frac{x-x_1}{y-y_1}}$$

$$= \frac{\lim_{x\to x_1} 1}{\lim_{x\to x_1} \frac{x-x_1}{y-y_1}}$$

$${\color{red}\boxed{{\color{black}\frac{dy}{dx} = \frac{1}{\frac{dx}{dy}}}}}$$

As desired.
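The inverse-function rule can be illustrated numerically with a concrete pair of inverse functions. A minimal sketch, using $f(x) = e^x$ with inverse $f^{-1}(y) = \ln y$ (my own choice of example; the helper `derivative` is likewise illustrative):

```python
import math

# Symmetric difference quotient approximating a derivative.
def derivative(f, t, h=1e-7):
    return (f(t + h) - f(t - h)) / (2 * h)

y = 5.0
x = math.log(y)                      # x = f^{-1}(y) for f = exp
lhs = derivative(math.log, y)        # d/dy f^{-1}(y)
rhs = 1.0 / derivative(math.exp, x)  # 1 / (dy/dx) evaluated at x
print(abs(lhs - rhs) < 1e-6)
```

Both sides come out to $1/5$, as the rule $dx/dy = 1/(dy/dx)$ requires.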

Last edited:
Inverse Trigonometric Derivatives (Part I of II)
Contributed by Gib Z. Edited by Hootenanny.

$$\frac{d}{dx} \sin x = \cos x$$

$$\frac{d}{dx} \cos x= -\sin x$$

$$\frac{d}{dx} \tan x = \sec^2 x$$

Using the results of the previous posts regarding the derivatives of inverse functions, we can find the derivatives of the inverse trigonometric functions fairly trivially. However, since the trigonometric functions are not one-to-one on the whole of the real numbers, we must restrict each domain to a suitable interval so that the inverse relations are well-defined functions. There are many possible choices, any of which would be suitable, but we choose specific ones for convenience. We call the inverse functions of $\sin x, x \in \left[-\frac{\pi}{2} , \frac{\pi}{2}\right]$; $\cos x, x \in \left[0,\pi\right]$ and $\tan x, x \in \left(-\frac{\pi}{2} , \frac{\pi}{2}\right)$; $\arcsin x$; $\arccos x$ and $\arctan x$ respectively.

A point to note here is that since the trigonometric functions have a restricted domain (and a restricted range), the inverse trigonometric functions will have a restricted range. A nice feature of inverse functions is that the domain of a given function becomes the range of the inverse function and vice-versa. For example, if we restrict $\cos x$ to a domain of $x \in \left[0,\pi\right]$, then the [restricted] range is $\cos x \in \left[-1,1\right]$. Hence, the corresponding inverse function $\arccos x$ has domain $x \in \left[-1,1\right]$ and range $\arccos x \in \left[0,\pi\right]$.
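This domain/range swap can be observed directly with the standard library's inverse-trig functions. A small sketch for $\arccos$ (the sample points are arbitrary):

```python
import math

# arccos maps its domain [-1, 1] into the range [0, pi],
# mirroring the restricted domain chosen for cos above.
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
values = [math.acos(x) for x in xs]

print(all(0.0 <= v <= math.pi for v in values))
print(math.acos(-1.0) == math.pi and math.acos(1.0) == 0.0)
```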

Last edited:
Inverse Trigonometric Derivatives (Part II of II)
Contributed by Gib Z. Edited by Hootenanny.

Notation
A small note here on notation, there are two common notations for denoting inverse trigonometric functions.

$$\arcsin x = \sin^{-1} x$$

$$\arccos x = \cos^{-1} x$$

$$\arctan x = \tan^{-1} x$$

It should also be noted that the exponent does not denote the reciprocal, explicitly,

$$\sin^{-1} x \neq \frac{1}{\sin x} = \mathrm{cosec}\; x$$

$$\cos^{-1} x \neq \frac{1}{\cos x} = \sec x$$

$$\tan^{-1} x \neq \frac{1}{\tan x} = \cot x$$

If we now evaluate the derivatives of these [inverse] functions:

$$y := \arcsin x \Leftrightarrow x = \sin y \Rightarrow \frac{dx}{dy} = \cos y$$

$$\frac{dy}{dx} = \frac{1}{ \frac{dx}{dy}} = \frac{1}{\cos y} = \frac{1}{\sqrt{1-\sin^2 y}} = \frac{1}{\sqrt{1-x^2}}$$

(we take the positive root since $\cos y \geq 0$ for $y \in \left[-\frac{\pi}{2},\frac{\pi}{2}\right]$).

$$y := \arccos x \Leftrightarrow x = \cos y \Rightarrow \frac{dx}{dy} = -\sin y$$

$$\frac{dy}{dx} = \frac{1}{ \frac{dx}{dy}} = \frac{1}{-\sin y} = \frac{-1}{\sqrt{1-\cos^2 y}} = \frac{-1}{\sqrt{1-x^2}}$$

(here $\sin y \geq 0$ since $y \in \left[0,\pi\right]$).

$$y := \arctan x \Leftrightarrow x = \tan y \Rightarrow \frac{dx}{dy} = \sec^2 y$$

$$\frac{dy}{dx} = \frac{1}{ \frac{dx}{dy}} = \frac{1}{ \sec^2 y} = \frac{1}{ 1+ \tan^2 y} = \frac{1}{1+x^2}$$

Note: In these derivations the "Pythagorean identities" were used several times; they can be derived by dividing each term in the identity $\sin^2 t + \cos^2 t = 1$ by $\sin^2 t$ and $\cos^2 t$ respectively. In turn, that identity can be seen from the diagram in post #24.
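All three inverse-trig derivatives above can be verified numerically in one go. A minimal sketch (the helper `derivative`, step size and sample point are my own illustrative choices):

```python
import math

# Symmetric difference quotient approximating a derivative.
def derivative(f, x, h=1e-7):
    return (f(x + h) - f(x - h)) / (2 * h)

x = 0.4  # arbitrary point inside (-1, 1)
checks = [
    (math.asin, 1.0 / math.sqrt(1 - x ** 2)),   # d/dx arcsin x
    (math.acos, -1.0 / math.sqrt(1 - x ** 2)),  # d/dx arccos x
    (math.atan, 1.0 / (1 + x ** 2)),            # d/dx arctan x
]
print(all(abs(derivative(f, x) - d) < 1e-6 for f, d in checks))
```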


Differentiation of Series
In the previous section we discussed the derivatives of inverse functions, with the goal of discussing the derivatives of transcendental functions. However, before we can discuss transcendental functions, we must first examine the derivatives of series.
(1) Taylor Series
We have already met the Taylor series of some trigonometric functions in an earlier section. We shall now discuss "Taylor's Formula", which can be used to determine the [infinite] series representation of many functions. "Taylor's Theorem" gives the conditions that a function must satisfy to be written as a Taylor series; however, it is not necessary to discuss the theorem here. We shall restrict ourselves to one-dimensional real-valued functions.
(1.1) Taylor's Formula
Given a function $f:I\to\mathbb{R}$, that is defined on some open interval $I$ and which has derivatives of all orders at a point $a\in I$, then one may represent $f(x)$ as,

$$f(x)=\sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}\left(x-a\right)^n = f(a)+f^\prime(a)\left(x-a\right)+\ldots$$​
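Taylor's formula is easy to see in action for $f(x) = e^x$ about $a = 0$, where every derivative $f^{(n)}(0) = 1$ and the series is $\sum x^n/n!$. A minimal numerical sketch (the function name `taylor_exp` and the truncation at 20 terms are my own choices):

```python
import math

# Partial sum of the Taylor series of e^x about a = 0:
# sum_{n=0}^{terms-1} x^n / n!
def taylor_exp(x, terms=20):
    return sum(x ** n / math.factorial(n) for n in range(terms))

x = 1.5
print(abs(taylor_exp(x) - math.exp(x)) < 1e-10)
```

With 20 terms the partial sum already matches `math.exp` to within $10^{-10}$ at this point, illustrating the rapid convergence of this particular series.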
(2) Term-wise Differentiation
Term-wise differentiation [and integration] is an extremely useful tool in analysis, which can be used to find the derivatives of functions that can be represented as [infinite] series. A theorem of advanced calculus states that, under suitable conditions, one can find the derivative of a [convergent] series by simply differentiating each term separately. We shall state this theorem here without proof.
(2.1) Theorem
Given an infinite series,

$$\sum_{n=1}^{\infty}f_n(x) = f_1(x)+f_2(x)+\ldots+f_n(x)+\ldots$$

whose terms have continuous derivatives on some open interval $I$ and which converges [uniformly] to $F(x)$, then if the series of differentiated terms,

$$\sum_{n=1}^{\infty}f^\prime_n(x) = f^\prime_1(x)+f^\prime_2(x)+\ldots+f^\prime_n(x)+\ldots$$

also converges uniformly on the interval $I$, the original series can be differentiated term-wise such that,

$$\frac{d}{dx}F(x) = \sum_{n=1}^{\infty}f^\prime_n(x) = f^\prime_1(x)+f^\prime_2(x)+\ldots+f^\prime_n(x)+\ldots$$
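A classic illustration of term-wise differentiation is the geometric series on $|x| < 1$, where both the series and its term-wise derivative have known closed forms: $\sum_{n=0}^{\infty} x^n = \frac{1}{1-x}$ and $\sum_{n=1}^{\infty} n x^{n-1} = \frac{1}{(1-x)^2}$. A minimal numerical sketch comparing truncated partial sums against those closed forms (the helper names and the 200-term truncation are my own choices):

```python
# Geometric series F(x) = sum x^n and its term-wise derivative
# F'(x) = sum n x^(n-1), both truncated to a finite partial sum.
def geometric(x, terms=200):
    return sum(x ** n for n in range(terms))

def geometric_diff(x, terms=200):
    return sum(n * x ** (n - 1) for n in range(1, terms))

x = 0.5  # well inside the interval of convergence |x| < 1
ok_series = abs(geometric(x) - 1 / (1 - x)) < 1e-10
ok_derivative = abs(geometric_diff(x) - 1 / (1 - x) ** 2) < 1e-10
print(ok_series and ok_derivative)
```

At $x = 0.5$ the neglected tail is astronomically small, so both partial sums match their closed forms to within the tolerance, consistent with the theorem above.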
Now we have all the necessary tools to examine the derivatives of transcendental functions, which is what we shall do in the following section.

Would it be possible to make a PDF version of all of this for reference like you did for integration?

aggfx said:
Would it be possible to make a PDF version of all of this for reference like you did for integration?
Welcome to PF aggfx,

If I have some time later in the year I may make a PDF of the final version once I've finished it. There are still a few sections that I would like to add, so there's not much point in creating a PDF at this stage.

By the way, kurdt is the author of the excellent Intro to Integration tutorial.

Last edited:
