# Approximation with derivative

Hi,

In my textbook they derive that a solution to the law of Faraday and the law of Ampère-Maxwell is an electromagnetic wave.

In one of the steps they have to calculate $$E(x+dx,t)$$ where E is the magnitude of the electric field of the wave. They say
$$E(x+dx,t) \approx E(x,t)+\frac{dE}{dx} \Big|_{t=constant}dx$$
On what theorem is this step based?

Thank you.

Gib Z
Homework Helper
The fact that a differentiable function can be approximated well near a point $a$ by its tangent line at $a$.

So basically you do a linear approximation? In my textbook they write that it's an approximation but they use it later as if it's an exact equality. Is this wrong or can you do it because dx is infinitesimally small?

That's generally known as a first order Taylor expansion. But it can be obtained from the definition of the derivative:

$$\frac{dE}{dx} = \lim_{dx \rightarrow 0}\frac{E(x+dx)-E(x)}{dx}$$

Forget about the limit and solve for E(x+dx). As an approximation, the result works as long as dx is small. The reason is that any sufficiently smooth function can be expanded in a Taylor series, which is a power series in dx. Truncating the series after the dx term drops the dx^2 term and everything beyond it, so the error for small, non-zero dx is roughly proportional to dx^2.
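That dx^2 scaling is easy to check numerically. Here's a quick sketch (my own example, not from the thread) using E(x) = sin(x) as a stand-in for the field profile, so the exact derivative is cos(x):

```python
import math

# Linear (first-order Taylor) approximation: E(x+dx) ≈ E(x) + E'(x)*dx,
# with the hypothetical choice E(x) = sin(x), E'(x) = cos(x).
def linear_approx(x, dx):
    return math.sin(x) + math.cos(x) * dx

x = 1.0
for dx in (0.1, 0.01, 0.001):
    exact = math.sin(x + dx)
    err = abs(exact - linear_approx(x, dx))
    # err / dx**2 stays roughly constant (about |sin(x)|/2 here),
    # confirming the error is O(dx^2).
    print(f"dx={dx:<6} error={err:.3e} error/dx^2={err / dx**2:.4f}")
```

Each time dx shrinks by a factor of 10, the error shrinks by a factor of about 100, which is exactly the dx^2 behavior described above.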

tiny-tim
Homework Helper
In my textbook they write that it's an approximation but they use it later as if it's an exact equality. Is this wrong or can you do it because dx is infinitesimally small?

Hi yoran!

Yes … if dx is only very small, then it's only an approximation.

But if dx is infinitesimally small, then it's exact!

Hurkyl
Staff Emeritus