
- Homework Statement
- Taylor's theorem states that for any reasonable function ##f(x)##, the value of ##f## at a point ##(x + \delta)## can be expressed as an infinite series involving ##f## and its derivatives at the point ##x##:

$$f(x+\delta) = f(x) + f'(x)\delta +\frac{1}{2!}f''(x)\delta^2+\frac{1}{3!}f'''(x)\delta^3 + \cdots$$ Find the Taylor series for ##\ln(1+\delta)##.

- Relevant Equations
- See above

I'm just trying to understand how this works. What I've been looking at online seems to indicate that I should evaluate the derivatives at ##\delta = 0##, but that seems to contradict the given equation for the Taylor series, since every derivative term is multiplied by some power of ##\delta##. Could someone walk me through at least the first derivative term?
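One way to convince yourself that the general formula works, without spoiling the ##\ln(1+\delta)## problem itself, is a quick numerical check. The sketch below (the choice of ##f(x) = e^x## and the expansion point ##x = 1## are arbitrary, picked only because every derivative of ##e^x## is ##e^x##) compares a truncated Taylor sum against the exact value:

```python
import math

# Sanity check of Taylor's theorem for a sample function f(x) = e^x,
# expanded about x = 1. Every derivative of e^x equals e^x, so each
# term in the series is just e^x * delta^n / n!.
x, delta = 1.0, 0.1
f_at_x = math.exp(x)  # f(x) and all of its derivatives at x

# Partial sum f(x) + f'(x)*delta + f''(x)*delta^2/2! + f'''(x)*delta^3/3!
approx = sum(f_at_x * delta**n / math.factorial(n) for n in range(4))
exact = math.exp(x + delta)

# The truncation error after the delta^3 term is of order delta^4,
# so the difference should be tiny for small delta.
print(abs(exact - approx))
```

Note that the derivatives are all evaluated at the expansion point ##x##, while ##\delta## only appears as the powers multiplying them; that is the distinction the question is asking about.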