# Approximations for e^x, sin and cos

1. Aug 14, 2004

### hedlund

I had a question on a math test which said that you should find an approximation for $$e^x$$ which is very good for $$x \approx 0$$. First I defined the function $$f(x) = e^x$$. We have the interesting property that $$f(x) = f'(x) = f''(x) = f'''(x) = \ldots \ \forall x$$, and because of this $$f(0) = f'(0) = f''(0) = f'''(0) = \ldots$$. Next I defined the function $$g(x) = ax^3+bx^2+cx+d$$ and required $$g(x) \approx f(x)$$ for $$x \approx 0$$. This leads to $$d=1$$, $$c=1$$, $$b = 1/2$$, $$a=1/6$$, so $$g(x) = x^3/6 + x^2/2 + x + 1$$. This formula is good for $$x \approx 0$$. Then I tried $$h(x) = ax^4+bx^3+cx^2+dx+e$$, which leads to $$h(x) = x^4/24 + x^3/6 + x^2/2+x+1$$ and is even better.

I found a pattern: for a degree-$$j$$ polynomial we get the formula $$\sum_{u=0}^{j} \frac{x^u}{u!}$$. Graphing $$e^x$$ against the degree-$$j$$ polynomial, we get better and better results as $$j \to \infty$$. So on my test I wrote down that a good approximation for $$e^x$$ for $$x \approx 0$$ would be $$\sum_{u=0}^{\infty} \frac{x^u}{u!}$$. I've just started calculus and that stuff, so I don't know if this is the answer my teacher wanted. I only know about factorials and sums because I got to study discrete math instead of psychology.

Using the same technique I used for $$e^x$$, I found formulas for $$\cos{x}$$ and $$\sin{x}$$: $$\sin{x} \approx \sum_{u=0}^{\infty} \frac{ \left( - 1 \right)^u \cdot x^{2u+1}}{ \left( 2u+1 \right)!}$$ and $$\cos{x} \approx \sum_{u=0}^{\infty} \frac{\left(-1 \right)^u \cdot x^{2u}}{ \left( 2u \right)!}$$. Are these formulas used for anything? And most important, are they correct? When I graph them they seem to be.
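The three series above are easy to check numerically. A minimal sketch (not from the original post; the truncation at 15 terms is an arbitrary choice) comparing the partial sums against the math-library functions:

```python
import math

def exp_series(x, terms=15):
    # sum_{u=0}^{terms-1} x^u / u!
    return sum(x**u / math.factorial(u) for u in range(terms))

def sin_series(x, terms=15):
    # sum_{u=0}^{terms-1} (-1)^u x^(2u+1) / (2u+1)!
    return sum((-1)**u * x**(2*u + 1) / math.factorial(2*u + 1)
               for u in range(terms))

def cos_series(x, terms=15):
    # sum_{u=0}^{terms-1} (-1)^u x^(2u) / (2u)!
    return sum((-1)**u * x**(2*u) / math.factorial(2*u)
               for u in range(terms))

# errors shrink rapidly; near x = 0 even a few terms suffice
for x in (0.1, 1.0, 2.0):
    print(x,
          abs(exp_series(x) - math.exp(x)),
          abs(sin_series(x) - math.sin(x)),
          abs(cos_series(x) - math.cos(x)))
```

With 15 terms the error at x = 1 is already far below double precision noise, which matches what the graphs suggest.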

2. Aug 14, 2004

### HallsofIvy

What you have are the "Taylor series" for e^x, sin(x), and cos(x).
If you got those without knowing the general formula to start with, I'm impressed!

3. Aug 14, 2004

### hedlund

I only used that the derivative of e^x is e^x, the derivative of sin(x) is cos(x), and that the derivative of cos(x) is -sin(x). I used systems of equations to get approximations with third- and fourth-degree polynomials and thought I saw a pattern; I plotted it and it seemed to match. These "Taylor series", are they of any use to anyone, and are my formulas correct?

4. Aug 14, 2004

### arildno

They are supremely important, and yes, the formulas (the infinite series) are correct for ANY choice of x (not only small).

Last edited: Aug 14, 2004
5. Aug 14, 2004

### JonF

You can use the series for e^x to approximate things like the integral of e^(-x^2).

The series of sin and cos were used to make the trig tables before calculators.

6. Aug 17, 2004

### hedlund

I had math today, and after the lesson was over my teacher told me to stay in class. She said that she had started correcting the tests, and that she was impressed with the formulas. So I guess that is good ...

7. Aug 17, 2004

### Dburghoff

I must say that I'm impressed that you did that without knowing about Taylor series. Basically, a Taylor series is an approximation of a function by a polynomial with infinitely many terms. They can be centered around various numbers, and are of the following form, where a is the center of the series:

$$f(x) = f(a) + f'(a)(x-a) + \frac{f''(a)(x-a)^2}{2!} + \frac{f'''(a)(x-a)^3}{3!}+...+\frac{f^{(n)}(a)(x-a)^n}{n!}+...$$

Using zero as the center, as you did, is a special case also known as a Maclaurin series. Though not all Taylor series converge for all values, the ones for e^x, sin x, and cos x do. In fact, you can easily prove de Moivre's theorem using Taylor series.
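The general centered form above can be tried out directly for e^x, whose n-th derivative at any center a is just e^a. A sketch (the center a = 1 and 10 terms are arbitrary illustrative choices):

```python
import math

def taylor_exp(x, a, terms=10):
    # e^x expanded around a: sum_{n} e^a * (x - a)^n / n!
    ea = math.exp(a)
    return sum(ea * (x - a)**n / math.factorial(n) for n in range(terms))

# centered at a = 1, evaluated at a nearby point
print(taylor_exp(1.5, a=1.0), math.exp(1.5))
```

The closer x is to the center a, the fewer terms are needed, which is the sense in which the approximation is "very good near" the center.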

8. Aug 18, 2004

### hedlund

What is de Moivre's theorem? Never heard of it; maybe I've worked with it without knowing it was de Moivre's theorem ...

9. Aug 18, 2004

### Muzza

de Moivre's theorem: (cos(x) + i sin(x))^n = cos(nx) + i sin(nx). It can also be proved easily using Euler's formula...
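The identity is easy to verify numerically with complex arithmetic. A quick sketch (the test angle 0.7 is an arbitrary choice):

```python
import math

def de_moivre_gap(x, n):
    # | (cos x + i sin x)^n  -  (cos nx + i sin nx) |
    lhs = complex(math.cos(x), math.sin(x))**n
    rhs = complex(math.cos(n * x), math.sin(n * x))
    return abs(lhs - rhs)

# the gap is zero up to floating-point rounding for every integer n
for n in range(1, 8):
    print(n, de_moivre_gap(0.7, n))
```

This is only a numerical check for integer n, of course, not a proof.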

10. Aug 19, 2004

### Dburghoff

Which is most easily proven with Taylor series.

Last edited: Aug 19, 2004
11. Aug 20, 2004

### HallsofIvy

Actually, (cos(x) + i sin(x))^n = cos(nx) + i sin(nx) can be proved directly without using Taylor series or Euler's formula:

(cos(x) + i sin(x))^2 = cos^2(x) + 2i sin(x)cos(x) - sin^2(x) = (cos^2(x) - sin^2(x)) + i(2 sin(x)cos(x)),
which is equal to cos(2x) + i sin(2x) by trig identities. The general formula can be done by induction.

14. Aug 20, 2004

### Dburghoff

That only proves de Moivre's theorem for natural numbers, though. I believe (and correct me if I'm wrong) that you must use Taylor series / Euler's formula to prove it for all reals.

Last edited: Aug 20, 2004
15. Aug 25, 2004

### gvk

Can you reveal a bit about how you got the coefficients of the expansion? It is the most important part of your post, but it is not quite clear.
Thanks

16. Aug 31, 2004

### hedlund

Sure. I will show the method for the approximation h(x) = ax^3 + bx^2 + cx + d. We have f(x) = e^x, and f'(x) = e^x, f''(x) = e^x and so on. We want h(0) = f(0), h'(0) = f'(0), h''(0) = f''(0) and so on.

First we conclude that f(0) = 1, so h(0) must equal 1, which gives d = 1 and h(x) = ax^3 + bx^2 + cx + 1. Next we have h'(x) = 3ax^2 + 2bx + c, and h'(0) should equal 1, which gives c = 1, so h(x) = ax^3 + bx^2 + x + 1. Next we have h''(x) = 3*2*ax + 2b, so h''(0) = 1 gives b = 1/2, and h(x) = ax^3 + x^2/2 + x + 1. Finally we get h'''(x) = 3*2*a, so h'''(0) = 1 => a = 1/6, and h(x) = x^3/6 + x^2/2 + x + 1.

I tried a fourth-degree polynomial on my test and saw that j(x) = x^4/24 + x^3/6 + x^2/2 + x + 1. So if we want v(x) to be a very good approximation of e^x, it's obvious that $$v(x) = \sum_{n=0}^{j} s_n x^n$$ where the coefficient s_n depends on n. I saw the denominators 1, 1, 2, 6, 24, which is n! for n = 0, 1, 2, 3, 4. So I concluded that $$v(x) = \lim_{j \to \infty} \sum_{n=0}^{j} \frac{x^n}{n!}$$. Hope that explains it; if my text is a mess, it's because I've worked for 14 hours.
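The matching procedure above can be checked mechanically: represent a polynomial as its coefficient list, differentiate repeatedly, and read off the value at 0 each time. A sketch using exact fractions (the coefficient list is the degree-4 answer from the post):

```python
from fractions import Fraction

def derivative(coeffs):
    # coeffs[i] is the coefficient of x^i; return coefficients of h'(x)
    return [Fraction(i) * c for i, c in enumerate(coeffs)][1:]

# j(x) = 1 + x + x^2/2 + x^3/6 + x^4/24, coefficients in ascending order
j = [Fraction(1), Fraction(1), Fraction(1, 2), Fraction(1, 6), Fraction(1, 24)]

# every derivative of e^x at 0 equals 1, so each constant term should be 1
p = j
for k in range(5):
    print(k, p[0])  # p[0] is the value of the k-th derivative at x = 0
    p = derivative(p)
```

Each line prints 1, confirming that matching h(0), h'(0), ..., h''''(0) to e^0 = 1 forces exactly the coefficients 1, 1, 1/2, 1/6, 1/24.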

Last edited: Aug 31, 2004