Approximations for e^x, sin and cos

hedlund
I had a question on a math test which said that you should find an approximation for e^x which is very good for x \approx 0. First I defined the function f(x) = e^x. We have the interesting property that f(x) = f'(x) = f''(x) = f'''(x) = \ldots \ \forall x, and because of this f(0) = f'(0) = f''(0) = f'''(0) = \ldots. Next I defined the function g(x) = ax^3+bx^2+cx+d and required g(x) \approx f(x) for x \approx 0. This leads to d=1, c=1, b = 1/2, a=1/6, so g(x) = x^3/6 + x^2/2 + x + 1. This formula is good for x \approx 0. Then I tried h(x) = ax^4+bx^3+cx^2+dx+e, which leads to h(x) = x^4/24 + x^3/6 + x^2/2+x+1, which is better. I found a pattern: for a degree-j polynomial the formula is \sum_{u=0}^{j} \frac{x^u}{u!}. Graphing e^x against the degree-j polynomial, we get better and better results as j \to \infty. So on my test I wrote down that a good approximation for e^x for x \approx 0 would be \sum_{u=0}^{\infty} \frac{x^u}{u!}. I've just started calculus and that stuff, so I don't know if this is the answer my teacher wanted; I only know about factorials and sums because I got to study discrete math instead of psychology. Using the same technique I used for e^x, I found formulas for \cos{x} and \sin{x}: \sin{x} \approx \sum_{u=0}^{\infty} \frac{ \left( - 1 \right)^u \cdot x^{2u+1}}{ \left( 2u+1 \right)!} and \cos{x} \approx \sum_{u=0}^{\infty} \frac{\left(-1 \right)^u \cdot x^{2u}}{ \left( 2u \right)!}. Are these formulas used for anything? And most important, are they correct? When I graph them they seem to be correct.
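As a quick sanity check of these series (this sketch is not part of the original post; the helper names are mine), one can compare the partial sums against the standard library functions in Python:

```python
import math

def exp_series(x, terms):
    """Partial sum of sum_{u=0}^{terms-1} x^u / u!."""
    return sum(x**u / math.factorial(u) for u in range(terms))

def sin_series(x, terms):
    """Partial sum of sum_{u=0}^{terms-1} (-1)^u x^(2u+1) / (2u+1)!."""
    return sum((-1)**u * x**(2*u + 1) / math.factorial(2*u + 1) for u in range(terms))

def cos_series(x, terms):
    """Partial sum of sum_{u=0}^{terms-1} (-1)^u x^(2u) / (2u)!."""
    return sum((-1)**u * x**(2*u) / math.factorial(2*u) for u in range(terms))

x = 0.5
for n in (2, 4, 8):
    # Errors of the degree-limited partial sums against the exact values
    print(n, exp_series(x, n) - math.exp(x),
             sin_series(x, n) - math.sin(x),
             cos_series(x, n) - math.cos(x))
```

At x = 0.5 the errors should shrink rapidly as more terms are included, which matches the graphical observation in the post.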
 
What you have are the "Taylor series" for e^x, sin(x), and cos(x).
If you got those without knowing the general formula to start with, I'm impressed!
 
HallsofIvy said:
What you have are the "Taylor series" for e^x, sin(x), and cos(x).
If you got those without knowing the general formula to start with, I'm impressed!

I only used that the derivative of e^x is e^x, the derivative of sin(x) is cos(x), and that the derivative of cos(x) is -sin(x). I used systems of equations to get the approximations for the third- and fourth-degree polynomials and thought I saw a pattern; I plotted it and it seemed to fit. These "Taylor series", are they of any use to anyone, and are my formulas correct?
 
They are supremely important, and yes, the formulas (the infinite series) are correct for ANY choice of x (not only small).
 
You can use the series for e^x to approximate things like the integral of e^{-x^2}.

The series of sin and cos were used to make the trig tables before calculators.
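To make the first remark concrete, here is a rough sketch (mine, not from the thread) that integrates the series for e^{-x^2} term by term from 0 to t and compares the result with the closed form in terms of the error function:

```python
import math

def int_exp_neg_x2(t, terms=20):
    """Approximate the integral of e^{-x^2} from 0 to t by integrating
    the series e^{-x^2} = sum_{n>=0} (-1)^n x^{2n} / n! term by term:
    each term x^{2n}/n! integrates to t^{2n+1} / ((2n+1) n!)."""
    return sum((-1)**n * t**(2*n + 1) / ((2*n + 1) * math.factorial(n))
               for n in range(terms))

t = 1.0
series_value = int_exp_neg_x2(t)
exact_value = math.sqrt(math.pi) / 2 * math.erf(t)  # closed form via the error function
print(series_value, exact_value)                    # the two values agree to many digits
```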
 
I had math today, and after the lesson was over my teacher told me to stay in class. She said that she had started correcting the tests, and that she was impressed with the formulas. So I guess that is good ... :smile:
 
I must say that I'm impressed with your doing that without knowing about Taylor series. Basically, a Taylor series is a representation of a function as an infinite polynomial, whose partial sums give better and better approximations. They can be centered around various numbers, and are of the following form, where a is the center of the series:

f(x) = f(a) + f'(a)(x-a) + \frac{f''(a)(x-a)^2}{2!} + \frac{f'''(a)(x-a)^3}{3!}+...+\frac{f^{(n)}(a)(x-a)^n}{n!}+...

Using zero as a center, as you did, is a special case also known as a Maclaurin series. Though not all Taylor series converge for all values, the ones for e^x, sin x, and cos x do. In fact, you can easily prove de Moivre's Theorem using Taylor series.
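For readers who want to experiment, here is a small sketch (my own; it assumes the sympy library, and taylor_poly is a hypothetical helper name) that builds the Taylor polynomial about x = a directly from the formula above:

```python
import sympy as sp

x = sp.symbols('x')

def taylor_poly(f, a, degree):
    """Taylor polynomial of f about x = a:
    sum_{n=0}^{degree} f^{(n)}(a) (x - a)^n / n!"""
    return sum(sp.diff(f, x, n).subs(x, a) / sp.factorial(n) * (x - a)**n
               for n in range(degree + 1))

print(sp.expand(taylor_poly(sp.exp(x), 0, 4)))  # x**4/24 + x**3/6 + x**2/2 + x + 1
print(sp.expand(taylor_poly(sp.sin(x), 0, 5)))  # x**5/120 - x**3/6 + x
```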
 
Dburghoff said:
I must say that I'm impressed with your doing that without knowing about Taylor series. Basically, a Taylor series is a representation of a function as an infinite polynomial, whose partial sums give better and better approximations. They can be centered around various numbers, and are of the following form, where a is the center of the series:

f(x) = f(a) + f'(a)(x-a) + \frac{f''(a)(x-a)^2}{2!} + \frac{f'''(a)(x-a)^3}{3!}+...+\frac{f^{(n)}(a)(x-a)^n}{n!}+...

Using zero as a center, as you did, is a special case also known as a Maclaurin series. Though not all Taylor series converge for all values, the ones for e^x, sin x, and cos x do. In fact, you can easily prove de Moivre's Theorem using Taylor series.

What is de Moivre's Theorem? Never heard of it; maybe I've worked with it without knowing that it was de Moivre's Theorem ...
 
de Moivre's theorem: (cos(x) + i sin(x))^n = cos(nx) + i sin(nx). It can also be proved easily using Euler's formula...
 
  • #10
Muzza said:
de Moivre's theorem: (cos(x) + i sin(x))^n = cos(nx) + i sin(nx). It can also be proved easily using Euler's formula...
Which is most easily proven with Taylor series. :wink:
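For reference, the calculation being alluded to (not written out in the thread) goes like this: substituting ix into the exponential series and splitting even and odd powers gives Euler's formula,

e^{ix} = \sum_{n=0}^{\infty} \frac{(ix)^n}{n!} = \sum_{u=0}^{\infty} \frac{(-1)^u x^{2u}}{(2u)!} + i \sum_{u=0}^{\infty} \frac{(-1)^u x^{2u+1}}{(2u+1)!} = \cos{x} + i \sin{x},

and de Moivre's theorem for a real exponent n then follows from \left( e^{ix} \right)^n = e^{inx}.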
 
  • #12
Actually, (cos(x) + i sin(x))^n = cos(nx) + i sin(nx) can be proved directly without using Taylor series or Euler's formula:

(cos(x) + i sin(x))^2 = cos^2(x) + 2i sin(x)cos(x) - sin^2(x) = (cos^2(x) - sin^2(x)) + i(2 sin(x) cos(x)),
which is equal to cos(2x) + i sin(2x) by trig identities. The general formula can be done by induction.
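A quick numerical spot check of the identity (this snippet is mine, not from the thread), using Python's built-in complex numbers:

```python
import math

x, n = 0.7, 5
lhs = (math.cos(x) + 1j * math.sin(x)) ** n   # (cos x + i sin x)^n
rhs = math.cos(n * x) + 1j * math.sin(n * x)  # cos(nx) + i sin(nx)
print(abs(lhs - rhs))                         # ~1e-16: equal up to rounding error
```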
 
  • #14
HallsofIvy said:
Actually, (cos(x) + i sin(x))^n = cos(nx) + i sin(nx) can be proved directly without using Taylor series or Euler's formula:

(cos(x) + i sin(x))^2 = cos^2(x) + 2i sin(x)cos(x) - sin^2(x) = (cos^2(x) - sin^2(x)) + i(2 sin(x) cos(x)),
which is equal to cos(2x) + i sin(2x) by trig identities. The general formula can be done by induction.
That only proves De Moivre's for natural numbers, though. I believe (and correct me if I'm wrong) that you must use Taylor series / Euler's formula to prove it for all reals.
 
  • #15
hedlund said:
I had a question on a math test which said that you should find an approximation for e^x which is very good for x \approx 0. First I defined the function f(x) = e^x. We have the interesting property that f(x) = f'(x) = f''(x) = f'''(x) = \ldots \ \forall x, and because of this f(0) = f'(0) = f''(0) = f'''(0) = \ldots. Next I defined the function g(x) = ax^3+bx^2+cx+d and required g(x) \approx f(x) for x \approx 0. This leads to d=1, c=1, b = 1/2, a=1/6, so g(x) = x^3/6 + x^2/2 + x + 1. This formula is good for x \approx 0. Then I tried h(x) = ax^4+bx^3+cx^2+dx+e, which leads to h(x) = x^4/24 + x^3/6 + x^2/2+x+1, which is better. I found a pattern: for a degree-j polynomial the formula is \sum_{u=0}^{j} \frac{x^u}{u!}. Graphing e^x against the degree-j polynomial, we get better and better results as j \to \infty. So on my test I wrote down that a good approximation for e^x for x \approx 0 would be \sum_{u=0}^{\infty} \frac{x^u}{u!}. I've just started calculus and that stuff, so I don't know if this is the answer my teacher wanted; I only know about factorials and sums because I got to study discrete math instead of psychology. Using the same technique I used for e^x, I found formulas for \cos{x} and \sin{x}: \sin{x} \approx \sum_{u=0}^{\infty} \frac{ \left( - 1 \right)^u \cdot x^{2u+1}}{ \left( 2u+1 \right)!} and \cos{x} \approx \sum_{u=0}^{\infty} \frac{\left(-1 \right)^u \cdot x^{2u}}{ \left( 2u \right)!}. Are these formulas used for anything? And most important, are they correct? When I graph them they seem to be correct.

Can you explain a bit more how you get the coefficients of the expansion? It is the most important part of your post, but it is not quite clear.
Thanks
 
  • #16
Sure. I will show the method for the approximation h(x) = ax^3 + bx^2 + cx + d. We have f(x) = e^x, and f'(x) = e^x, f''(x) = e^x, and so on. We want h(x) \approx f(x) for x \approx 0, so we match h(0) = f(0), h'(0) = f'(0), h''(0) = f''(0), and so on. First, f(0) = 1, so h(0) must equal 1, which gives d = 1 and h(x) = ax^3+bx^2+cx+1. Next, h'(x) = 3ax^2+2bx+c, and h'(0) should equal 1, which gives c = 1, so h(x) = ax^3+bx^2+x+1. Next, h''(x) = 3 \cdot 2 \cdot ax + 2b, so h''(0) = 1 gives b = 1/2, and h(x) = ax^3+x^2/2+x+1. Finally, h'''(x) = 3 \cdot 2 \cdot a, so h'''(0) = 1 gives a = 1/6, and h(x) = x^3/6 + x^2/2 + x + 1. I tried a fourth-degree polynomial on my test and got j(x) = x^4/24 + x^3/6 + x^2/2 + x + 1. So if we want v(x) to be a very good approximation of e^x, it is natural to write v(x) = \sum_{n=0}^{j} x^n \cdot s where s should depend on n. I saw that the denominators 1, 1, 2, 6, 24 are n! for n = 0, 1, 2, 3, 4, so I concluded that v(x) = \lim_{j \to \infty} \sum_{n=0}^{j} \frac{x^n}{n!}. Hope that explains it; if my text is a mess, it's because I've worked for 14 hours.
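Here is the same bookkeeping written out symbolically (a sketch of my own; it assumes the sympy library, and match_at_zero is a hypothetical helper name), solving for the coefficients as a linear system exactly as described above:

```python
import sympy as sp

x = sp.symbols('x')

def match_at_zero(f, degree):
    """Find the polynomial of the given degree whose derivatives at 0 match f's,
    i.e. solve h(0) = f(0), h'(0) = f'(0), ... as a linear system, as in the post."""
    coeffs = sp.symbols(f'c0:{degree + 1}')                      # c0, c1, ..., c_degree
    h = sum(c * x**k for k, c in enumerate(coeffs))              # general polynomial
    eqs = [sp.Eq(sp.diff(h, x, k).subs(x, 0), sp.diff(f, x, k).subs(x, 0))
           for k in range(degree + 1)]                           # match each derivative at 0
    sol = sp.solve(eqs, coeffs)
    return h.subs(sol)

print(sp.expand(match_at_zero(sp.exp(x), 4)))  # x**4/24 + x**3/6 + x**2/2 + x + 1
```

Raising the degree reproduces the pattern \sum_{n} \frac{x^n}{n!} term by term, as found above.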
 