Write a user-defined function that determines cos(x) using a Taylor series expansion.
Stop adding terms when the estimated error satisfies E <= 0.000001, where
S_n = S_{n-1} + a_n    (running sum of the series terms)
E = | (S_n - S_{n-1}) / S_{n-1} |
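For reference, the series being summed is the standard Taylor expansion of cosine about zero; each term can be obtained from the previous one with a simple recurrence, which avoids computing powers and factorials from scratch:

```latex
\cos x = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n}}{(2n)!},
\qquad
a_n = \frac{(-1)^n x^{2n}}{(2n)!} = -a_{n-1}\,\frac{x^2}{(2n-1)(2n)},
\quad a_0 = 1.
```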
The Attempt at a Solution
function y = cosTaylor(x)
while E >= .000001
This gives values that are far too large compared to what they should be, and I don't understand why it doesn't work.
Any help is appreciated.
Thanks in advance!