My prof has inflicted his infinite wisdom on us: calculators are stupid because they can only do the particular things they're programmed to do. Based on that principle, he's also inflicted the following problem on us, and my groupmates and I have no idea how to even get started. Any help would be appreciated. Thanks!

Now, teach your calculator to divide. We do this by teaching it to take 1/x for all x ≠ 0. It is easy to teach it certain fractions: 1/10 = 0.1, 1/100 = 0.01, and in general 1/(10^n) = 0.00...01, with n−1 zeros after the decimal point.

A) Assuming your calculator knows what 1/(10^n) is, find a Taylor series to help your calculator approximate 1/x for any x ≠ 0. Justify convergence.

B) Set up an inequality that will help your calculator approximate 1/a to the nearest ten thousandth. Find the number of terms necessary for a = 53 and a = 122.
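For what it's worth, here's one guess at what the prof intends (I can't be sure this is the series he has in mind): scale a by a known power of ten so that a/10^n lands in (0, 1], write a/10^n = 1 − r with 0 ≤ r < 1, and then 1/a = (1/10^n) · Σ r^k, which converges because |r| < 1. The leftover tail after N terms is r^N/a, which gives the inequality for part B. A quick Python sketch of that idea (the function name and stopping rule are mine, not from the problem):

```python
def reciprocal(a, tol=5e-5):
    """Approximate 1/a for a > 0 to within tol, using only +, *,
    and the values 1/10**n that the calculator already 'knows'."""
    # Find n so that a <= 10**n, i.e. a/10**n lies in (0, 1].
    n = 0
    while 10**n < a:
        n += 1
    scale = 10.0**-n          # the known value 1/10**n
    r = 1.0 - a * scale       # 0 <= r < 1, so the geometric series converges
    total, term, terms_used = 0.0, scale, 0
    while True:
        total += term         # add the term (1/10**n) * r**k
        terms_used += 1
        term *= r
        # Remaining tail of the series is term / (1 - r) = r**k / a;
        # stop once it is within the desired tolerance.
        if r == 0.0 or term / (1.0 - r) <= tol:
            break
    return total, terms_used
```

With tol = 5e-5 (nearest ten thousandth), this reports how many terms the series needs for a = 53 versus a = 122; the larger r = 1 − a/10^n is, the slower the convergence, which is why 122 (r ≈ 0.878) takes far more terms than 53 (r = 0.47).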