1. The problem statement, all variables and given/known data

For g = Hf = sin(f), use a Taylor expansion to determine the range of input for which the operator is approximately linear to within 10%.

2. Relevant equations

The Taylor series truncated at first order (the linearization) is the most appropriate equation.

3. The attempt at a solution

Linearizing about f = 0:

g(f) = sin(0) + f*cos(0) = f

Let g1(f) = sin(f) and g2(f) = f (at f = 0, g1(f) = g2(f)).

g1(f) = g2(f) + error
sin(f) = f + (1/10)*sin(f)
(9/10)*sin(f) = f

The only value of f I keep getting from this is f = 0. I really don't think I am doing this correctly. Any advice?
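As a numerical sanity check on the 10% criterion (separate from the Taylor argument itself), here is a short Python scan of the relative error between sin(f) and the linearization g(f) = f. The function name `first_10pct_violation` and the scan step are my own choices, not part of the problem:

```python
import math

def first_10pct_violation(step=0.001, fmax=math.pi / 2):
    """Scan f > 0 and return the first f where the linearization
    g(f) = f differs from sin(f) by more than 10%, measured
    relative to sin(f). Returns None if no violation is found."""
    f = step
    while f < fmax:
        rel_err = abs(math.sin(f) - f) / abs(math.sin(f))
        if rel_err > 0.10:
            return f
        f += step
    return None

print(first_10pct_violation())
```

If this is right, the crossing should land somewhere below f = 1 rad, which would at least tell me the answer is a nonzero range rather than the single point f = 0 that my algebra keeps giving.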