1. The problem statement, all variables and given/known data

I have to give the range of validity for a Taylor series built from an expression of the form:

(1 + (a/b)x)^c

2. Relevant equations

3. The attempt at a solution

Obviously the validity cannot extend to x = -(b/a), where the base 1 + (a/b)x vanishes. Should I then state that the series is valid for

-(b/a) < x < (b/a)?

My reason is that all the standard Taylor series I've seen have a symmetric interval of validity. But I can't see why this approximation shouldn't be valid for all real numbers with the exception of x = -(b/a).

Any help greatly appreciated.
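(Not part of the original post: a minimal numerical sketch of the question, using arbitrary illustrative values a, b, c. It compares partial sums of the binomial series for (1 + (a/b)x)^c against the exact value, for x inside and outside |x| < b/a, including a point on the positive side beyond b/a.)

```python
from math import isfinite

# Illustrative values only; the original post keeps a, b, c general.
a, b, c = 2.0, 3.0, 0.5

def exact(x):
    """The function being expanded, (1 + (a/b)x)^c."""
    return (1.0 + (a / b) * x) ** c

def partial_sum(x, n_terms):
    """Sum the first n_terms of the binomial series sum_k C(c,k) ((a/b)x)^k."""
    total = 0.0
    coeff = 1.0            # generalized binomial coefficient C(c, k), built up by recurrence
    t = (a / b) * x
    power = 1.0            # t**k
    for k in range(n_terms):
        total += coeff * power
        coeff *= (c - k) / (k + 1)
        power *= t
    return total

# Test points: well inside the interval, near its edge, and beyond it on the positive side.
for x in (0.5 * b / a, 0.99 * b / a, 1.5 * b / a):
    for n in (10, 50, 200):
        s = partial_sum(x, n)
        err = abs(s - exact(x)) if isfinite(s) else float("inf")
        print(f"x = {x:6.3f}, n = {n:4d}, partial sum = {s: .6e}, |error| = {err:.3e}")
```

Running this with the assumed values shows the error shrinking with n for the first two points and growing for the third, which is the behaviour the question is asking about.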