# Integration of O() terms of the Taylor series

1. Feb 2, 2012

### nawidgc

Hello,

I have two functions say f1(β) and f2(β) as follows:

f1(β) = 1/(aδ^2) + 1/(bδ) + O(1) ... (1)

and

f2(β) = c + dδ + O(δ^2) ... (2)

where δ = β - η, and a, b, c, d, and η are constants. Equations (1) and (2) are the Taylor series expansions of f1(β) and f2(β) about η, respectively. I need to integrate f1(β) and f2(β) with respect to β over (-1, 1). The integration is straightforward for all terms except the O(1) term in (1) and the O(δ^2) term in (2). How do I proceed to integrate these O() terms? If anyone can guide me on this it would be extremely helpful. Many thanks for the help.
Regards,
N.
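
A minimal SymPy sketch of the setup in (2), using a hypothetical concrete stand-in f2(β) = e^δ (so c = d = 1) and an illustrative expansion point η = 1/3 inside (-1, 1): the series is truncated before the O(δ^2) term, the remaining polynomial is integrated term by term, and the dropped remainder contributes only a bounded error over the finite interval.

```python
import sympy as sp

beta = sp.symbols('beta')
eta = sp.Rational(1, 3)            # illustrative expansion point in (-1, 1)

# Hypothetical stand-in for f2: exp(beta - eta) = 1 + delta + O(delta^2)
f2 = sp.exp(beta - eta)

# Taylor expansion about eta, kept up to (but not including) delta^2
series = f2.series(beta, eta, 2)   # 1 + (beta - eta) + O((beta - eta)**2)
truncated = series.removeO()       # drop the O() term before integrating

# Integrate the truncated polynomial over (-1, 1)
approx = sp.integrate(truncated, (beta, -1, 1))   # approx = 4/3
exact = sp.integrate(f2, (beta, -1, 1))
print(approx, exact, sp.N(exact - approx))
```

The difference `exact - approx` is exactly the integral of the discarded O(δ^2) remainder, which stays finite here because the interval is finite and the remainder is bounded on it.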