
Integration of O() terms of the Taylor series

  1. Feb 2, 2012 #1

    I have two functions say f1(β) and f2(β) as follows:

f1(β) = 1/(aδ^2) + 1/(bδ) + O(1) ... (1)


f2(β) = c + dδ + O(δ^2) ... (2)

    where δ = β − η, and a, b, c, d, and η are constants. Equations (1) and (2) are the Taylor series expansions of f1(β) and f2(β) about η, respectively. I need to integrate f1(β) and f2(β) with respect to β over (−1, 1). The integration is straightforward for every term except the O(1) term in (1) and the O(δ^2) term in (2). How do I proceed in integrating the O() terms? Any guidance would be extremely helpful. Many thanks.
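    For context, the standard convention from asymptotic analysis (not stated in the thread itself, but the usual rule the question is touching on) is that a big-O remainder can be integrated term by term, gaining one power of the small variable. A sketch of the statement, with C and δ₀ denoting the constants implicit in the O() notation:

    \[
    R(\delta) = O(\delta^2) \ \ (\delta \to 0)
    \quad\Longleftrightarrow\quad
    |R(\delta)| \le C\,\delta^2 \ \text{ for } |\delta| \le \delta_0,
    \]
    \[
    \left|\int_0^{x} R(\delta)\,\mathrm{d}\delta\right|
    \le \int_0^{|x|} C\,\delta^2\,\mathrm{d}\delta
    = \frac{C}{3}\,|x|^3
    = O(x^3) \quad (x \to 0).
    \]

    Note this only controls the integral near δ = 0; over a fixed interval such as (−1, 1), the O() term by itself contributes an unknown but bounded amount unless more is known about the remainder.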
