Hello,

I have two functions, say f1(β) and f2(β), as follows:

f1(β)=1/(aδ^2) + 1/(bδ) + O(1) ... (1)

and

f2(β)= c+dδ+O(δ^2) ... (2)

where δ = β − η, and a, b, c, d, and η are constants. Eqs. (1) and (2) are the Taylor series expansions of f1(β) and f2(β) about η, respectively. I need to integrate f1(β) and f2(β) with respect to β over (−1, 1). The integration is straightforward for every term except the O(1) term in (1) and the O(δ^2) term in (2). How do I proceed with integrating the O() terms? If anyone can guide me on this, it would be extremely helpful. Many thanks for the help.
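To make the question concrete, here is a small SymPy sketch of what happens to an O(δ^2) term like the one in Eq. (2) under integration over a finite interval: expanding a function about η, dropping the O() term, and integrating the truncation leaves a bounded error equal to the integral of the remainder. The example function exp(β − η) and the value η = 1/3 are my own illustrative assumptions, not from the question.

```python
import sympy as sp

beta = sp.symbols('beta')

# Hypothetical stand-in for f2: f2(beta) = exp(beta - eta), with eta = 1/3
# chosen inside (-1, 1) purely for illustration.
eta_val = sp.Rational(1, 3)
f2 = sp.exp(beta - eta_val)

# Taylor expansion about beta = eta to the same order as Eq. (2):
# c + d*delta + O(delta^2), with delta = beta - eta.
ser = sp.series(f2, beta, eta_val, 2)   # 1 + (beta - 1/3) + O((beta - 1/3)**2)
truncated = ser.removeO()               # drop the O(delta^2) term before integrating

# Exact integral versus integral of the truncated series over (-1, 1).
exact = sp.integrate(f2, (beta, -1, 1))
approx = sp.integrate(truncated, (beta, -1, 1))   # = 4/3

# The difference is exactly the integral of the discarded remainder:
# a finite number, since the O(delta^2) remainder is bounded on (-1, 1).
err = sp.Abs(exact - approx)
```

The point of the sketch: over a finite interval, an O(δ^2) remainder integrates to something bounded (here |err| ≈ 0.35), so the truncated integral carries an error of the same order as the discarded term times the interval length. The O(1) term in Eq. (1) is trickier, since O(1) only says the remainder is bounded near η, not what it equals, so its integral cannot be computed exactly without more information about f1.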

Regards,

N.

**Physics Forums | Science Articles, Homework Help, Discussion**


# Integration of O() terms of the Taylor series
