Integration of O() terms of the Taylor series

nawidgc
Hello,

I have two functions, say ##f_1(\beta)## and ##f_2(\beta)##, as follows:

##f_1(\beta)=\frac{1}{a\delta^2} + \frac{1}{b\delta} + O(1)## ... (1)

and

##f_2(\beta)= c + d\delta + O(\delta^2)## ... (2)



where ##\delta = \beta-\eta## and ##a, b, c, d## and ##\eta## are constants. Equations (1) and (2) are the Taylor series expansions of ##f_1(\beta)## and ##f_2(\beta)## about ##\eta##, respectively. I need to integrate ##f_1(\beta)## and ##f_2(\beta)## with respect to ##\beta## over ##(-1,1)##. The integration is straightforward for all terms except the ##O(1)## term in (1) and the ##O(\delta^2)## term in (2). How do I proceed to integrate the ##O(\cdot)## terms? Any guidance would be extremely helpful. Many thanks.
Regards,
N.
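For reference, a minimal sketch of the "straightforward" part, assuming ##\eta## lies outside ##[-1,1]## so that the integrands are finite on the whole interval (otherwise the first two integrals diverge at ##\beta=\eta##):

$$\int_{-1}^{1}\frac{d\beta}{a(\beta-\eta)^2}=\frac{1}{a}\left[\frac{-1}{\beta-\eta}\right]_{-1}^{1}=\frac{1}{a}\,\frac{2}{\eta^2-1},\qquad
\int_{-1}^{1}\frac{d\beta}{b(\beta-\eta)}=\frac{1}{b}\ln\left|\frac{1-\eta}{1+\eta}\right|,$$

$$\int_{-1}^{1}\big(c+d(\beta-\eta)\big)\,d\beta = 2c-2d\eta.$$

The question below is about what can be said for the remaining ##O(1)## and ##O(\delta^2)## terms.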
 
The basic problem is that ##O(\cdot)## is unspecified, i.e. we do not know its exact value. I do not see how a definite integral would make sense here, as any value consistent with the bound can result from it. For the generic behavior we have:

##|O(1)| \leq C_1## means the remainder is bounded by a constant, so its contribution to the integral over ##(-1,1)## is at most ##2C_1## in absolute value.
##|O(\delta^2)| \leq C_2\delta^2 = C_2(\beta-\eta)^2## yields only an upper bound, so integrating this term gives something bounded by a constant ##C_3##, not an exact value.
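A minimal worked bound, assuming the remainder estimate ##|O(\delta^2)|\le C_2\delta^2## holds uniformly on ##(-1,1)## (a stronger assumption than the purely local Taylor statement):

$$\left|\int_{-1}^{1} O\big((\beta-\eta)^2\big)\,d\beta\right| \;\le\; C_2\int_{-1}^{1}(\beta-\eta)^2\,d\beta \;=\; \frac{C_2}{3}\Big[(1-\eta)^3+(1+\eta)^3\Big],$$

and similarly ##\left|\int_{-1}^{1} O(1)\,d\beta\right| \le 2C_1##. The bounds depend on the unknown constants ##C_1, C_2##, which is why the definite integrals cannot be pinned down from the truncated series alone.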
 