Volt
## Homework Statement

Use residue theory to establish the result:

[tex]\int^{\pi}_{0} \frac{dx}{A + B\cos x} = \frac{\pi}{\sqrt{A^2 - B^2}}[/tex]
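As a sanity check on the stated result, here is a minimal numerical sketch (assuming sample values A = 2, B = 1, which satisfy A > |B|), comparing a midpoint-rule estimate of the integral against the closed form:

```python
import math

def integrand(x, A, B):
    return 1.0 / (A + B * math.cos(x))

def midpoint_integral(A, B, n=200000):
    # Midpoint rule on [0, pi]
    h = math.pi / n
    return sum(integrand((k + 0.5) * h, A, B) for k in range(n)) * h

A, B = 2.0, 1.0  # sample values with A > |B|; other choices may break the formula
numeric = midpoint_integral(A, B)
exact = math.pi / math.sqrt(A**2 - B**2)
print(numeric, exact)  # the two values should agree closely
```

For A = 2, B = 1 both values come out near pi/sqrt(3), consistent with the formula; choosing |A| <= |B| makes the integrand singular on [0, pi], so no such check is possible there.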

## The Attempt at a Solution

So I've gotten to the point that the above integral =

[tex]\frac{1}{2} \oint_{|z|=1} \frac{-2i}{Bz^{2} + 2Az + B}\, dz[/tex]

which I know is correct. The problem is that the statement places no restrictions on A and B, yet the result I'm asked to prove clearly can't hold for arbitrary A and B; I'm just not sure what the restrictions should be.

For example, couldn't A and B be chosen so that there are no poles inside the unit circle, in which case the integral would just equal 0? Or so that there is only one pole inside the circle, or two poles, etc.?
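One way to get a feel for where the poles land is to compute them numerically. This is a small sketch (again assuming sample values A = 2, B = 1 with A > |B| > 0) that finds the roots of the denominator and checks their moduli:

```python
import cmath

def poles(A, B):
    # Roots of B*z^2 + 2A*z + B = 0 via the quadratic formula
    disc = cmath.sqrt(A * A - B * B)
    return (-A + disc) / B, (-A - disc) / B

A, B = 2.0, 1.0  # hypothetical sample values with A > |B| > 0
z1, z2 = poles(A, B)
# By Vieta's formulas the product of the roots is B/B = 1,
# so when the roots are real and distinct, exactly one lies inside |z| = 1.
print(abs(z1), abs(z2), z1 * z2)
```

The printed product is 1, and one modulus is below 1 while the other is above it, which suggests the relevant restriction: for A > |B| > 0 there is always exactly one pole inside the unit circle.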

Solving the quadratic, the roots of the denominator are [tex]z = \frac{-A \pm \sqrt{A^{2} - B^{2}}}{B}[/tex], but I have no idea what to do from here. I could rewrite the denominator of the integrand in factored form using these roots, but without knowing the values of A and B I don't see how I can possibly continue (i.e., determine whether there are even any poles inside the unit circle).
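A possible way forward, assuming the natural restriction A > |B| > 0 (so the original integrand has no singularity on [0, pi]), is to note that the product of the two roots is 1, so exactly one of them lies inside the unit circle. A sketch of the residue computation under that assumption:

```latex
% Roots of B z^2 + 2A z + B = 0 and their product (Vieta):
z_{\pm} = \frac{-A \pm \sqrt{A^2 - B^2}}{B}, \qquad z_+ z_- = \frac{B}{B} = 1.
% For A > |B| > 0 the roots are real and distinct, so exactly one
% (namely z_+, with |z_+| < 1) lies inside the unit circle. Then
\oint_{|z|=1} \frac{-2i\,dz}{B(z - z_+)(z - z_-)}
  = 2\pi i \cdot \frac{-2i}{B(z_+ - z_-)}
  = \frac{4\pi}{2\sqrt{A^2 - B^2}}
  = \frac{2\pi}{\sqrt{A^2 - B^2}},
% and the overall factor of 1/2 then yields \pi / \sqrt{A^2 - B^2}.
```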