1. The problem statement, all variables and given/known data

Evaluate the integral [tex]\int_{-1}^{1} \frac{dx}{(a^2+x^2)\sqrt{1-x^2}}[/tex] using contour integration.

2. Relevant equations

Residue theorem / Cauchy integral formula.

3. The attempt at a solution

I know that the integrand has branch-point singularities at the endpoints of the interval (z = ±1, i.e. |z| = 1 on the real axis), so a branch cut from -1 to 1 is needed. In addition, the poles that the residue theorem picks up are at z = +ia and z = -ia.

After this I am confused. I can't find any resources that describe how to evaluate a contour integral when the endpoints of the integration interval are branch points. Any ideas?

My tentative plan is a contour that runs just above the real axis from -1 to +1, then heads up into the upper half-plane and loops counter-clockwise back toward -1, enclosing the singularity at z = +ia. Any thoughts?
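
Not part of the contour argument, but here is a quick numerical sanity check (pure Python, assuming a > 0) for whatever residue sum you end up with. The substitution x = sin θ removes the endpoint branch points, and the value it should match, if I haven't slipped, is π/(a√(a²+1)):

```python
import math

def integrand(theta, a):
    # After x = sin(theta): dx = cos(theta) dtheta and sqrt(1 - x^2) = cos(theta),
    # so the cosines cancel and the integrand becomes 1 / (a^2 + sin^2(theta)).
    return 1.0 / (a * a + math.sin(theta) ** 2)

def integral(a, n=200_000):
    # Midpoint rule on theta in [-pi/2, pi/2]; the transformed integrand is smooth.
    lo, hi = -math.pi / 2, math.pi / 2
    h = (hi - lo) / n
    return h * sum(integrand(lo + (k + 0.5) * h, a) for k in range(n))

a = 2.0
approx = integral(a)
exact = math.pi / (a * math.sqrt(a * a + 1))  # conjectured closed form, a > 0
print(approx, exact)
```

If the two printed numbers agree, the closed form is a good target for the residue calculation.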