Dell said:
How do I integrate 1/x 1/(1+x^2) dx to prove that the p = 2 case diverges? I tried integration by parts, but that doesn't really help. How did you do it?
Partial fractions:
1/x 1/(1+x^2) = A/x + (C + D x)/(1+x^2)
C = 0 because the left-hand side is an odd function: A/x and D x/(1+x^2) are odd, but C/(1+x^2) is even, so its coefficient must vanish. The decomposition reduces to:
1/x 1/(1+x^2) = A/x + D x/(1+x^2)
The coefficient A can be found by multiplying both sides by x and taking the limit x to zero. This gives A = 1. We don't need to compute D; we just need that:
1/x 1/(1+x^2) = 1/x + a term proportional to x/(1+x^2)
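If you want to check this, here is a quick sketch using sympy (assuming a Python environment with sympy available); it confirms A = 1, C = 0, and, as it happens, D = -1:

```python
import sympy as sp

x = sp.symbols('x')
f = 1 / (x * (1 + x**2))

# Full partial fraction decomposition: prints 1/x - x/(x**2 + 1),
# i.e. A = 1, C = 0, D = -1.
print(sp.apart(f, x))

# A alone, via the limit argument: multiply by x and let x -> 0.
print(sp.limit(x * f, x, 0))  # prints 1
```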
The last term can be integrated from zero to 1; it does not diverge. The first term, integrated from epsilon to 1, yields a log(epsilon) term, and the limit as epsilon goes to zero does not exist, so the integral diverges.
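Written out with a lower cutoff epsilon > 0 (using D = -1 from the decomposition above):

```latex
\int_\varepsilon^1 \frac{dx}{x(1+x^2)}
  = \int_\varepsilon^1 \frac{dx}{x}
  - \int_\varepsilon^1 \frac{x\,dx}{1+x^2}
  = -\ln\varepsilon - \frac{1}{2}\ln\frac{2}{1+\varepsilon^2}
```

As epsilon goes to zero, the subtracted piece tends to the finite constant (1/2) log 2, while -log(epsilon) grows without bound, which is the claimed divergence.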
Note that it is essential in this argument to consider both terms and to show that the second term does converge. The reason is that singularities can sometimes cancel. E.g., the integral of x from zero to 1 clearly exists. But if some mathematical manipulation leads to x being written as:
1/x + (x - 1/x)
and you were to say that this is not convergent because the first term diverges when integrated from zero to 1, then that's clearly wrong. The second term also diverges, and in this case the two divergences cancel, so the sum does converge.
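The cancellation is easy to see with the same cutoff:

```latex
\int_\varepsilon^1 \frac{dx}{x} = -\ln\varepsilon,
\qquad
\int_\varepsilon^1 \left(x - \frac{1}{x}\right) dx
  = \frac{1-\varepsilon^2}{2} + \ln\varepsilon
```

Each piece alone diverges as epsilon goes to zero, but the logarithms cancel in the sum, which tends to 1/2, the value of the integral of x from zero to 1.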