To show that an integral is divergent

longrob
Hi all

I am looking for a simple way to show that the mean of the Cauchy distribution is undefined. This is because this integral diverges:
\int_{-\infty}^{\infty}\frac{x}{x^{2}+a^{2}}\,dx
Now, I know one proof which replaces the limits of integration with -x_1 and x_2. After carrying out the definite integration we are left with \frac{1}{2}\ln\left(\frac{a^{2}+x_{2}^{2}}{a^{2}+x_{1}^{2}}\right), and finally (by series expansion for large x_1, x_2) \ln x_{2}-\ln x_{1}+\text{smaller terms}. Then, since x_1 and x_2 must approach infinity independently and \ln x_{2}-\ln x_{1} has no limit under those conditions, the integral diverges.
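For reference, here is that computation written out in full (a sketch of the standard argument, using the same a, x_1, x_2 as above):

\int_{-x_1}^{x_2}\frac{x}{x^{2}+a^{2}}\,dx
  = \left[\tfrac{1}{2}\ln\left(x^{2}+a^{2}\right)\right]_{-x_1}^{x_2}
  = \frac{1}{2}\ln\!\left(\frac{a^{2}+x_{2}^{2}}{a^{2}+x_{1}^{2}}\right)
% For large x, write \ln(a^2+x^2) = 2\ln x + \ln(1+a^2/x^2) and
% expand \ln(1+u) = u - u^2/2 + \cdots :
  = \ln x_{2} - \ln x_{1} + \frac{a^{2}}{2x_{2}^{2}} - \frac{a^{2}}{2x_{1}^{2}} + \cdots
% Taking x_1 = x_2 gives 0, while taking x_2 = e^{c}x_1 gives c, so
% the double limit does not exist and the improper integral diverges.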

My question is now: is it sufficient, on any level, just to look at the antiderivative \frac{1}{2}\ln(a^{2}+x^{2}), state that it is an increasing function of x, and simply conclude from it that the integral diverges?

Thanks
LR
 
The antiderivative increasing is definitely not enough. If you're integrating ANY positive function you'll get an increasing antiderivative, but there are plenty of such functions which are integrable. What forces divergence here is that the antiderivative is unbounded, not merely increasing.
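As a concrete counterexample, consider the standard one: 1/(1+x^{2}) is positive and its antiderivative \arctan x is increasing, yet the integral converges:

\int_{-\infty}^{\infty}\frac{dx}{1+x^{2}}
  = \lim_{x_1,\,x_2\to\infty}\bigl[\arctan x\bigr]_{-x_1}^{x_2}
  = \frac{\pi}{2}-\left(-\frac{\pi}{2}\right) = \pi
% \arctan x is increasing but bounded, so the limit exists; by contrast
% \tfrac{1}{2}\ln(a^{2}+x^{2}) is increasing and unbounded, which is
% what makes the Cauchy integral diverge.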
 
Thanks for your reply. That's what I thought. So I assume the easiest/typical way is the one I described above?
 