To show that an integral is divergent

  • Context: Graduate
  • Thread starter: longrob
  • Tags: Divergent Integral
SUMMARY

The discussion centers on demonstrating the divergence of the integral that defines the mean of the Cauchy distribution, ∫ x / (x² + a²) dx taken over (−∞, ∞). The user proposes concluding divergence from the fact that the antiderivative ½ ln(a² + x²) is an increasing function, but is informed that this approach is insufficient: every positive integrand has an increasing antiderivative, whether or not its integral converges. The correct method evaluates the integral over finite limits and applies a Taylor series expansion, showing that the result grows without bound as the limits of integration tend to infinity.

PREREQUISITES
  • Understanding of Cauchy distribution properties
  • Knowledge of definite integrals and limits
  • Familiarity with Taylor Series expansion
  • Basic concepts of increasing functions and their implications in calculus
NEXT STEPS
  • Study the properties of the Cauchy distribution and its implications in statistics
  • Learn about evaluating improper integrals and their convergence criteria
  • Explore Taylor Series and their applications in calculus
  • Investigate the behavior of increasing functions in the context of integration
USEFUL FOR

Mathematicians, statisticians, and students studying calculus or probability theory, particularly those interested in the properties of the Cauchy distribution and integral divergence.

longrob
Hi all

I am looking for a simple way to show that the mean of the Cauchy distribution is undefined. This is because the following integral diverges:
$$\int_{-\infty}^{\infty}\frac{x}{x^{2}+a^{2}}\,dx$$
Now, I know one proof which replaces the limits of integration with $-x_{1}$ and $x_{2}$. After carrying out the definite integration we are left with $\frac{1}{2}\ln\left(\frac{a^{2}+x_{2}^{2}}{a^{2}+x_{1}^{2}}\right)$ and finally (by Taylor series expansion) $\ln x_{2}-\ln x_{1}$ plus smaller terms. Then allowing $x_{1}$ and $x_{2}$ to approach infinity independently shows that the integral diverges.
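Spelled out, the expansion step goes like this (a sketch, valid for $x_{1},x_{2}\gg a$): for large $x$,

$$\ln\left(a^{2}+x^{2}\right)=\ln\left(x^{2}\left(1+\frac{a^{2}}{x^{2}}\right)\right)=2\ln x+\frac{a^{2}}{x^{2}}+O\left(x^{-4}\right),$$

so

$$\frac{1}{2}\ln\left(\frac{a^{2}+x_{2}^{2}}{a^{2}+x_{1}^{2}}\right)=\ln x_{2}-\ln x_{1}+\frac{a^{2}}{2x_{2}^{2}}-\frac{a^{2}}{2x_{1}^{2}}+O\left(x_{1}^{-4}+x_{2}^{-4}\right).$$

Holding $x_{1}$ fixed and letting $x_{2}\to\infty$ gives $+\infty$, while holding $x_{2}$ fixed and letting $x_{1}\to\infty$ gives $-\infty$, so the value depends on how the limits are taken and the improper integral diverges.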

My question is now: is it sufficient, on any level, just to look at the antiderivative $\frac{1}{2}\ln(a^{2}+x^{2})$, state that it is an increasing function of $x$, and simply conclude from that alone that the integral diverges?

Thanks
LR
 
The antiderivative being increasing is definitely not enough. If you're integrating ANY positive function you'll get an increasing antiderivative, but there are plenty of such functions whose integrals converge.
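A standard counterexample makes the point concrete (not from the thread, just for illustration):

$$\int_{-\infty}^{\infty}\frac{dx}{1+x^{2}}=\Big[\arctan x\Big]_{-\infty}^{\infty}=\frac{\pi}{2}-\left(-\frac{\pi}{2}\right)=\pi,$$

even though the antiderivative $\arctan x$ is strictly increasing on all of $\mathbb{R}$. What forces divergence is the antiderivative being unbounded, not merely increasing.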
 
Thanks for your reply. That's what I thought. So I assume the easiest/typical way is the one I described above?
 
