The following is an outgrowth of a problem I encountered in the Homework section concerning the convergence of:
\int_0^{\pi/2} \ln[\sin(x)]\,dx
I feel there is some way of separating the region around an asymptote, say x=b, into a region which yields a convergent integral and a region which does not. My initial suspicion is that the boundary is marked by the functions:
g_1(x)=\frac{1}{x-b}\quad\text{and}\quad g_2(x)=\frac{1}{b-x}
These are shown in red in the attached plot, along with functions in blue whose improper integrals converge and one in green whose integral does not.
I'm considering the following proposition:
Let f(x) be bounded on every closed subinterval [a,c] of [a,b) (so the only singularity is at b), with:
\lim_{x\to b^-} f(x)=\infty
Then:
\int_a^b f(x)\,dx
converges iff the following is true:
\exists x_c \in [a,b) : |f(x)|<\frac{1}{|x-b|} \quad\forall x\in [x_c,b)
That is, if f(x) ever strays onto or outside the region bounded by g_1 and g_2 and remains there as x goes to b, then the integral necessarily diverges; if it remains interior to the region, it converges.
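As a quick sanity check of my own against the original problem (there the asymptote sits at the left endpoint, so the roles of the bounds are mirrored): near x=0 we have \ln(\sin x)\approx\ln x, and |\ln x|<1/x on (0,1), so the criterion places the integrand strictly inside the region, and indeed
\int_0^1 \ln x\,dx = \lim_{\epsilon\to 0^+}\Big[x\ln x - x\Big]_\epsilon^1 = -1
converges.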
My initial thoughts are the integrals:
\int_0^1 \frac{1}{x^k}\,dx
which converge if k<1 and diverge otherwise.
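Spelling that out: for k\ne 1,
\int_0^1 \frac{1}{x^k}\,dx = \lim_{\epsilon\to 0^+}\left[\frac{x^{1-k}}{1-k}\right]_\epsilon^1 = \frac{1}{1-k}\quad (k<1),
while for k>1 the bracket blows up as \epsilon\to 0^+, and k=1 gives \lim_{\epsilon\to 0^+}(-\ln\epsilon)=\infty. These examples sit on either side of the proposed boundary: near 0, 1/x^k lies below 1/x when k<1 and on or above it when k\ge 1.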
Can such a general statement be made for arbitrary functions, no matter how complicated?
I'll attempt to prove it false by finding a counterexample, but I've found none so far.
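For anyone who wants to poke at candidates numerically, here is a rough sketch of my own (using SciPy; the function name is purely illustrative) that watches the partial integrals over [a+\epsilon, b] as \epsilon shrinks. A convergent integral should show the values settling toward a finite limit, while a divergent one drifts off without bound:

```python
# Rough numerical probe for an improper integral with a singularity at the
# LEFT endpoint a: integrate over [a + eps, b] for shrinking eps and watch
# whether the partial values settle (convergence) or drift off (divergence).
import numpy as np
from scipy.integrate import quad

def partial_integrals(f, a, b, n=8):
    """Return quad estimates of the integral of f over [a + eps, b]
    for eps = 10^-1, ..., 10^-n."""
    return [quad(f, a + 10.0**(-k), b)[0] for k in range(1, n + 1)]

# The motivating integral: ln(sin x) on (0, pi/2], singular at x = 0.
# Known to converge to -(pi/2) ln 2, roughly -1.0888.
print(partial_integrals(lambda x: np.log(np.sin(x)), 0.0, np.pi / 2))

# A divergent comparison sitting on the proposed boundary: 1/x on (0, 1].
# The partials grow like -ln(eps) instead of settling.
print(partial_integrals(lambda x: 1.0 / x, 0.0, 1.0))
```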
What do you guys think?