The following is an outgrowth of a problem I encountered in the Homework section concerning the convergence of:
[tex]\int_0^{\pi/2} \ln[\sin(x)]\,dx[/tex]
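As a quick numerical sanity check (my own sketch, plain Python, using a composite midpoint rule so the integrand is never evaluated at the singular endpoint x = 0), the value agrees with the known closed form [itex]-(\pi/2)\ln 2[/itex]:

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    """Composite midpoint rule; never evaluates f at the endpoints,
    so an integrable endpoint singularity causes no trouble."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

approx = midpoint_integral(lambda x: math.log(math.sin(x)), 0.0, math.pi / 2)
exact = -(math.pi / 2) * math.log(2)  # known closed form of the integral
print(approx, exact)  # both approximately -1.0888
```

So the integral is finite even though the integrand blows up (to negative infinity) at x = 0.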
I feel there is some way of separating the region around an asymptote, say x = b, into a region which yields a convergent integral and one which does not. My initial suspicion is that the boundary is marked by the functions:
[tex]g_1(x)=\frac{1}{x-b}\quad\text{and}\quad g_2(x)=\frac{1}{b-x}[/tex]
These are shown in red in the attached plot, along with functions in blue which have convergent improper integrals and one in green which does not.
I'm considering the following proposition:
Let f(x) be continuous on the interval [a,b) with:
[tex]\lim\limits_{x\to b^-} |f(x)|=\infty[/tex]
Then:
[tex]\int_a^b f(x)dx[/tex]
converges iff the following is true:
[tex]\exists x_c \in [a,b) : |f(x)|<\frac{1}{|x-b|} \quad\forall x\in [x_c,b)[/tex]
That is, if f(x) ever strays onto or outside the region bounded by g_1 and g_2 and remains there as x goes to b, then the integral necessarily diverges. If it eventually remains interior to the region, it converges.
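To put concrete numbers to the intuition (a minimal sketch with test functions of my own choosing, taking b = 1): [itex]f(x)=1/\sqrt{1-x}[/itex] lies strictly inside the boundary [itex]1/(1-x)[/itex] for x in (0,1) and its improper integral converges to 2, while [itex]f(x)=2/(1-x)[/itex] lies outside it and its truncated integrals grow without bound. Evaluating the antiderivatives at a shrinking cutoff shows both behaviors:

```python
import math

b = 1.0
# Truncated integrals over [0, b - eps], computed from antiderivatives:
#   ∫ dx / sqrt(1-x) = -2*sqrt(1-x)   (inside the boundary near b)
#   ∫ 2 dx / (1-x)   = -2*ln(1-x)     (outside the boundary near b)
for eps in (1e-2, 1e-4, 1e-6, 1e-8):
    inside = 2.0 - 2.0 * math.sqrt(eps)   # approaches 2 as eps -> 0
    outside = -2.0 * math.log(eps)        # grows without bound
    print(f"eps={eps:.0e}  inside={inside:.6f}  outside={outside:.2f}")
```

The "inside" column stabilizes while the "outside" column keeps climbing, exactly the dichotomy the proposition describes for these two examples.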
My initial thoughts are the integrals:
[tex]\int_0^1 \frac{1}{x^k}dx[/tex]
which converge if k < 1 and diverge otherwise; the borderline case k = 1 is exactly the boundary function 1/x, and it diverges.
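This family can be checked directly by evaluating the antiderivative at a shrinking lower cutoff ε (a plain-Python sketch of my own, not part of the argument):

```python
import math

def truncated(k, eps):
    """∫_eps^1 x^(-k) dx, evaluated via the antiderivative."""
    if k == 1.0:
        return -math.log(eps)                       # diverges as eps -> 0
    return (1.0 - eps ** (1.0 - k)) / (1.0 - k)     # finite limit iff k < 1

for k in (0.5, 0.9, 1.0, 1.5):
    vals = ["%.4f" % truncated(k, eps) for eps in (1e-3, 1e-6, 1e-9)]
    limit = "-> 1/(1-k) = %.4f" % (1.0 / (1.0 - k)) if k < 1 else "-> infinity"
    print(f"k={k}: {vals}  {limit}")
```

For k < 1 the truncated values settle toward 1/(1-k); for k ≥ 1 they grow without bound as the cutoff shrinks.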
Can such a general statement be made for arbitrary functions, no matter how complex?
I'll attempt to prove it false by finding a counterexample, but I've found none so far.
What do you guys think?