Can improper integrals converge without absolute convergence?

  • Context: MHB
  • Thread starter: alyafey22
  • Tags: Comparison, Integrals
SUMMARY

Improper integrals can converge without absolute convergence, as demonstrated by the integral $$\int_{0}^{1} \frac{\sin \frac {1}{t}}{t}\ dt$$. The triangle inequality establishes that $$\big | \int^{b}_{a} f(t) \, dt \big | \leq \int^{b}_{a} |f(t)|\, dt$$ for Riemann integrable functions on the interval $[a, b]$. However, an improper integral can converge while the integral of its absolute value diverges, the integral analogue of conditional convergence for series.

PREREQUISITES
  • Understanding of Riemann integrability
  • Familiarity with improper integrals
  • Knowledge of the triangle inequality in analysis
  • Basic concepts of limits in calculus
NEXT STEPS
  • Study the properties of improper integrals in detail
  • Explore examples of functions that demonstrate convergence without absolute convergence
  • Learn about the implications of the triangle inequality in different contexts
  • Investigate the relationship between absolute convergence and conditional convergence in integrals
USEFUL FOR

Mathematicians, calculus students, and anyone studying real analysis who seeks to understand the convergence behavior of improper integrals.

alyafey22
I know we have the following

$$ \big | \int^{b}_{a} f(t) \, dt \big | \leq \int^{b}_{a} |f(t)|\, dt$$

1- How do we prove the inequality, and what are the conditions?
2- Does it work for improper integrals?
 
ZaidAlyafey said:
I know we have the following

$$ \big | \int^{b}_{a} f(t) \, dt \big | \leq \int^{b}_{a} |f(t)|\, dt$$

1- How do we prove the inequality, and what are the conditions?

If $f$ is Riemann integrable on $[a,b]$, then the integral is... $$\int_{a}^{b} f(t)\ dt = \lim_{n \rightarrow \infty,\ \max \Delta t_{i} \rightarrow 0} \sum_{i=0}^{n-1} f(t_{i})\ \Delta t_{i}\ (1)$$

For any finite sum we have...

$$ |\sum_{i=0}^{n-1} a_{i}| \le \sum_{i=0}^{n-1} |a_{i}|\ (2)$$

... and that proves item 1...

Kind regards

$\chi$ $\sigma$
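To make the limiting argument concrete, here is a minimal numerical sketch comparing a left Riemann sum of a sign-changing integrand with the corresponding sum for its absolute value; the integrand $f(t) = \sin(5t)$ and the interval $[0, 2]$ are arbitrary choices for illustration.

```python
import math

def riemann_sums(f, a, b, n=10_000):
    """Left Riemann sums of f and |f| on [a, b] with n equal subintervals."""
    dt = (b - a) / n
    s = sum(f(a + i * dt) for i in range(n)) * dt           # approximates the integral of f
    s_abs = sum(abs(f(a + i * dt)) for i in range(n)) * dt  # approximates the integral of |f|
    return s, s_abs

# Arbitrary sign-changing integrand, purely for illustration.
f = lambda t: math.sin(5 * t)

s, s_abs = riemann_sums(f, 0.0, 2.0)
print(f"|integral of f| ~ {abs(s):.4f}")   # exact value is |(1 - cos 10)/5| ~ 0.3678
print(f"integral of |f| ~ {s_abs:.4f}")
assert abs(s) <= s_abs  # inequality (2) applied to the terms f(t_i) dt
```

The assertion is just inequality (2) applied to the terms $f(t_i)\,\Delta t_i$; refining the partition does not disturb it, which is the content of the argument above.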
 
chisigma said:
$$ |\sum_{i=0}^{n-1} a_{i}| \le \sum_{i=0}^{n-1} |a_{i}|\ (2)$$

That is the triangle inequality applied to the elements of the sequence, right?
 
ZaidAlyafey said:
That is the triangle inequality applied to the elements of the sequence, right?

The so-called 'triangle inequality' holds in general for vectors or complex numbers and establishes that...

$$ |\sum_{i=1}^{n} a_{i}| \le \sum_{i=1}^{n} |a_{i}|\ (1) $$

See here...

Triangle Inequality -- from Wolfram MathWorld

Kind regards

$\chi$ $\sigma$
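As a quick concrete check of the general inequality, a pair of complex numbers suffices (the specific values are arbitrary):

```python
# Triangle inequality |a + b| <= |a| + |b| for complex numbers;
# induction extends it to any finite sum, as in (1).
a, b = 3 + 4j, -1 + 2j
print(abs(a + b), "<=", abs(a) + abs(b))   # 6.324... <= 7.236...
assert abs(a + b) <= abs(a) + abs(b)
```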
 
And since the metric $d(x,y)=|x-y|$ is uniformly continuous, you are justified in passing to the limit, which carries the inequality over to the integrals.
 
Here's how I remember the outline of the proof.

If $f(x)$ and $g(x)$ are integrable on $[a,b]$ (by which I mean Riemann integrable) and $f(x) \le g(x)$ for all $x \in [a,b]$, then $\int_{a}^{b} f(x) \ dx \le \int_{a}^{b} g(x) \ dx$.

Now if $f(x)$ is integrable on $[a,b]$, so is $|f(x)|$.

And $-|f(x)| \le f(x) \le |f(x)|$ for all $ x \in [a,b]$.

So $- \int_{a}^{b} |f(x)| \ dx \le \int_{a}^{b} f(x) \ dx \le \int_{a}^{b} |f(x)| \ dx $, and since $|y| \le c$ is equivalent to $-c \le y \le c$, this is exactly $\big| \int_{a}^{b} f(x) \ dx \big| \le \int_{a}^{b} |f(x)| \ dx$.
 
ZaidAlyafey said:
I know we have the following

$$ \big | \int^{b}_{a} f(t) \, dt \big | \leq \int^{b}_{a} |f(t)|\, dt$$

1- How do we prove the inequality, and what are the conditions?
2- Does it work for improper integrals?

The answer to point 2 is slightly more complex. If we consider an improper integral on $(a,b)$ where $a$ is a singularity of $f$, then we mean...

$$\int_{a}^{b} f(t)\ dt = \lim_{x \rightarrow a+} \int_{x}^{b} f(t)\ dt\ (1)$$

The problem in such a case is that it can happen that $\int_{a}^{b} f(t)\ dt$ converges while $\int_{a}^{b} |f(t)|\ dt$ diverges. An interesting example of such a case is...

$$\int_{0}^{1} \frac{\sin \frac {1}{t}}{t}\ dt\ (2)$$

Kind regards

$\chi$ $\sigma$
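A numerical illustration of (2), as a sketch with arbitrary implementation choices: the substitution $u = 1/t$ gives $\int_{x}^{1} \frac{\sin(1/t)}{t}\, dt = \int_{1}^{1/x} \frac{\sin u}{u}\, du$, so the improper integral converges exactly when $\int_{1}^{\infty} \frac{\sin u}{u}\, du$ does. The sketch below integrates both $\frac{\sin u}{u}$ and $\frac{|\sin u|}{u}$ up to increasing cutoffs $M$; Simpson's rule and the chunk length $\pi$ are just convenient choices.

```python
import math

def simpson(f, a, b, n=32):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    total = f(a) + f(b)
    total += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    total += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return total * h / 3

def integrate_from_1(f, M):
    """Integrate f from 1 to M in chunks of length pi, matching the oscillation."""
    total, a = 0.0, 1.0
    while a < M:
        b = min(a + math.pi, M)
        total += simpson(f, a, b)
        a = b
    return total

g = lambda u: math.sin(u) / u           # substituted integrand
g_abs = lambda u: abs(math.sin(u)) / u  # its absolute value

for M in (10, 100, 1000, 10000):
    print(f"M = {M:>6}: signed = {integrate_from_1(g, M): .4f}, "
          f"absolute = {integrate_from_1(g_abs, M): .4f}")
```

The signed integral stabilizes (the limit is $\frac{\pi}{2} - \mathrm{Si}(1) \approx 0.6247$), while the absolute one keeps growing like $\frac{2}{\pi} \ln M$, which is exactly the behavior claimed for (2).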
 
chisigma said:
The answer to point 2 is slightly more complex. If we consider an improper integral on $(a,b)$ where $a$ is a singularity of $f$, then we mean...

$$\int_{a}^{b} f(t)\ dt = \lim_{x \rightarrow a+} \int_{x}^{b} f(t)\ dt\ (1)$$

The problem in such a case is that it can happen that $\int_{a}^{b} f(t)\ dt$ converges while $\int_{a}^{b} |f(t)|\ dt$ diverges. An interesting example of such a case is...

$$\int_{0}^{1} \frac{\sin \frac {1}{t}}{t}\ dt\ (2)$$

Kind regards

$\chi$ $\sigma$

This is just like absolute convergence for series. If the integral is absolutely convergent, then it is convergent. If it is 'absolutely divergent', then the integral may or may not converge; when it does, that is the analogue of conditional convergence for series.
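The parallel can be seen numerically with the classic series example: the alternating harmonic series converges (to $\ln 2$) while the harmonic series diverges. A minimal sketch:

```python
import math

# sum (-1)^(n+1)/n converges to ln 2, but sum 1/n diverges:
# convergence without absolute convergence, as for the integral above.
for N in (10**3, 10**4, 10**5):
    s = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
    s_abs = sum(1.0 / n for n in range(1, N + 1))
    print(f"N = {N:>6}: alternating = {s:.6f} (ln 2 = {math.log(2):.6f}), "
          f"harmonic = {s_abs:.3f}")
```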
 
