# Improper integral convergence proof

Apr 25, 2010

### Hoblitz

1. The problem statement, all variables and given/known data
Let $[a, b)$ be an interval in the reals, with $-\infty < a < b \leq \infty$, and let $\alpha: [a,b) \to \mathbb{R}$ be monotone increasing. Suppose that $f: [a,b) \to \mathbb{R}$ is a function such that for each $c \in (a,b)$, $f$ is integrable over $[a,c ]$ with respect to $\alpha$ (in particular f is bounded on every compact subinterval of $[a, b)$, although f may not be bounded on all of $[a, b)$).

Now for the actual problem....

Prove that if $F(x) = \int_a^x f(t)\, d\alpha(t)$ is a bounded function $[a, b) \to \mathbb{R}$ and $g: [a, b) \to \mathbb{R}$ is non-negative valued, monotone decreasing, integrable with respect to $\alpha$ over each interval $[a, c]$ with $a < c < b$, and $\lim_{x \to b^{-}} g(x) = 0$, then

$$\int_a^b f(x)g(x)\, d\alpha$$

converges.

2. Relevant equations

3. The attempt at a solution
I've tried taking a uniform bound $M$ on $|F|$ and choosing $x_0$ such that $x_0 < x < b$ implies $0 \leq g(x) < \frac{\epsilon}{M}$. However, I can't find a way to bring $g(x)$ "inside" the integral, so to speak. My intuition is that at worst $F$ oscillates, and multiplying by $g$ brings it under control enough to force convergence, much like $\int_1^\infty \frac{\sin x}{x}\, dx$. But I just can't get any of these ideas to work, and I'm really stuck.
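For what it's worth, here is how far the bound-on-$F$ idea seems to go if one is allowed an integration-by-parts formula for Riemann–Stieltjes integrals (this is only a sketch, and it assumes $\int g\, dF = \int f g\, d\alpha$ holds under the problem's hypotheses, which would need to be justified separately). Checking the Cauchy criterion on an interval $[c_1, c_2]$ with $x_0 < c_1 < c_2 < b$:

$$\int_{c_1}^{c_2} f g\, d\alpha = \int_{c_1}^{c_2} g\, dF = g(c_2)F(c_2) - g(c_1)F(c_1) - \int_{c_1}^{c_2} F\, dg,$$

and since $g$ is monotone decreasing, the total variation of $g$ on $[c_1, c_2]$ is $g(c_1) - g(c_2)$, so with $|F| \leq M$:

$$\left| \int_{c_1}^{c_2} f g\, d\alpha \right| \leq M g(c_2) + M g(c_1) + M\big(g(c_1) - g(c_2)\big) = 2M g(c_1).$$

Choosing $x_0$ so that $g(x) < \frac{\epsilon}{2M}$ for $x > x_0$ would then make the right side less than $\epsilon$, which is the Cauchy criterion for convergence of the improper integral.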

