Let $f(x)$ be continuously differentiable on $[0,\infty)$ such that the derivative $f'(x)$ is bounded. Suppose that the integral $\int_{a}^{\infty}|f(x)|\,dx$ converges. Prove that $f(x)\to 0$ as $x\to\infty$.
Now, here's what I did:
Since $f'(x)$ is bounded, there is some $M>0$ such that $|f'(x)|\le M$ for every $x\ge 0$.
Because the integral $\int_{a}^{\infty}|f(x)|\,dx$ converges for some $a>0$, the Cauchy criterion gives: for every $\varepsilon>0$ there exists $B$ such that for all $b_1>b_2>B>0$, $\int_{b_2}^{b_1}|f(x)|\,dx<\varepsilon$. Now we can use the inequality $\left|\int_{b_2}^{b_1}f(x)\,dx\right|\le\int_{b_2}^{b_1}|f(x)|\,dx<\varepsilon$.
The crucial point is that I need to show that for every $\varepsilon>0$ there exists $M'>0$ such that $|f(x)|<\varepsilon$ for every $x>M'$.
I think I can use the mean value theorem for integrals here: there exists a point $c\in(b_2,b_1)$ such that $\int_{b_2}^{b_1}f(t)\,dt=f(c)(b_1-b_2)$, and since $b_2>B$ this point satisfies $c>B$, so $|f(c)|(b_1-b_2)<\varepsilon$, i.e. $|f(c)|<\varepsilon/(b_1-b_2)$. But this doesn't work, because $b_1$ and $b_2$ aren't constants. So perhaps here I should use that the derivative is bounded: by Lagrange's mean value theorem, $|f(b_1)-f(b_2)|\le M(b_1-b_2)$. Still, I don't see how to connect these two theorems to prove the statement.
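To make the missing link concrete, here is the kind of estimate I suspect is needed (just a sketch of the idea, and the half-width $\varepsilon/(2M)$ is my own guess, not something given in the problem): if $|f(x_0)|\ge\varepsilon$ at some point $x_0$, the Lagrange bound gives
$$|f(x)| \;\ge\; |f(x_0)| - M|x-x_0| \;\ge\; \frac{\varepsilon}{2} \qquad\text{for } |x-x_0|\le\frac{\varepsilon}{2M},$$
and hence
$$\int_{x_0-\varepsilon/(2M)}^{x_0+\varepsilon/(2M)}|f(x)|\,dx \;\ge\; \frac{\varepsilon}{2}\cdot\frac{\varepsilon}{M} \;=\; \frac{\varepsilon^2}{2M},$$
which looks like it should contradict the Cauchy criterion if this happened at arbitrarily large $x_0$. But I'm not sure this is the intended route.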
Any hints?