Differentiation and Lebesgue integration

AxiomOfChoice

Homework Statement


Suppose ##g(x) = \int_0^x f(t)\, dt##, where ##f## is Lebesgue integrable on ##\mathbb{R}##. Give an ##\epsilon##-##\delta## proof that ##g'(y) = f(y)## if ##y \in (0,\infty)## is a point of continuity of ##f##.


Homework Equations





The Attempt at a Solution


I know I need to show that
$$f(y) = \lim_{h\to 0} \frac{1}{h} \int_y^{y+h} f(t)\, dt.$$
My idea was to try to do this in terms of sequences; i.e., to let ##\{h_n\}## be any sequence of real numbers such that ##h_n \to 0##, and then to phrase the limit above as a limit as ##n \to \infty##. I had then planned to use something like the dominated convergence theorem. But I don't have any idea how to make use of the hypothesis that ##f## is continuous at ##y##, so I'm not sure if this is the right approach.
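Not a proof, of course, but before chasing the ##\epsilon##-##\delta## argument you can sanity-check the statement numerically. Here is a minimal sketch (the test function ##f## and the point ##y = 1.0## are my own choices, not from the problem): the average ##\frac{1}{h}\int_y^{y+h} f## should approach ##f(y)## as ##h \to 0##.

```python
import numpy as np

# Example f: Lebesgue integrable on R (the Gaussian factor keeps the
# integral over R finite) and continuous everywhere, in particular at y.
def f(t):
    return np.sin(t) * np.exp(-t**2)

y = 1.0  # point of continuity where we test g'(y) = f(y)

for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    t = np.linspace(y, y + h, 1001)
    # (1/h) * integral from y to y+h of f(t) dt, via the trapezoid rule
    avg = np.trapz(f(t), t) / h
    print(f"h = {h:.0e}: average = {avg:.8f}, f(y) = {f(y):.8f}")
```

The printed averages should settle down to ##f(y)## as ##h## shrinks, which is exactly the limit above.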
 
Could you look at it as follows:
$$f(y) = \lim_{h\to 0} \frac{1}{h} \int_y^{y+h} f(t)\, dt,$$
then use the continuity of ##f## to show that as ##h## gets small,
$$\int_y^{y+h} f(t)\, dt \approx f(y)\, h.$$
 
Ok. I think I know how to do this. Here is a complete solution: Let ##f \in L^1(\mathbb{R})##. We want to show that, given any ##\epsilon > 0##, we can find a ##\delta > 0## such that
$$\left| \frac{1}{h} \int_y^{y+h} f - f(y) \right| < \epsilon$$
whenever ##|h| < \delta##. Since ##f## is continuous at ##y##, we know we can find a ##\delta## such that ##|t - y| < \delta## implies ##|f(t) - f(y)| < \epsilon##. So, if we make ##|h| < \delta##, we have
$$\begin{align*} \left| \frac{1}{h} \int_y^{y+h} f - f(y) \frac{1}{h} \int_y^{y+h} f \right| &\leq \frac{1}{h} \int_y^{y+h} \left| f(t) - f(y) \right| dt \\ &\leq \frac{1}{h} \int_y^{y+h} \epsilon \, dt = \epsilon. \end{align*}$$

That does it, I think.
 
There is a typo; you should write:
$$\left| \frac{1}{h} \int_y^{y+h} f - f(y) \right| = \left| \frac{1}{h} \int_{y}^{y+h} f - f(y) \int_{y}^{y+h} \frac{1}{h}\, dt \right|$$
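With that typo fixed, the whole estimate chains together as below. (Writing ##\frac{1}{|h|}## so that negative ##h## is covered too is my addition; the posts above just write ##\frac{1}{h}##.)
$$\begin{align*} \left| \frac{1}{h} \int_y^{y+h} f(t)\, dt - f(y) \right| &= \left| \frac{1}{h} \int_y^{y+h} \bigl( f(t) - f(y) \bigr)\, dt \right| \\ &\leq \frac{1}{|h|} \left| \int_y^{y+h} \left| f(t) - f(y) \right| dt \right| \\ &\leq \frac{1}{|h|} \cdot \epsilon\, |h| = \epsilon. \end{align*}$$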
 