Homework Statement
"A set ##A\subset [0,1]## is dense in ##[0,1]## iff every open interval that intersects ##[0,1]## contains ##x\in A##. Suppose ##f:[0,1]\rightarrow ℝ## is integrable and ##f(x) = 0,x\in A## with ##A## dense in ##[0,1]##. Show that ##\int_{0}^{1}f(x)dx=0##."
Homework Equations
Let ##P=\{x_0,x_1,...,x_n\}## with ##0=x_0<x_1<...<x_n=1## be a partition of ##[0,1]##.
##M_i=\sup\{f(x):x\in[x_{i-1},x_i]\}##
##m_i=\inf\{f(x):x\in[x_{i-1},x_i]\}##
##U(P,f)=\sum_{i=1}^nM_i(x_i-x_{i-1})##
##L(P,f)=\sum_{i=1}^nm_i(x_i-x_{i-1})##
Since ##f## is integrable, ##\int_{0}^{1} f\,dx=\sup\{L(P,f):P \text{ a partition of } [0,1]\}=\inf\{U(P,f):P \text{ a partition of } [0,1]\}##.
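For reference, the definitions above also give the standard sandwich (this is a general fact about Darboux sums, not something special to this problem): for every partition ##P##,
##L(P,f)\leq \int_{0}^{1} f\,dx \leq U(P,f)##.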
The Attempt at a Solution
I haven't actually attempted a solution yet; everything that follows is scratchwork...
Basically, I think I have to use the fact that ##A## is dense in ##[0,1]## to get the following implication:
##\int_{A}f(x)\,dx=0 \space ⇒ \int_{0}^{1} f(x)\,dx=0##,
and since ##f(x)=0## for all ##x\in A##, we have ##\int_{A} f(x)\,dx=0##.
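Spelling out what I think density actually buys here (my own guess at the key step, not something given in the problem): for any partition ##P##, each open interval ##(x_{i-1},x_i)## intersects ##[0,1]##, so by density it contains some ##a\in A## with ##f(a)=0##, and therefore
##m_i\leq f(a)=0\leq M_i## for every ##i##.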
So my plan is to construct a sequence of open intervals that intersect ##[0,1]##, like so:
##Q_i=(a_i,b_i)##, where ##a_i=b_{i-1}##.
I define the initial open interval ##Q_0=(a_0,b_0)## with ##a_0\leq 0## and ##b_0 > 0##, and the final open interval ##Q_n=(a_n,b_n)## with ##b_n\geq 1##.
Then I construct a partition from those open intervals by setting ##(x_{i-1},x_i)=(a_i,b_i)##, with ##x_0=0## and ##x_n=1##. From that partition I would form the lower and upper sums, which, by the conditions of the problem, must be equal to each other and equal to zero (I still need to figure this part out; see the sketch below).
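Here is a sketch of how I think that last step might go, assuming the density observation above: for every partition ##P##,
##L(P,f)=\sum_{i=1}^n m_i(x_i-x_{i-1})\leq 0\leq \sum_{i=1}^n M_i(x_i-x_{i-1})=U(P,f)##,
so ##\sup\{L(P,f)\}\leq 0\leq \inf\{U(P,f)\}##; since ##f## is integrable, both of these equal ##\int_{0}^{1}f\,dx##, which would force ##\int_{0}^{1}f(x)\,dx=0##.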
This is my current idea of how to handle the problem. Can anyone tell me if anything I have written is off? Thanks.