Is Zero Outer Content Sufficient for Function Integrability?

STEMucator
Homework Helper

Homework Statement



1. Suppose that ##f = 0## at all points of a rectangle ##R## except on a set ##D## of outer content zero, where ##f \geq 0##. If ##f## is bounded, prove that ##f## is integrable on ##R## and ##\int \int f dA = 0##.

2. Now suppose ##f## is an integrable function on a rectangle ##R## and let ##g## be bounded on ##R##. If ##f = g## except for ##(x,y)## in some set of outer content zero where ##f \leq g##, prove that ##g## is integrable on ##R## and ##\int \int f dA = \int \int g dA##.

Homework Equations





The Attempt at a Solution



1. Suppose ##0 \leq f \leq M## on ##R## for some ##M > 0## (possible since ##f## is bounded and nonnegative). Let ##\epsilon > 0##. Since ##D## has outer content zero, we can cover ##D## with finitely many rectangles of total area less than ##\frac{\epsilon}{M}##, and refine them into a partition ##P## of ##R##. If ##A_P## denotes the total area of the subrectangles of ##P## that meet ##D##, then:

$$A_P < \frac{\epsilon}{M}$$

Then we have:

$$S_P = \sum_{i} M_i \Delta A_i \leq M A_P < \epsilon$$

This is because ##M_i = 0## on the subrectangles that do not meet ##D##, and ##M_i \leq M## on those that do. Since ##\epsilon > 0## was arbitrary, the upper integral satisfies ##\inf_P(S_P) \leq 0##.

Since ##f \geq 0##, every lower sum satisfies ##s_P \geq 0##, so ##\sup_P(s_P) \geq 0##. Combining this with the general fact ##\sup_P(s_P) \leq \inf_P(S_P)##, we get ##0 \leq \sup_P(s_P) \leq \inf_P(S_P) \leq 0##, so ##\sup_P(s_P) = \inf_P(S_P) = 0##.

Hence it must be the case that ##f## is integrable, since the lower and upper integrals agree, and ##\int \int f dA = 0##.

I think that should be it for the first question, hopefully.
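
As a sanity check of my own (not part of the proof), here's a quick numerical sketch in Python with a concrete example I picked myself: ##R = [0,1]^2## and ##f = 1## on the segment ##D = \{y = 1/2\}## (outer content zero), ##f = 0## elsewhere, so ##M = 1##. The function name `upper_sum` and this choice of ##f## are just for illustration.

```python
# Numerical sketch: f = 0 on R = [0,1]^2 except on D = {y = 1/2}, where f = 1.
# On a uniform n x n partition, M_i = 1 only on the row(s) of subrectangles
# that meet y = 1/2; their total area A_P is at most 2/n.

def upper_sum(n):
    """Upper Riemann sum S_P of f on the uniform n x n partition of [0,1]^2."""
    h = 1.0 / n
    S = 0.0
    for i in range(n):
        y0, y1 = i * h, (i + 1) * h
        if y0 <= 0.5 <= y1:       # this row of subrectangles meets D
            S += 1.0 * (1.0 * h)  # M_i = 1 times the row's area (width 1, height h)
    return S

for n in (2, 5, 10, 100, 1000):
    print(n, upper_sum(n))        # S_P <= 2/n -> 0, so inf_P(S_P) = 0
```

The printed upper sums shrink like ##2/n##, which matches the bound ##S_P \leq M A_P## in the proof.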

2. Well, we have ##g - f = 0## on ##R## except on a set of outer content zero, where ##g - f \geq 0##. Moreover, ##g - f## is bounded, since ##g## is bounded and ##f##, being integrable, is bounded. So by the first question, ##g - f## is integrable on ##R## and ##\int \int (g - f) dA = 0##.

Hence ##g = (g - f) + f## is integrable, being the sum of two integrable functions. So we can finally write:

$$\int \int g dA = \int \int (g - f) dA + \int \int f dA = 0 + \int \int f dA = \int \int f dA$$

Need a sanity check to make sure this isn't wrong. It's been a long while since I had to think about a proof this hard.
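
And a similar numerical sketch for the second part, again with an example of my own choosing: ##f = 1## on ##R = [0,1]^2## (so ##\int \int f dA = 1##), and ##g = f## except ##g = 6## on the diagonal ##x = y##, a set of outer content zero where ##f \leq g##. The function name `upper_sum_g_minus_f` is just for illustration.

```python
# Numerical sketch: the upper sum of g - f only picks up the cells that meet
# the diagonal x = y. The cell [ih,(i+1)h] x [jh,(j+1)h] meets x = y exactly
# when |i - j| <= 1, so at most 3n of the n^2 cells contribute, with total
# area at most 3/n.

def upper_sum_g_minus_f(n):
    """Upper Riemann sum of g - f on the uniform n x n partition of [0,1]^2."""
    h = 1.0 / n
    cells = sum(1 for i in range(n) for j in range(n) if abs(i - j) <= 1)
    return 5.0 * cells * h * h    # sup of g - f on a cell meeting the diagonal is 5

for n in (4, 16, 64, 256):
    print(n, upper_sum_g_minus_f(n))  # <= 15/n -> 0, so the two integrals agree
```

The upper sums of ##g - f## vanish in the limit, consistent with ##\int \int g dA = \int \int f dA##.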
 