
## Main Question or Discussion Point

Theorem: Suppose f is bounded on [a,b], f has only finitely many points of discontinuity on [a,b], and α is continuous at every point at which f is discontinuous. Then f is in R(α).

Proof: Let ε>0 be given. Put M = sup|f(x)|, and let E be the set of points at which f is discontinuous.

__Since E is finite and α is continuous at every point of E, we can cover E by finitely many disjoint intervals [u_{j}, v_{j}] ⊂ [a,b] such that the sum of the corresponding differences α(v_{j}) − α(u_{j}) is less than ε.__

Why is this true? I understand that we can cover E by those finitely many disjoint intervals, but I don't understand why we can choose them so that the sum of the corresponding differences α(v_{j}) − α(u_{j}) is less than ε.

Any help would be appreciated.
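For context, here is one sketch of the standard argument (my own fill-in, not from Rudin's text), assuming as in Rudin's setting that α is monotonically increasing and writing E = {x_1, …, x_k}:

```latex
% Sketch: assumes \alpha monotonically increasing on [a,b] and E = \{x_1,\dots,x_k\}.
% Since \alpha is continuous at each x_i, choose \delta_i > 0 such that
%   |\alpha(t) - \alpha(x_i)| < \varepsilon/(2k) \quad \text{whenever } |t - x_i| < \delta_i.
% Shrinking the \delta_i if necessary, pick disjoint intervals
% [u_i, v_i] \subset [a,b] with x_i \in [u_i, v_i] and v_i - u_i < \delta_i,
% so that both u_i and v_i lie within \delta_i of x_i. Then
\alpha(v_i) - \alpha(u_i)
  \le |\alpha(v_i) - \alpha(x_i)| + |\alpha(x_i) - \alpha(u_i)|
  < \frac{\varepsilon}{2k} + \frac{\varepsilon}{2k}
  = \frac{\varepsilon}{k},
% and summing over the k intervals gives
\sum_{i=1}^{k} \bigl(\alpha(v_i) - \alpha(u_i)\bigr) < \varepsilon.
```

The key point is that finiteness of E lets us split the budget ε evenly among the k points, and continuity of α at each x_i lets us make each individual increment α(v_i) − α(u_i) as small as we like.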