Theorem: Suppose f is bounded on [a,b], f has only finitely many points of discontinuity on [a,b], and α is continuous at every point at which f is discontinuous. Then f is in R(α).

Proof: Let ε > 0 be given. Put M = sup |f(x)|, and let E be the set of points at which f is discontinuous. Since E is finite and α is continuous at every point of E, we can cover E by finitely many disjoint intervals [u_j, v_j] in [a,b] such that the sum of the corresponding differences α(v_j) - α(u_j) is less than ε.

Why is this true? I understand that we can cover E by finitely many disjoint intervals, but I don't see why we can choose them so that the sum of the differences α(v_j) - α(u_j) comes out less than ε. Any help would be appreciated.
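
For reference, here is a sketch of the continuity argument that seems to be intended (this assumes, as in Rudin's setting, that α is monotonically increasing on [a,b]; the names E = {e_1, ..., e_k}, η, and δ_i are my own, not the book's):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Write $E=\{e_1,\dots,e_k\}$ and fix $\eta=\varepsilon/(2k)$. Since $\alpha$
is continuous at each $e_i$, there is $\delta_i>0$ such that
\[
  |\alpha(x)-\alpha(e_i)|<\eta \quad\text{whenever } |x-e_i|<\delta_i .
\]
Choose $u_i<e_i<v_i$ with $[u_i,v_i]\subset(e_i-\delta_i,\,e_i+\delta_i)$;
then
\[
  \alpha(v_i)-\alpha(u_i)
    \le |\alpha(v_i)-\alpha(e_i)|+|\alpha(e_i)-\alpha(u_i)|
    < 2\eta = \varepsilon/k ,
\]
and summing over the $k$ intervals gives
\[
  \sum_{i=1}^{k}\bigl(\alpha(v_i)-\alpha(u_i)\bigr)<\varepsilon .
\]
Since $E$ is finite, the $\delta_i$ can be shrunk so that the intervals
$[u_i,v_i]$ are pairwise disjoint; shrinking an interval only decreases
$\alpha(v_i)-\alpha(u_i)$, because $\alpha$ is monotonically increasing.
\end{document}
```

If some e_i equals a or b, one takes a one-sided interval instead, and the same estimate applies. The point is that the bound ε/k per interval is available precisely because continuity of α at e_i lets us make α(v_i) - α(u_i) as small as we like, and there are only finitely many points to handle.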