In a book of mine, the author proves that a Riemann sum on an interval [a, b] must converge by showing that for S_m and S_n (m > n), where the span of the subdivisions is sufficiently small,

|S_m - S_n| < e(b - a)

where e can be made arbitrarily small, depending on the span.

Now I understand why S_m has to be bounded, but I do not see an argument strong enough for convergence: couldn't S_m keep oscillating between lower and higher values within some interval? That would still seem to satisfy the inequality above. What am I missing?
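To see the inequality in action numerically, here is a quick sketch (the function x^2 on [0, 1] and the left-endpoint rule are my own choices for illustration, not from the book). It computes Riemann sums for successively finer subdivisions and prints the differences |S_m - S_n|, which shrink as the mesh does:

```python
def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum of f on [a, b] with n equal subintervals."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

# Example: f(x) = x^2 on [0, 1]; the exact integral is 1/3.
f = lambda x: x * x
sums = {n: riemann_sum(f, 0.0, 1.0, n) for n in (10, 100, 1000, 10000)}

# Successive differences |S_m - S_n| shrink as the subdivisions are
# refined -- the Cauchy-type behaviour the inequality describes.
for n, m in [(10, 100), (100, 1000), (1000, 10000)]:
    print(n, m, abs(sums[m] - sums[n]))
```

The point of the book's argument is precisely that this shrinking of |S_m - S_n| (the Cauchy criterion) rules out persistent oscillation: once e(b - a) is smaller than the width of any band the sums allegedly bounce around in, they can no longer bounce that far.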

**Physics Forums | Science Articles, Homework Help, Discussion**


# Convergence of Riemann sums
