I understand why, if the evaluated integral converges, then the series also converges. But just because the evaluated integral diverges, why does this automatically mean that the related series also diverges?

The integral takes in every point of the positive x-axis when evaluating f(x), the function whose convergence or divergence is to be determined. So if the "sum" of f(x) over every single point/interval on the positive x-axis converges, it seems obvious that the series would converge too, because the series sums up far fewer values of f(x), even though the integrand and the terms of the series follow the same trend, i.e. both converge (though not necessarily to the same value).

However, for the integral test, when there exists a function f(x) whose evaluated integral turns out to diverge, why does this automatically mean that the series (whose sum consists of far fewer f(x) values) also diverges?
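The key is that the comparison also runs the other way: for a positive, decreasing f, each term f(n) is at least the area under f on [n, n+1], so the partial sum dominates the integral. The sketch below (my own illustration, assuming f(x) = 1/x) shows the harmonic series staying above ln(N+1), which grows without bound, forcing the series to diverge.

```python
import math

# For positive, decreasing f, the term f(n) covers the area under f on
# [n, n+1], so:  sum_{n=1}^N f(n) >= integral_1^{N+1} f(x) dx.
# With f(x) = 1/x that integral is ln(N+1), which is unbounded, so the
# partial sums are unbounded too and the harmonic series diverges.
for N in (10, 1_000, 100_000):
    partial_sum = sum(1.0 / n for n in range(1, N + 1))
    lower_bound = math.log(N + 1)  # integral of 1/x from 1 to N+1
    print(N, partial_sum, lower_bound)  # partial_sum >= lower_bound
```

So divergence of the integral pushes the series up from below: the series has "fewer" terms, but each term overshoots the slice of integral it sits above.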

**Physics Forums | Science Articles, Homework Help, Discussion**


# Calculus II: Please explain the integral test
