1. The problem statement, all variables and given/known data

Let [itex]I=[a,b][/itex], let [itex]f : I \to \mathbb{R}[/itex] be continuous, and suppose that [itex]f(x) \ge 0[/itex] for all [itex]x \in I[/itex]. If [itex]M = \sup\{f(x) : x \in I\}[/itex], show that the sequence $$\left( \int_a^b (f(x))^n \, dx \right)^\frac{1}{n}$$

converges to [itex]M[/itex].
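Not part of the proof, but a quick numerical sanity check of the claim (my own illustration, using [itex]f(x) = \sin x[/itex] on [itex][0, \pi][/itex], so [itex]M = 1[/itex]):

```python
import math

def integral_pow(f, a, b, n, steps=100_000):
    """Midpoint-rule approximation of the integral of f(x)**n over [a, b]."""
    h = (b - a) / steps
    return h * sum(f(a + (i + 0.5) * h) ** n for i in range(steps))

# Illustration: f(x) = sin(x) on [0, pi], where M = sup f = 1.
a, b = 0.0, math.pi
for n in (1, 5, 50, 500):
    root = integral_pow(math.sin, a, b, n) ** (1.0 / n)
    print(f"n = {n:4d}:  (integral of f^n)^(1/n) = {root:.4f}")
```

The printed values drift toward 1 as n grows, which is consistent with the statement to be proved.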

3. The attempt at a solution

Where do I start? I'm thinking of setting [tex] g_n = \left( \int_a^b (f(x))^n \, dx \right)^\frac{1}{n}[/tex] and showing that the sequence [itex](g_n)[/itex] converges (note each [itex]g_n[/itex] is a number, not a function of [itex]x[/itex], since [itex]x[/itex] has been integrated out), but that just feels like restating the problem.

If I can show that there exists [itex]x_0 \in I[/itex] such that [itex]|f(x_0)-M| < \frac{\varepsilon}{2}[/itex], then by continuity there is a [itex]\delta > 0[/itex] such that [itex]|x-x_0| < \delta[/itex] implies [itex]|f(x)-f(x_0)| < \frac{\varepsilon}{2}[/itex], and the triangle inequality then gives [itex]|f(x)-M| < \varepsilon[/itex] on that interval.

I still feel this gets me nowhere. Any ideas?
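For what it's worth, here is a sketch of how that [itex]\varepsilon[/itex]-idea can be turned into a squeeze bound (assuming [itex]M > 0[/itex]; if [itex]M = 0[/itex] then [itex]f \equiv 0[/itex] and the claim is immediate):

Upper bound: since [itex]0 \le f(x) \le M[/itex] on [itex]I[/itex],
[tex]\left( \int_a^b (f(x))^n \, dx \right)^\frac{1}{n} \le \left( M^n (b-a) \right)^\frac{1}{n} = M (b-a)^\frac{1}{n} \to M.[/tex]

Lower bound: given [itex]\varepsilon > 0[/itex], the continuity argument above gives an interval [itex]J \subseteq I[/itex] of some length [itex]\delta > 0[/itex] on which [itex]f(x) > M - \varepsilon[/itex], so
[tex]\left( \int_a^b (f(x))^n \, dx \right)^\frac{1}{n} \ge \left( (M-\varepsilon)^n \, \delta \right)^\frac{1}{n} = (M-\varepsilon)\, \delta^\frac{1}{n} \to M - \varepsilon.[/tex]

Both bounds use [itex]c^\frac{1}{n} \to 1[/itex] for any constant [itex]c > 0[/itex]. Since [itex]\varepsilon[/itex] was arbitrary, the limit is [itex]M[/itex].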

**Physics Forums | Science Articles, Homework Help, Discussion**


# Convergence of a sequence of integrals
