# Convergence of a sequence of integrals

1. Feb 24, 2012

### Locoism

1. The problem statement, all variables and given/known data

Let $I=[a,b]$, let $f : I \to \mathbb{R}$ be continuous, and suppose that $f(x) \ge 0$. If $M = \sup\{f(x) : x \in I\}$, show that the sequence $$\left( \int_a^b (f(x))^n \, dx \right)^\frac{1}{n}$$
converges to $M$.

3. The attempt at a solution

Where do I start? I'm thinking of setting $$g_n = \left( \int_a^b (f(x))^n \, dx \right)^\frac{1}{n}$$ and showing that the sequence $(g_n)$ converges, but that just feels like restating the problem. (Each $g_n$ is a number, not a function of $x$, so uniform convergence doesn't really apply.)

I can show that there exists $x_0$ such that $|f(x_0)-M| < \frac{\varepsilon}{2}$, and by continuity, if $|x-x_0| < \delta$ then $|f(x)-f(x_0)| < \frac{\varepsilon}{2}$; the triangle inequality then gives $|f(x)-M| < \varepsilon$ on that interval.

I still feel this gets me nowhere. Any ideas?
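As a quick numerical sanity check (not a proof), here is a sketch with an assumed concrete example: $f(x)=\sin x$ on $[0,\pi]$, so $M = 1$. The $n$-th roots of the integrals should creep toward $M$ as $n$ grows.

```python
import numpy as np

# Assumed example: f(x) = sin(x) on [0, pi], so M = sup f = 1.
a, b = 0.0, np.pi
x = np.linspace(a, b, 200_001)
f = np.sin(x)

for n in (1, 10, 100, 1000):
    # Riemann-sum approximation of the integral of f(x)^n over [a, b]
    integral = np.sum(f**n) * (b - a) / x.size
    print(n, integral ** (1.0 / n))  # tends to M = 1 as n grows
```

Note that early terms can overshoot $M$ (for $n=1$ the value is $\int_0^\pi \sin x \, dx = 2$), which is consistent with the trivial upper bound $M\,(b-a)^{1/n}$.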

2. Feb 24, 2012

### kai_sikorski

Well, you can trivially bound it from above, so just work on bounding it from below. Can you find an interval of some length $\delta > 0$ on which $f(x) > M(1-\varepsilon)$? If so, $$\int_a^b f(x)^n \, dx > \delta\, M^n (1-\varepsilon)^n.$$ What happens to this bound when you raise it to the power $1/n$?
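Putting the two bounds together, a sketch of the squeeze argument the hint suggests (with $\delta$ the length of the interval on which $f > M(1-\varepsilon)$):

$$M(1-\varepsilon)\,\delta^{1/n} \;\le\; \left( \int_a^b f(x)^n \, dx \right)^{1/n} \;\le\; M\,(b-a)^{1/n}.$$

Since $\delta^{1/n} \to 1$ and $(b-a)^{1/n} \to 1$ as $n \to \infty$, the limit inferior of the sequence is at least $M(1-\varepsilon)$ and the limit superior is at most $M$; letting $\varepsilon \to 0$ gives convergence to $M$.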