
Homework Help: Question on uniform convergence

  1. Aug 19, 2007 #1

    Simfish


    5: Let p > 0. Let [tex]f_n (x) = x n^p e^{-nx}[/tex]

    (i) For what values of p is [tex]f_n (x) \to 0[/tex] uniformly on [0,1]?
    (ii) For what values of p is it true that [tex]\lim_{n \to \infty} \int_0^1 f_n (x) \, dx = 0[/tex]?

    For (i), uniform convergence means that the N in the definition depends ONLY on [tex]\epsilon[/tex], not on x, so [tex]f_n[/tex] has to be pushed to 0 at the same rate for every x in [0,1]. My guess is that this happens when p < 1, since then (as the maximum computed below shows) [tex]f_n[/tex] can be made arbitrarily small on all of [0,1] as [tex]n \to \infty[/tex].
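    To make the criterion I'm using explicit (this is just the standard sup-norm characterization, taken as known):

    [tex]
    f_n \to 0 \text{ uniformly on } [0,1] \iff \sup_{x \in [0,1]} |f_n(x)| \to 0 \text{ as } n \to \infty,
    [/tex]

    so it's enough to track the maximum of [tex]f_n[/tex] on [0,1].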

    Let's try (i) first.

    We want to find the maximum value of [tex] f_n (x)[/tex]. So...
    [tex]
    \begin{align}
    f_n'(x) &= - x n^{p+1} e^{-nx} + n^p e^{-nx} = 0 \\
    n^{p} e^{-nx} \left( 1 - nx \right) &= 0 \\
    1 - nx &= 0 \\
    x &= 1/n
    \end{align}
    [/tex]

    Since [tex]f_n'(x) > 0[/tex] for [tex]x < 1/n[/tex] and [tex]f_n'(x) < 0[/tex] for [tex]x > 1/n[/tex], this critical point is where the maximum on [0,1] occurs (for [tex]n \ge 1[/tex]). Plugging [tex]x = 1/n[/tex] back in, the value of the maximum is

    [tex]
    f_n(1/n) = \frac{1}{n} \, n^p \, e^{-1} = \frac{n^{p-1}}{e},
    [/tex]

    which goes to 0, and hence gives uniform convergence on [0,1], only for p < 1.
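    Spelling out that last step with the sup criterion from above (my own elaboration):

    [tex]
    \sup_{x \in [0,1]} f_n(x) = \frac{n^{p-1}}{e} \to 0 \iff p - 1 < 0 \iff p < 1.
    [/tex]

    In the boundary case p = 1 the supremum is the constant 1/e for every n, so the convergence is not uniform there, and for p > 1 the supremum blows up.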


    But what of (ii)? Integration by parts.

    [tex]
    \int_0^1 x n^p e^{-nx} \, dx = n^p \left[ \frac{-x e^{-nx}}{n} - \frac{e^{-nx}}{n^2} \right]_{0}^{1} = n^p \left[ \frac{1}{n^2} - \frac{e^{-n}}{n} - \frac{e^{-n}}{n^2} \right]
    [/tex]

    ...which would imply that the limit of the integral is 0 for all p < 2. Am I wrong anywhere? Someone else pointed out that the problem assumes p > 0, but this argument seems to work just as well for negative values of p.
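    To make the cutoff at p = 2 explicit (just expanding the bracket above):

    [tex]
    \int_0^1 f_n(x) \, dx = n^{p-2} - n^{p-1} e^{-n} - n^{p-2} e^{-n},
    [/tex]

    and the two terms containing [tex]e^{-n}[/tex] go to 0 for every p, so the limit is 0 exactly when [tex]n^{p-2} \to 0[/tex], i.e. when p < 2.

    As a quick numerical sanity check (a throwaway sketch of my own, not part of the assignment; the function names sup_fn and integral_fn are just made up here), the closed forms above can be tabulated:

    [code]
    import numpy as np

    def sup_fn(n, p):
        # sup of f_n(x) = x n^p e^{-nx} on [0,1]; attained at x = 1/n for n >= 1
        return n ** (p - 1) / np.e

    def integral_fn(n, p):
        # closed form of the integral from the integration by parts above
        return n ** p * (1.0 / n ** 2 - np.exp(-n) / n - np.exp(-n) / n ** 2)

    for p in (0.5, 1.0, 1.5, 2.0):
        for n in (10, 100, 1000):
            print(f"p={p}, n={n}: sup={sup_fn(n, p):.3e}, integral={integral_fn(n, p):.3e}")
    [/code]

    For p = 1.5 the sup grows while the integral still shrinks, which matches the two different cutoffs.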
     
    Last edited: Aug 19, 2007