I came across a problem in Spivak's *Calculus*, and I'm having trouble formalizing the proof.

Let [tex]A_{n}[/tex] be a finite set of numbers in [0,1], and suppose that if [tex]m \neq n[/tex] then [tex]A_{n}[/tex] and [tex]A_{m}[/tex] are disjoint. Define f(x) by

f(x)=1/n if x is in [tex]A_{n}[/tex], and f(x)=0 if x is not in any [tex]A_{n}[/tex]. The problem asks to prove that [tex]\lim_{x\to a}f(x)=0[/tex] for every a in [0,1].

Here is my idea: for a fixed n, [tex]A_{n}[/tex] is finite, so only finitely many of its elements lie in any neighborhood of a. Let [tex]n_{0}[/tex] be the smallest n such that [tex]A_{n}[/tex] meets the punctured neighborhood; then [tex]f(x)\leq1/n_{0}[/tex] for every x ≠ a in it. Now shrink the neighborhood so that it avoids all of the (finitely many) elements of [tex]A_{n_{0}}[/tex]. The new minimal index n is strictly larger, so [tex]f(x)\leq1/n<1/n_{0}[/tex] on the smaller neighborhood. Repeating this, the bound on f tends to 0 as the neighborhood shrinks.
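To sanity-check this shrinking-neighborhood bound numerically, here is a sketch using one concrete choice of the sets (not from the Spivak problem itself): take [tex]A_{n}[/tex] to be the reduced fractions in [0,1] with denominator n, so f is Thomae's function. The name `thomae_sup` and the cutoff `qmax` are my own; the search over denominators up to `qmax` is a finite check, not a proof.

```python
from fractions import Fraction
from math import gcd

def thomae_sup(a, delta, qmax=200):
    """Max of f over the punctured interval (a - delta, a + delta), excluding a,
    where A_n = {reduced p/n in [0, 1]} and f(p/n) = 1/n (Thomae's function).
    Only denominators up to qmax are examined."""
    best = Fraction(0)
    for q in range(1, qmax + 1):
        for p in range(q + 1):
            if gcd(p, q) != 1:       # keep only reduced fractions p/q
                continue
            x = Fraction(p, q)
            if x != a and abs(x - a) < delta:
                best = max(best, Fraction(1, q))
    return best

a = Fraction(1, 2)
print(thomae_sup(a, Fraction(1, 10)))    # 1/7  (attained at 3/7)
print(thomae_sup(a, Fraction(1, 100)))   # 1/51 (attained at 25/51)
```

As delta shrinks, every small denominator near a gets excluded and the maximum of f drops, which is exactly the bound the argument above produces.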

I don't know how to prove this using the limit definition; can someone help me out with what [tex]\delta[/tex] to choose?

**Physics Forums - The Fusion of Science and Community**

# Limit proof
