Moving limits in and out of functions

Only a Mirage
When is the following equivalence valid?

$$\lim_{x \to a} f(g(x)) = f(\lim_{x \to a} g(x))$$

I was told that continuity of f is key here, but I'm not positive.

This question comes up, for instance, in one proof showing that the limit definition of the number ##e## is equivalent to defining ##e## via the inverse of the natural logarithm.
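To make that concrete, I believe the relevant step is something like this (assuming ##e## is defined by ##e = \lim_{n \to \infty} (1 + \tfrac{1}{n})^n##), where the limit gets pulled inside ##\ln##:

$$\ln e = \ln\!\left(\lim_{n \to \infty} \left(1 + \tfrac{1}{n}\right)^n\right) = \lim_{n \to \infty} \ln\!\left(1 + \tfrac{1}{n}\right)^n = \lim_{n \to \infty} \frac{\ln\!\left(1 + \tfrac{1}{n}\right) - \ln 1}{1/n} = \left.\frac{d}{dx}\ln x\right|_{x=1} = 1,$$

so ##e## is the number whose natural logarithm is ##1##. The second equality is exactly the swap I'm asking about, with ##f = \ln##.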
 
Only a Mirage said:
I was told that continuity of f is key here, but I'm not positive.
That's the key. Try comparing ##\lim_{x \to 0} f(g(x))## against ##f(\lim_{x \to 0} g(x))## with ##g(x)=x## and ##f(x)=0\,\forall\,x\ne 0,\, f(0)=1##.
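Spelling that out: since ##f(x) = 0## for every ##x \ne 0## and the limit never looks at the value at ##0##,

$$\lim_{x \to 0} f(g(x)) = \lim_{x \to 0} f(x) = 0, \qquad \text{while} \qquad f\!\left(\lim_{x \to 0} g(x)\right) = f(0) = 1,$$

so the two sides disagree precisely because ##f## is not continuous at ##0##.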
 
Thanks for the specific example.

Can you prove (or point me to a proof of) the general case?
 
That's one very reasonable definition of continuity.
 
Only a Mirage said:
Thanks for the specific example.

Can you prove (or point me to a proof of) the general case?

If you are using the ##\epsilon##-##\delta## definition of continuity, then the idea is that eventually ##x_n## will be within ##\delta## of ##x##, and so ##f(x_n)## will be within ##\epsilon## of ##f(x)##. But this is exactly what it means for ##f(x_n)## to converge to ##f(x)##.
 
Robert1986 said:
If you are using the ##\epsilon##-##\delta## definition of continuity, then the idea is that eventually ##x_n## will be within ##\delta## of ##x##, and so ##f(x_n)## will be within ##\epsilon## of ##f(x)##. But this is exactly what it means for ##f(x_n)## to converge to ##f(x)##.

What exactly do you mean by the sequence ##x_n## here?
 
economicsnerd said:
That's one very reasonable definition of continuity.

Interesting. But how would you show that this definition is equivalent to, for example, the epsilon-delta definition?
 
If ##\lim_{x\to a} g(x)## exists and ##f## is continuous at that limit, then the statement is true. If ##\lim_{x\to a} g(x)## does not exist, then the right-hand side does not make sense as written, so the statement cannot be true; and if ##f## is not continuous there, then the statement can fail by DH's example.

As for showing that ##\lim_{x\to a} g(x) = L## implies ##\lim_{x\to a} f(g(x)) = f(L)## when ##f## is continuous at ##L##, you should just slam it with epsilons and deltas until it works - I don't think there's a particularly clever trick.
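Roughly, I'd expect it to go something like this (just chaining the two definitions, assuming ##L = \lim_{x\to a} g(x)## exists and ##f## is continuous at ##L##): let ##\epsilon > 0##. Continuity of ##f## at ##L## gives a ##\delta_1 > 0## such that ##|f(y) - f(L)| < \epsilon## whenever ##|y - L| < \delta_1##. Since ##\lim_{x\to a} g(x) = L##, there is a ##\delta > 0## such that ##|g(x) - L| < \delta_1## whenever ##0 < |x - a| < \delta##. Putting the two together, ##0 < |x - a| < \delta## implies

$$|f(g(x)) - f(L)| < \epsilon,$$

which is exactly the statement ##\lim_{x\to a} f(g(x)) = f(L)##.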
 
Only a Mirage said:
Interesting. But how would you show that this definition is equivalent to, for example, the epsilon-delta definition?
Suppose that ##f## is continuous at ##x## in the epsilon-delta sense. Let ##\epsilon > 0##. Then there is a ##\delta > 0## such that ##|f(y) - f(x)| < \epsilon## for all ##y## satisfying ##|y - x| < \delta##. Let ##(x_n)## be a sequence converging to ##x##. Then there is an ##N## such that ##|x_n - x| < \delta## for all ##n > N##. Thus for all ##n > N## we have ##|f(x_n) - f(x)| < \epsilon##. We can do this for any ##\epsilon > 0##, so this means that ##f(x_n) \rightarrow f(x)##.

Conversely, suppose that ##f(x_n) \rightarrow f(x)## for any sequence ##(x_n)## such that ##x_n \rightarrow x##. Let ##\epsilon > 0##. We claim that there is a ##\delta > 0## such that ##|f(y) - f(x)| < \epsilon## whenever ##|y - x| < \delta##. Suppose this were not the case. Then it must be true that for every ##\delta > 0##, there is some ##y## satisfying ##|y - x| < \delta## but ##|f(y) - f(x)| \geq \epsilon##. Let ##(\delta_n)## be any sequence of positive numbers converging to zero. Then we can find a sequence ##(x_n)## satisfying ##|x_n - x| < \delta_n## and ##|f(x_n) - f(x)| \geq \epsilon##. The conditions ##|x_n - x| < \delta_n## and ##\delta_n \rightarrow 0## imply that ##x_n \rightarrow x##, so our hypothesis implies that ##f(x_n) \rightarrow f(x)##. But this contradicts ##|f(x_n) - f(x)| \geq \epsilon##.
 
jbunniii said:
Suppose that ##f## is continuous at ##x## in the epsilon-delta sense. Let ##\epsilon > 0##. Then there is a ##\delta > 0## such that ##|f(y) - f(x)| < \epsilon## for all ##y## satisfying ##|y - x| < \delta##. Let ##(x_n)## be a sequence converging to ##x##. Then there is an ##N## such that ##|x_n - x| < \delta## for all ##n > N##. Thus for all ##n > N## we have ##|f(x_n) - f(x)| < \epsilon##. We can do this for any ##\epsilon > 0##, so this means that ##f(x_n) \rightarrow f(x)##.

Conversely, suppose that ##f(x_n) \rightarrow f(x)## for any sequence ##(x_n)## such that ##x_n \rightarrow x##. Let ##\epsilon > 0##. We claim that there is a ##\delta > 0## such that ##|f(y) - f(x)| < \epsilon## whenever ##|y - x| < \delta##. Suppose this were not the case. Then it must be true that for every ##\delta > 0##, there is some ##y## satisfying ##|y - x| < \delta## but ##|f(y) - f(x)| \geq \epsilon##. Let ##(\delta_n)## be any sequence of positive numbers converging to zero. Then we can find a sequence ##(x_n)## satisfying ##|x_n - x| < \delta_n## and ##|f(x_n) - f(x)| \geq \epsilon##. The conditions ##|x_n - x| < \delta_n## and ##\delta_n \rightarrow 0## imply that ##x_n \rightarrow x##, so our hypothesis implies that ##f(x_n) \rightarrow f(x)##. But this contradicts ##|f(x_n) - f(x)| \geq \epsilon##.

Ahh... Thank you for the detailed proof! I was able to use the ideas from your proof to prove the result in my original post (basically the exact same proof, but mine involved a function ##g(x)##, whereas yours involved the sequence ##\{x_n\}##).

Anyway, thanks again! And thanks to everyone else :)
 