Is the Function g Continuous on [a,b]?

  • Context: Graduate
  • Thread starter: e(ho0n3
  • Tags: Extension, Function
Discussion Overview

The discussion revolves around the continuity of a function g defined in terms of another function f that is continuous on the interval (a,b]. Participants explore the conditions under which g is continuous on the closed interval [a,b], particularly focusing on the behavior of sequences converging to the point a.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested, Mathematical reasoning

Main Points Raised

  • Some participants propose that g is continuous at points s in [a,b] where s is not equal to a, using sequences to demonstrate convergence.
  • Others argue that the continuity of g at a depends on the existence of the limit of f as x approaches a, which must exist for g(a) to be defined.
  • There is a discussion about how to handle sequences that contain the point a, with some suggesting that if a appears infinitely often, it can be analyzed through subsequences.
  • One participant questions whether the convergence of g(x_n) to g(a) holds true for sequences in [a,b], leading to further clarification about subsequences and their behavior.
  • Another participant points out a potential misunderstanding regarding the notation and the convergence of sequences, emphasizing the need to clarify the definitions used.

Areas of Agreement / Disagreement

Participants express varying views on the continuity of g at the point a, with some asserting that the limit must exist for continuity to hold, while others explore the implications of sequences containing the point a. The discussion remains unresolved regarding the specific conditions under which g is continuous at a.

Contextual Notes

Limitations include assumptions about the existence of limits and the handling of sequences that may or may not contain the point a, which are not fully resolved in the discussion.

e(ho0n3
Suppose we have a function f from R to R that is continuous on (a,b]. Define g by g(x) = f(x) if x ≠ a and g(a) = lim f(x) as x approaches a. Is it true that g is continuous on [a,b]?

I would think it is, but I'm having a hard time proving it. I'm trying to use sequences to do this: Suppose {s_n} is a sequence in [a,b] that converges to s.

If s ≠ a, then {s_n} contains only finitely many a's, whence there exists an M such that s_n ≠ a for all n > M. Thus the tail of the sequence beyond M contains no a's, so g(s_n) = f(s_n) there, and {f(s_n)} converges to f(s) = g(s) by the continuity of f; i.e., for any e > 0 there is an N ≥ M such that |g(s_n) - g(s)| < e for all n > N. Thus g is continuous at every s ≠ a.

Now suppose s = a. If {s_n} contains finitely many a's, then I can use the same strategy as above to conclude that {g(s_n)} converges to g(a). If {s_n} contains infinitely many a's, then surely there's a subsequence {s_m} of {s_n} such that {g(s_m)} converges to g(a). How can I extend this so that {g(s_n)} itself converges to g(a)?
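A quick numerical sanity check may help here. The instance below is made up for illustration (a = 0, b = 1, f(x) = x·sin(1/x), which is continuous on (0, 1] with right-hand limit 0 at 0, so g(0) = 0); it proves nothing, but it shows the troublesome kind of sequence, one that hits a infinitely often:

```python
import math

# Hypothetical concrete instance of the setup: a = 0, b = 1.
# f(x) = x*sin(1/x) is continuous on (0, 1] and lim_{x->0+} f(x) = 0,
# so we define g(0) = 0 and g(x) = f(x) otherwise.
def f(x):
    return x * math.sin(1.0 / x)

def g(x):
    return 0.0 if x == 0 else f(x)

# A sequence in [0, 1] converging to 0 that equals a = 0 infinitely often:
# x_n = 0 for even n, x_n = 1/n for odd n.
def x(n):
    return 0.0 if n % 2 == 0 else 1.0 / n

# |g(x_n) - g(0)| should shrink toward 0 despite the interleaved 0 terms.
errors = [abs(g(x(n)) - g(0.0)) for n in range(1, 2000)]
print(max(errors[1000:]))  # the tail of the error sequence is small
```

The even-indexed terms contribute error exactly 0, and the odd-indexed terms are bounded by 1/n, which is why the tail shrinks.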
 
e(ho0n3 said:
Suppose we have a function f from R to R that is continuous on (a,b]. Define g by g(x) = f(x) if x ≠ a and g(a) = lim f(x) as x approaches a. Is it true that g is continuous on [a,b]?
Assuming that limit exists, g will be continuous from the right at x = a. Of course, you can't say anything about what happens when x < a.

I would think it is, but I'm having a hard time proving it. I'm trying to use sequences to do this: Suppose {s_n} is a sequence in [a,b] that converges to s.

If s ≠ a, then {s_n} contains only finitely many a's, whence there exists an M such that s_n ≠ a for all n > M. Thus the tail of the sequence beyond M contains no a's, so g(s_n) = f(s_n) there, and {f(s_n)} converges to f(s) = g(s) by the continuity of f; i.e., for any e > 0 there is an N ≥ M such that |g(s_n) - g(s)| < e for all n > N. Thus g is continuous at every s ≠ a.

Now suppose s = a. If {s_n} contains finitely many a's, then I can use the same strategy as above to conclude that {g(s_n)} converges to g(a). If {s_n} contains infinitely many a's, then surely there's a subsequence {s_m} of {s_n} such that {g(s_m)} converges to g(a). How can I extend this so that {g(s_n)} itself converges to g(a)?
Again, you never used the fact that, in order to define g(a) as \lim_{x\rightarrow a} f(x), that limit must exist. If {x_n} is a sequence of numbers in (a,b] converging to a, then \lim_{n\rightarrow \infty}g(x_n)= \lim_{n\rightarrow\infty}f(x_n) and, by definition, that last limit is g(a). That's all you need. I have no idea why you are even worrying about how many a's such a sequence must have.
 
HallsofIvy said:
Again, you never used the fact that, in order to define g(a) as \lim_{x\rightarrow a} f(x), that limit must exist.
Oops. I forgot to state that. Let's assume the limit exists.

If {x_n} is a sequence of numbers in (a,b] converging to a, then \lim_{n\rightarrow \infty}g(x_n)= \lim_{n\rightarrow\infty}f(x_n) and, by definition, that last limit is g(a).
But what happens if the sequence is in [a,b]? I want to state that if {x_n} is a sequence in [a,b] converging to a, then {g(x_n)} converges to g(a). Is this not true?
 
e(ho0n3 said:
Oops. I forgot to state that. Let's assume the limit exists.


But what happens if the sequence is in [a,b]? I want to state that if {x_n} is a sequence in [a,b] converging to a, then {g(x_n)} converges to g(a). Is this not true?

Just after I wrote that, it finally dawned on me that you were considering the case where the sequence contains a itself. No problem. If a occurs only a finite number of times, there is no difficulty: you can discard those terms. If a appears an infinite number of times, look at the two subsequences: the constant sequence of all g(a)'s and the sequence with the a's discarded. Obviously the first sequence converges to g(a), and then you can use the fact that, for the other sequence, g(x_n) = g(x_n) and argue as I said.
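The split described above can be sketched numerically. The instance is invented for illustration (a = 0, f(x) = x·sin(1/x) on (0, 1], so g(0) = 0): separate the terms equal to a from the rest and check each piece against g(a).

```python
import math

# Illustrative instance: a = 0, f(x) = x*sin(1/x) on (0, 1],
# g(0) defined as lim_{x->0+} f(x) = 0, g = f elsewhere.
def g(x):
    return 0.0 if x == 0 else x * math.sin(1.0 / x)

a, ga = 0.0, 0.0
# A sequence in [0, 1] converging to a that contains a infinitely often.
seq = [a if n % 2 == 0 else 1.0 / n for n in range(1, 4001)]

const_sub = [x for x in seq if x == a]   # the all-a subsequence
free_sub = [x for x in seq if x != a]    # the a-free subsequence

# On the constant subsequence g is exactly g(a); on the a-free one
# g agrees with f, and f(x_n) -> g(a) because g(a) was defined as
# that very limit.
print(all(g(x) == ga for x in const_sub))
print(max(abs(g(x) - ga) for x in free_sub[-100:]))  # small tail error
```

The tail of the a-free subsequence consists of values 1/n with n large, so its error is bounded by 1/n.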
 
HallsofIvy said:
If a appears an infinite number of times, look at the two subsequences: the constant sequence of all g(a)'s and the sequence with the a's discarded. Obviously the first sequence converges to g(a), and then you can use the fact that, for the other sequence, g(x_n) = g(x_n) and argue as I said.

I think you meant to write g(x_n) = f(x_n).

But what if the sequence with the a's discarded isn't an infinite sequence anymore? I guess I would argue as follows: if all but finitely many terms of {x_n} equal a, then {g(x_n)} is eventually constant at g(a), so it converges to g(a). If infinitely many terms differ from a, then I would argue as you said.
 
I think I misunderstood what you said. Are you saying that if {x_n} has infinitely many a's, then I can break it into two subsequences, one with all the a's, say {x_j}, and the other with none, say {x_k}? And that since both {g(x_j)} and {g(x_k)} converge to g(a), the original sequence {g(x_n)} must converge to g(a)?
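The merging step asked about here can be written out explicitly. A sketch, under the assumption that the two subsequences together exhaust every index of {x_n}:

```latex
\textbf{Claim.} If the indices of $\{x_n\}$ split into two subsequences
$\{x_j\}$ and $\{x_k\}$ with $g(x_j) \to L$ and $g(x_k) \to L$,
then $g(x_n) \to L$.

\textbf{Sketch.} Fix $\varepsilon > 0$. Since each subsequence converges
to $L$, only finitely many of its terms satisfy
$|g(\cdot) - L| \ge \varepsilon$. Every term of $\{x_n\}$ belongs to one
subsequence or the other, so only finitely many $n$ have
$|g(x_n) - L| \ge \varepsilon$. Taking $N$ beyond all of them gives
$|g(x_n) - L| < \varepsilon$ for all $n > N$; hence $g(x_n) \to L$.
```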
 
Nevermind. I've figured it out. Thanks for the help.
 
