Why is a finite sub-cover necessary for proving continuity implies boundedness?

Summary:
The discussion centers on proving that a continuous function on a closed interval is bounded, emphasizing the necessity of a finite sub-cover in this proof. It references the Heine-Borel theorem, which guarantees that every open cover of a compact set has a finite sub-cover. The need for this finite sub-cover arises because an infinite collection of bounds may not yield a maximum, as the bounds could increase indefinitely. A counterexample is provided with a function defined on a half-open interval, demonstrating that without compactness, the proof fails due to the absence of a finite sub-cover. Thus, the finite sub-cover is essential for establishing that the function is bounded on the interval.
Oats
1. The problem statement:
Let ##f:[a, b] \rightarrow \mathbb{R}##. Prove that if ##f## is continuous, then ##f## is bounded.

2. Relevant Information
This is the previous exercise.
Let ##A \subseteq \mathbb{R}##, let ##f: A \rightarrow \mathbb{R}##, and let ##c \in A##. Prove that if ##f## is continuous at ##c##, then there is some ##\delta > 0## such that ##f|A \cap (c - \delta, c + \delta)## is bounded.
I have already proved this result, and the book states to use it to prove the next exercise. It also hints to use the Heine-Borel theorem.
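For reference, one standard argument for the previous exercise (a sketch, assuming the usual ##\varepsilon##-##\delta## definition of continuity; not necessarily the book's intended proof): take ##\varepsilon = 1## in the definition of continuity at ##c##, so there is ##\delta > 0## with ##|f(x) - f(c)| < 1## for all ##x \in A \cap (c - \delta, c + \delta)##. For every such ##x##, $$|f(x)| \le |f(x) - f(c)| + |f(c)| < 1 + |f(c)|,$$ so ##f|A \cap (c - \delta, c + \delta)## is bounded by ##1 + |f(c)|##.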

3. The Attempt at a Solution
Since ##f## is continuous, it is continuous at each ##c \in [a, b]##. By the previous exercise, for each ##c \in [a, b]## there is ##\delta_c > 0## such that ##f|[a, b] \cap (c - \delta_c, c + \delta_c)## is bounded, say by ##K_c##. Since ##c \in (c - \delta_c, c + \delta_c)## for each ##c \in [a, b]##, the collection ##\{(i - \delta_i, i + \delta_i)\}_{i \in [a, b]}## forms an open cover of ##[a, b]##. By the Heine-Borel theorem, this cover has a finite subcover. That is, there exist ##n \in \mathbb{N}## and ##q_1, \ldots, q_n \in [a, b]## for which ##(q_1 - \delta_{q_1}, q_1 + \delta_{q_1}), \ldots, (q_n - \delta_{q_n}, q_n + \delta_{q_n})## form an open cover of ##[a, b]##, and ##f## restricted to each of these intervals (intersected with ##[a, b]##) is bounded by ##K_{q_1}, \ldots, K_{q_n}## respectively. Now take ##K = \max\{K_{q_1}, \ldots, K_{q_n}\}##. Let ##x \in [a, b]##. Then ##x \in (q_h - \delta_{q_h}, q_h + \delta_{q_h})## for some ##h \in \{1, \ldots, n\}##, so that ##|f(x)| \leq K_{q_h} \leq K##. Hence, ##f## is bounded by ##K## on ##[a, b]##.
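As a purely numerical illustration of the mechanics (not part of the proof, and using an arbitrarily chosen function), the sketch below carries out the subcover extraction for ##f(x) = x^2## on ##[0, 2]##. From the ##\varepsilon = 1## argument one may take ##\delta_c = \min(1, 1/(2c+1))## and ##K_c = c^2 + 1##, since ##|x^2 - c^2| = |x - c|\,|x + c| < \delta_c (2c + 1) \le 1## on ##(c - \delta_c, c + \delta_c)##. A left-to-right sweep then picks finitely many centers and takes the maximum of their local bounds:

```python
# Illustration only: explicit finite subcover for f(x) = x^2 on [0, 2].
a, b = 0.0, 2.0

def delta(c):
    # radius from the epsilon = 1 continuity argument at c
    return min(1.0, 1.0 / (2.0 * c + 1.0))

def K(c):
    # local bound: |f(x)| <= |f(c)| + 1 on (c - delta(c), c + delta(c))
    return c * c + 1.0

centers = []
c = a
while True:
    centers.append(c)
    if c + delta(c) > b:      # this interval already reaches past b
        break
    c = c + delta(c)          # the next center lies inside its own interval

bounds = [K(c) for c in centers]
print("finite subcover uses", len(centers), "intervals")
print("global bound K =", max(bounds))   # max exists because the list is finite
```

Here the sweep stops after a handful of intervals because ##\delta_c \ge 1/5## everywhere on ##[0, 2]##, and the printed ##K## (about ##4.4##) is a valid global bound, slightly above the true supremum ##4##.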

I feel quite confident in most of the proof, but at the beginning I was a little iffy on exactly why we need to reduce the cover of ##[a, b]## to a finite one. I immediately sensed that that's what they were after with the hint to use the Heine-Borel theorem, but the actual necessity of the reduction itself is what irked me. I was wondering why we couldn't simply let ##K = \text{max}\{K_c\}_{c \in [a, b]}##. Since then, I began to question that reasoning with the following response, on which I would greatly appreciate feedback: the reason we cannot simply take the bound from an infinite collection of bounds is that such a collection need not have a maximum. Sure, the function is bounded on some small neighborhood of each point, but there are uncountably many of these neighborhoods, and there is nothing obviously stopping the bounds ##K_c## from increasing without limit as ##c## increases in ##[a, b]## (in which case even their supremum would be infinite). However, with the finitely many guaranteed by the Heine-Borel theorem, there would have to be a largest.
 
Oats said:
However, with the finitely many guaranteed by the Heine-Borel theorem, there would have to be a largest.
Yes, that's the reason.

If you want to solidify your intuition further, try to prove the theorem for a function ##f:[a,b)\to \mathbb R##, i.e., where the domain is a half-open interval. It can't be done. A counterexample is the function ##f(x)=\frac1{b-x}##. It increases without limit as ##x\to b##, so the ##K_c##'s will have no upper bound. The reason the proof doesn't work in this case is that a half-open interval is not compact, and so not every cover of it has a finite sub-cover. In particular, the cover defined in the proof has no finite sub-cover.
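A quick numerical look at that counterexample (an illustration only, taking ##a = 0## and ##b = 1##): the local bounds coming out of the previous exercise, roughly ##K_c \approx |f(c)| + 1##, blow up as ##c \to b##, so no single ##K## can bound ##f## on all of ##[a, b)##.

```python
# Illustration only: f(x) = 1/(b - x) on [a, b) with a = 0, b = 1.
a, b = 0.0, 1.0

def f(x):
    return 1.0 / (b - x)

for k in range(1, 7):
    c = b - 10.0 ** (-k)        # points approaching b from the left
    K_c = abs(f(c)) + 1.0       # a local bound valid near c
    print(f"c = {c:.6f}   K_c = {K_c:.1f}")
# K_c = 11, 101, 1001, ... -- the family of local bounds has no finite
# supremum, matching the failure of compactness on [a, b).
```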
 
