Minimum value of a continuous function on an open interval

SUMMARY

The discussion centers on proving that a continuous function f defined on the open interval (a,b) attains a minimum value, given that f(x) tends to infinity as x approaches a from the right and as x approaches b from the left. The original poster suggests using sequences to track the behavior of f, while the replies emphasize splitting the interval into regions near the endpoints, where f is known to be large, and a region away from them, where a theorem about continuity is needed to establish the existence of a minimum.

PREREQUISITES
  • Understanding of continuous functions and their properties
  • Familiarity with limits and the concept of infinity in calculus
  • Knowledge of mathematical theorems related to continuity
  • Experience with constructing sequences and their limits
NEXT STEPS
  • Study the Bolzano-Weierstrass theorem and its implications for continuous functions
  • Learn about the Extreme Value Theorem and its application to continuous functions on closed intervals
  • Explore the concept of compactness in relation to continuous functions
  • Investigate examples of continuous functions that approach infinity at the endpoints
USEFUL FOR

Mathematics students, calculus learners, and educators seeking to understand the properties of continuous functions and their behavior in relation to limits and minimum values.

54stickers

Homework Statement



Suppose that ##f## is a continuous function on ##(a,b)## and ##\lim_{x \rightarrow a^{+}} f(x) = \lim_{x \rightarrow b^{-}} f(x) = \infty##. Prove that ##f## has a minimum on all of ##(a,b)##.



The Attempt at a Solution



I have not tried an actual attempt yet. The only thing I can think of doing is making two sequences that approach a common point in the domain of f, one sequence approaching from the a side and the other from the b side, then showing that the corresponding function values are decreasing and tend to the same value. This seems a bit too complicated to me for such a problem.

I am interested in where to start. Logically, it makes sense to me that there should be a minimum. I just don't know how to explain it using math.

Thanks.
 
A good way to think about these problems is that you have a couple different regions: if x is close to a, or close to b, then you know f(x) is really big. If f is not close to either of those, all you really know is that f is continuous. So you need to use something (probably a theorem) involving the continuity of f to make a statement about f in this region that I have vaguely described as "not close to a or b". You of course should make that description mathematically more precise first!
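The hint above can be made precise roughly as follows, a sketch assuming the Extreme Value Theorem (listed under NEXT STEPS) is available; the point ##c## and the numbers ##\delta_1, \delta_2## are choices introduced here for illustration:

```latex
\text{Fix any } c \in (a,b). \text{ Since } \lim_{x \to a^+} f(x) = \infty,
\text{ there is } \delta_1 > 0 \text{ with } f(x) > f(c) \text{ for } a < x < a + \delta_1; \\
\text{similarly there is } \delta_2 > 0 \text{ with } f(x) > f(c) \text{ for } b - \delta_2 < x < b. \\
\text{Shrinking } \delta_1, \delta_2 \text{ if needed, } c \in [a + \delta_1,\, b - \delta_2],
\text{ a closed bounded interval,} \\
\text{so } f \text{ attains a minimum } m \le f(c) \text{ there. Outside that interval } f(x) > f(c) \ge m, \\
\text{so } m \text{ is the minimum of } f \text{ on all of } (a,b).
```

The closed subinterval ##[a + \delta_1, b - \delta_2]## is exactly the "not close to a or b" region described above, made mathematically precise.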
 
If you start at a and travel towards b, you will have that ##f(x) < \lim_{x \rightarrow a^+}f(x) = \infty##. Yet by the time you get to ##b^-## you are back at ##\infty##. How did that happen?

If you are not sure where the continuity fits in, try constructing a function which goes to ## \infty## at a and b, but does not have a minimum on (a,b).
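As an illustration of that last exercise, here is one such function (the particular formula, and the value 5 assigned at the jump, are arbitrary choices made for this sketch): take ##g(x) = 1/(x(1-x))## on ##(0,1)##, which blows up at both endpoints and has infimum 4 at ##x = 1/2##, then redefine it at that one point so the infimum is never attained.

```python
# g blows up at both endpoints of (0, 1) but has no minimum:
# g(x) = 1/(x*(1-x)) for x != 1/2, and g(1/2) = 5.
# The infimum is 4, approached as x -> 1/2, but g(x) > 4 for x != 1/2
# and g(1/2) = 5 > 4, so the infimum is never attained.

def g(x):
    if x == 0.5:
        return 5.0  # deliberate jump: this single point breaks continuity
    return 1.0 / (x * (1.0 - x))

# Values near 1/2 get arbitrarily close to 4 without reaching it...
for x in [0.4, 0.49, 0.499, 0.4999]:
    print(x, g(x))

# ...while the value at 1/2 itself is larger.
print(0.5, g(0.5))
```

This shows why continuity is doing real work in the problem: one discontinuity is enough to destroy the minimum even though both endpoint limits are still infinite.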
 
