Doubt about the theorem that if f is continuous on an interval it has a maximum

Hernaner28
I THINK this theorem, in short, states that if a function is continuous on an interval [a,b] then it has a maximum. But I have a doubt about the proof at the step where it assumes the absurd (for contradiction):

$$\begin{align}
& \text{THEOREM:} \\
& \text{ }f:[a,b]\to \mathbb{R}\text{ continuous} \\
& \text{then }\exists c\in [a,b]:f(c)\ge f(x)\ \forall x\in [a,b] \\
& \text{ } \\
& \text{Proof.} \\
& \text{We know that }\exists \alpha =\sup f([a,b]). \\
& \text{Suppose, for contradiction, that }\forall x\in [a,b]\ \ f(x)<\alpha \Rightarrow \alpha -f(x)>0 \\
& \text{ }...\text{continues}... \\
\end{align}$$

What I don't understand is: if alpha is the sup, then it is the least upper bound, so wouldn't the absurd assumption be that f(x) is GREATER than alpha, i.e. greater than a number which we know is the supremum? Thanks!
 
If you start by assuming that f(x) is larger than alpha, you have assumed something false independently of anything to do with the theorem. The point is to assume something that could be true if it weren't for the theorem you are trying to prove. In this case you are assuming that f never takes on the value alpha, something which can happen in general but cannot happen in this specific situation.
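For anyone following along, here is a sketch (not necessarily the textbook's exact continuation) of how the standard argument turns that assumption into a contradiction, assuming the boundedness theorem for continuous functions on a closed interval is already available:

$$\begin{align}
& \text{Assume }f(x)<\alpha \ \forall x\in [a,b]\text{ and define }g(x)=\frac{1}{\alpha -f(x)}. \\
& g\text{ is continuous on }[a,b]\text{ (the denominator is never }0\text{), hence bounded: }\exists M>0:g(x)\le M\ \forall x. \\
& \text{Then }\alpha -f(x)\ge \frac{1}{M}\Rightarrow f(x)\le \alpha -\frac{1}{M}\ \forall x\in [a,b], \\
& \text{so }\alpha -\tfrac{1}{M}\text{ is an upper bound of }f([a,b])\text{ smaller than }\alpha \text{, contradicting }\alpha =\sup f([a,b]).
\end{align}$$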
 
Ah I get it now! Thank you very much!
 
