Hello everyone!
I'm having some trouble solving the following exercise:
Suppose that $$|f(x) - f(1)| \le (x - 1)^2$$ for every $$x$$.
Show that $$f$$ is continuous at $$1$$.
(Sorry if the text seems a bit weird; I'm still getting used to translating all these math-related terms into English.)
I know that for $$f$$ to be continuous at $$1$$, the following must be true: for every $$\epsilon > 0$$ there has to be a $$\delta > 0$$ such that
$$0 < |x - 1| < \delta \;\Rightarrow\; |f(x) - f(1)| < \epsilon$$
I thought of choosing $$\delta = \epsilon/2(x-1)^2$$, because then I would find that $$|f(x) - f(1)| < \epsilon/2 < \epsilon$$.
But, as far as I know, choosing a $$\delta$$ that depends on $$x$$ is wrong.
I really don't know what to do.
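The only other idea I had was to pick a $$\delta$$ that does not depend on $$x$$, for example $$\delta = \sqrt{\epsilon}$$ (this is just a guess on my part, and I'm not sure it's justified). Then I would get
$$0 < |x - 1| < \sqrt{\epsilon} \;\Rightarrow\; |f(x) - f(1)| \le (x - 1)^2 < \left(\sqrt{\epsilon}\right)^2 = \epsilon$$
Is that kind of choice allowed?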
Thank you,
Bueno.