Assume f(0)=f'(0)=0, prove there exists a positive constant such that f(x)>0

  • Thread starter: NWeid1
  • Tags: Constant, Positive
NWeid1

Homework Statement


Assume that f is a differentiable function such that f(0)=f'(0)=0 and f''(0)>0. Argue that there exists a positive constant a>0 such that f(x)>0 for all x in the interval (0,a). Can anything be concluded about f(x) for negative x's?


Homework Equations





The Attempt at a Solution


I think I should use the MVT so here is what I tried:

f'(c) = \frac{f(a) - f(0)}{a-0}
f(0)=0 is given so:
f'(c) = \frac{f(a)}{a}

Now I am confused on how to relate this to f(x)>0.
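As a quick sanity check of the MVT quotient above, here is a short Python sketch using the sample function f(x) = x^2 (an assumed illustration that happens to satisfy f(0) = f'(0) = 0 and f''(0) = 2 > 0, not the general proof):

```python
# Sample function satisfying the hypotheses: f(0) = f'(0) = 0, f''(0) = 2 > 0.
# This is only an illustration of the MVT quotient, not the general argument.
def f(x):
    return x**2

a = 0.5
# MVT: f'(c) = (f(a) - f(0)) / (a - 0) = f(a)/a for some c in (0, a).
mvt_quotient = f(a) / a          # for f(x) = x**2 this equals a, so it is positive
print(mvt_quotient)

# And f(x) > 0 at every sample point in (0, a):
print(all(f(x) > 0 for x in [0.01 * k for k in range(1, 50)]))
```

For this particular f the quotient is positive, which is the pattern the problem asks you to establish for every f meeting the hypotheses.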
 
Think about what f''(0)>0 means for f'(x) in the immediate vicinity of 0.
 
Does it mean that it is increasing? So f'(x)>0? Which means f(x) is increasing from f(0) which means f(x)>0 for a near 0?
 
... for x in some region just greater than 0, yes.
 
NWeid1 said:
Does it mean that it is increasing? So f'(x)>0? Which means f(x) is increasing from f(0) which means f(x)>0 for a near 0?

Assuming what you mean by "it is increasing" is "f'(x) is increasing near 0", yes. Good intuition, but of course, that is what you are supposed to prove.
 
NWeid1 said:
Does it mean that it is increasing? So f'(x)>0? Which means f(x) is increasing from f(0) which means f(x)>0 for a near 0?
What you say will be clearer if you minimize the number of pronouns such as "it". The question involved f'' and f'. Which one of these do you mean by "it?"
 
Ok, I think I got it.

If a>0, and f''(0)>0 means f'(0) will be increasing so f'(x)>0, which means f(x) is increasing at the origin and therefore f(x)>0. And since a>0 and f(a)>0,

f'(c) = f(a)/a > 0
and therefore f(a) > 0. Is this right?
 
NWeid1 said:
Ok, I think I got it.

If a>0, and f''(0)>0 means f'(0) will be increasing so f'(x)>0
This is not necessarily true. For example, if f(x) = x^2, then f''(x) > 0 for all x, but the graph of y = f(x) is decreasing over half of its domain.
NWeid1 said:
which means f(x) is increasing at the origin and therefore f(x)>0. And since a>0 and f(a)>0,

f'(c) = f(a)/a > 0
and therefore f(a) > 0. Is this right?
 
Mark44 said:
This is not necessarily true. For example, if f(x) = x^2, then f''(x) > 0 for all x, but the graph of y = f(x) is decreasing over half of its domain.

Yeah but since we're only looking at x>0, wouldn't it work? So confused, ugh! lol
 
  • #10
That was just an example. My point is that f''(x) being positive doesn't necessarily mean that f is increasing.

If you want an example where x > 0, consider f(x) = (x - 10)^2. f''(x) = 2 > 0, but there is an interval, namely [0, 10], on which the graph is decreasing.
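Mark44's counterexample can be checked numerically. The sketch below (sample points chosen arbitrarily) confirms that f(x) = (x - 10)^2 has a constant positive second derivative yet decreases across [0, 10]:

```python
# Counterexample from the thread: f(x) = (x - 10)**2 has f''(x) = 2 > 0
# everywhere, yet f is strictly decreasing on [0, 10].
def f(x):
    return (x - 10)**2

xs = [0.5 * k for k in range(0, 21)]   # sample points 0, 0.5, ..., 10
values = [f(x) for x in xs]

# Each value is strictly smaller than the previous one on [0, 10]:
print(all(b < a for a, b in zip(values, values[1:])))
```

So a positive second derivative alone does not make f increasing; the hypotheses f(0) = f'(0) = 0 are what pin the behavior down near the origin.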
 
  • #11
Ok. I got it now. So now I'm confused. I see now that I used f(a)>0 and a>0 to prove that f(a)>0 lol... But, since f''(0)>0, f'(x) will be increasing at 0, right? And if f'(0)=0 and it is increasing, when x>0 for x near 0, f'(x)>0. Is this right at all? lol
 
  • #12
NWeid1 said:
Ok. I got it now. So now I'm confused. I see now that I used f(a)>0 and a>0 to prove that f(a)>0 lol... But, since f''(0)>0, f'(x) will be increasing at 0, right?
f'(x) will be increasing in some interval around 0. Increasing applies to an interval, not just a single point.
NWeid1 said:
And if f'(0)=0 and it is increasing, when x>0 for x near 0, f'(x)>0. Is this right at all? lol
Who is "it"?
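For reference, the argument the hints are steering toward can be sketched as follows (an editorial sketch filling in the limit step, not a post from the thread):

```latex
Since $f'(0) = 0$,
\[
  f''(0) = \lim_{x \to 0} \frac{f'(x) - f'(0)}{x - 0}
         = \lim_{x \to 0} \frac{f'(x)}{x} > 0,
\]
so there is an $a > 0$ with $f'(x)/x > 0$ whenever $0 < |x| < a$.
For $0 < x < a$ this gives $f'(x) > 0$, and the MVT supplies a
$c \in (0, x)$ with
\[
  f(x) = f(x) - f(0) = f'(c)\,(x - 0) = x\,f'(c) > 0.
\]
For $-a < x < 0$ the same limit forces $f'(x) < 0$, so $f$ is
decreasing on $(-a, 0)$ and again $f(x) > f(0) = 0$ there.
```

So the answer to the follow-up question is yes: the same reasoning shows f(x) > 0 for negative x near 0 as well.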
 