# Homework Help: Prove that this equation has at least one real root

1. Jun 11, 2013

### utkarshakash

1. The problem statement, all variables and given/known data
Let f:R→R be a continuous and differentiable function. Prove that the equation f'(x)+λf(x)=0 has at least one real root between any pair of roots of f(x)=0, where λ is a real number.

2. Relevant equations

3. The attempt at a solution
All I know from Rolle's theorem is that between a pair of roots of f(x) there must be at least one root of f'(x). But I can't figure out how to deal with the extra term λf(x).

2. Jun 11, 2013

### dirk_mec1

Hint: f'(x) = -λf(x).

3. Jun 11, 2013

### shortydeb

Do you mean f is continuously differentiable from R to R? No textbook would say a function is "continuous and differentiable" on a given interval, because on any interval a differentiable function is automatically continuous.

4. Jun 11, 2013

### haruspex

That is not given to us as a differential equation, and the question could be worded better. It is defining a function g(x) = f'(x)+λf(x), and asking you to show that g has a root between each pair of roots of f.

5. Jun 11, 2013

### haruspex

Hint 1: Does the form f'+λf remind you of anything?
Hint 2: If h(x) has no roots then f(x)h(x) has the same roots as f(x).
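Putting the two hints together, one natural reading is the standard integrating-factor trick (a sketch; the specific choice h(x) = e^{λx} is an assumption that fits Hint 2, since it is never zero):

```latex
g(x) = e^{\lambda x} f(x)
\quad\Longrightarrow\quad
g'(x) = e^{\lambda x} f'(x) + \lambda e^{\lambda x} f(x)
      = e^{\lambda x}\bigl(f'(x) + \lambda f(x)\bigr).
```

Since $e^{\lambda x} > 0$ everywhere, $g$ has exactly the same roots as $f$; Rolle's theorem applied to $g$ between two such roots then gives a point $c$ with $g'(c) = 0$, i.e. $f'(c) + \lambda f(c) = 0$.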

6. Jun 11, 2013

### Dick

You might also imagine what the graph of log(|f|) looks like on the interval and think about how that relates to your problem. That might give you some intuition.
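The claim can also be checked numerically. A minimal sketch, where the sample function f(x) = sin x (consecutive roots 0 and π), the value λ = 2, and the bisection helper are all illustrative assumptions, not part of the thread's problem:

```python
import math

# Illustrative example only: f(x) = sin(x) with consecutive roots 0 and pi,
# and an arbitrary choice of lambda. Neither is dictated by the thread.
lam = 2.0
f, fp = math.sin, math.cos          # f and its derivative f'

def g(x):
    """g(x) = f'(x) + lambda * f(x); the claim is that g has a root in (0, pi)."""
    return fp(x) + lam * f(x)

def bisect(fun, a, b, tol=1e-12):
    """Plain bisection; assumes fun(a) and fun(b) have opposite signs."""
    fa = fun(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * fun(m) <= 0:
            b = m
        else:
            a, fa = m, fun(m)
    return 0.5 * (a + b)

# g(0) = 1 > 0 and g(pi) = -1 < 0, so bisection inside the open interval
# locates a root of f' + lambda*f strictly between the two roots of f.
root = bisect(g, 1e-9, math.pi - 1e-9)
print(root)   # should agree with pi - atan(1/2), since g = 0 means tan(x) = -1/2
```

Here the root lands strictly inside (0, π), as the problem statement requires.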

7. Jun 12, 2013

### utkarshakash

Is it related to differential equations?

8. Jun 12, 2013

### haruspex

Yes.

9. Jun 12, 2013

### utkarshakash

OK, then it's time for me to wait a little, because my teacher hasn't started DEs yet. I wonder why he gives questions that involve DEs.

10. Jun 12, 2013

### Dick

You don't have to use differential equations. Draw a sample function and sketch the graph of log(|f|).

11. Jun 12, 2013

### Saitama

Solving the given differential equation gives $f(x)=ce^{-\lambda x}$ (where c is a constant). How can this function have any roots (except at $\infty$)?

12. Jun 12, 2013

### verty

This fooled me too. Only if g(x) = f'(x)+λf(x) were zero for all x would f be that function (be in that family).

13. Jun 12, 2013

### verty

Hint 3: will it suffice to consider only pairs of adjacent roots of f?

14. Jun 12, 2013

### haruspex

As I pointed out to dirk_mec1, we have not been given a differential equation for f. Let me reword the question to make it clearer: define g(x) = f'(x)+λf(x), and show that g has a root between each pair of roots of f.
(It's wrong to talk about an equation having roots. Functions have roots; equations have solutions. A root of f is a solution of f(x)=0.)
That said, you have indeed found a function that has no roots, and it is the function h(x) in my second hint.

15. Jun 12, 2013

### shortydeb

I don't know if the proposition given in the OP is true if f is differentiable but not continuously differentiable, but you can use the Intermediate Value Theorem to prove the proposition if f is continuously differentiable.

Last edited: Jun 12, 2013
16. Jun 13, 2013

### verty

I was just noticing that a function like $f(x) = e^{-\frac{1}{x^2}} \cdot e^{-\frac{1}{(x-1)^2}}$ (for $x \neq 0, 1$), with $f(0) = f(1) = 0$, is a problem for this type of argument.

17. Jun 13, 2013

### CompuChip

Is that differentiable at x = 0, x = 1?

18. Jun 13, 2013

### epenguin

Either I am missing something or (as it seems to me) a big meal is being made of something damned obvious. The question assumes that f has at least two real roots, otherwise 'between' makes no sense.

So just consider what f'(x)+λf(x) is at one root of f and the next root of f.

19. Jun 13, 2013

### shortydeb

Look at any pair of roots. Consider the case where f'(x)+λf(x) is positive at the first root and negative at the second root. What does the Intermediate Value Theorem tell you about f'(x)+λf(x) between the two roots? This is assuming f is continuously differentiable on R.
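As a concrete sanity check of this sign-change scenario (f(x) = sin x and λ = 3 are arbitrary illustrative choices, not part of the problem): at a root of f the term λf vanishes, so g = f' + λf reduces to f' there.

```python
import math

# Hypothetical concrete instance: f(x) = sin(x) with consecutive roots
# a = 0 and b = pi, and an arbitrary lambda. At a root of f the term
# lambda*f vanishes, so g = f' + lambda*f reduces to f' there.
lam = 3.0
f, fp = math.sin, math.cos

def g(x):
    return fp(x) + lam * f(x)

a, b = 0.0, math.pi
# g(a) = cos(0) = 1 > 0 and g(b) ≈ cos(pi) = -1 < 0: a sign change,
# so the IVT guarantees a root of g somewhere in (a, b).
print(g(a), g(b))
```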

20. Jun 13, 2013

### utkarshakash

Since f'(x)+λf(x) changes sign from positive to negative, there must be at least one point where it becomes zero. Is this logic correct?

21. Jun 13, 2013

### epenguin

Exactly; it is almost the same idea as the Rolle's theorem argument in your #1.

22. Jun 13, 2013

### haruspex

epenguin and shortydeb, how are you proposing to show that f' switches sign at consecutive roots of f? It does, of course, but I don't see this as completely trivial.

23. Jun 13, 2013

### epenguin

You are asking for a proof of Rolle's theorem?

It seemed the OP was assuming it as something already known.

(Mathematicians and non-mathematicians probably part company over whether they want one.)

24. Jun 13, 2013

### haruspex

Are we using different statements of Rolle's theorem, or are you using it in some clever way I'm not seeing?
As I understand it, it says that if g is continuous on [α, β], differentiable on (α, β), and g(α) = g(β), then g'(c) = 0 for some c in (α, β).
If α and β are consecutive roots of f, then the function g(x) = f'(x) + λf(x) does not necessarily take the same value at those two roots, and anyway you're not interested in g'. You can apply Rolle's theorem to f, but that only tells you f' is zero somewhere in between; it does not tell you g has a root in between.

25. Jun 13, 2013

### Dick

If a and b are two consecutive roots, look at log(|f|) on the interval (a,b). It approaches -infinity at both a and b, and it takes finite values in between. Sketch a graph. Think about its slope f'/f and use the MVT (which is a form of Rolle's theorem). Mustn't the slope take on all real values? I guess I'm not seeing what all the fuss is about.
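Spelled out, this argument runs roughly as follows (a sketch; it assumes a and b are consecutive roots, so f has no zeros inside (a,b) and log|f| is defined on the open interval):

```latex
h(x) = \log\lvert f(x)\rvert, \qquad
h'(x) = \frac{f'(x)}{f(x)} \quad \text{on } (a,b).
```

Since h(x) → -∞ as x → a⁺ and as x → b⁻ while h is finite inside, difference quotients near a are arbitrarily large and positive and those near b are arbitrarily large and negative; by the MVT, h' takes every real value on (a,b), in particular h'(c) = -λ at some c, which is exactly f'(c) + λf(c) = 0.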

Last edited: Jun 13, 2013