Prove that this equation has at least one real root

  • Thread starter: utkarshakash
  • Tags: Root

Homework Help Overview

The problem involves proving that the equation f'(x) + λf(x) = 0 has at least one real root between any pair of roots of the function f(x) = 0, where f is a continuous and differentiable function and λ is a real number. The discussion centers around the implications of Rolle's Theorem and the properties of continuous functions.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Problem interpretation, Assumption checking

Approaches and Questions Raised

  • Participants discuss the relationship between the roots of f(x) and the behavior of its derivative f'(x) in the context of the additional term λf(x). There are attempts to clarify the definitions and implications of the problem statement, as well as to explore the connection to differential equations.

Discussion Status

Participants are actively engaging with the problem, raising questions about the assumptions made regarding the continuity and differentiability of f. Some hints and suggestions have been offered, including the use of the Intermediate Value Theorem and considerations of the function's behavior at its roots. There is a recognition of the complexity involved in applying Rolle's Theorem directly to the modified function g(x).

Contextual Notes

There is a discussion about the clarity of the problem statement and the implications of differentiability. Some participants express uncertainty regarding the assumptions about the function f and its roots, particularly in relation to the continuity of f' and the nature of the differential equation presented.

utkarshakash
Gold Member

Homework Statement


Let f:R→R be a continuous and differentiable function. Prove that the equation f'(x)+λf(x)=0 has at least one real root between any pair of roots of f(x)=0, λ being a real number.

Homework Equations



The Attempt at a Solution


All that I know from Rolle's Theorem is that between a pair of roots of f(x) there must be at least one root of f'(x). But I can't figure out how to deal with that extra term λf(x).
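As a numerical sanity check (not a proof), one can sample g(x) = f'(x) + λf(x) for a concrete choice of f and watch for the sign change between consecutive roots. A minimal sketch, taking f(x) = sin x (consecutive roots 0 and π) purely as an illustration:

```python
import math

# Sanity check (not a proof): for f(x) = sin(x), whose consecutive
# roots are 0 and pi, sample g(x) = f'(x) + lam*f(x) on (0, pi) and
# confirm it takes both signs, for several values of lam.
def g(x, lam):
    # f = sin, f' = cos, so g = f' + lam*f
    return math.cos(x) + lam * math.sin(x)

for lam in (-3.0, -1.0, 0.0, 1.0, 3.0):
    xs = [i * math.pi / 1000 for i in range(1, 1000)]
    signs = {math.copysign(1.0, g(x, lam)) for x in xs if g(x, lam) != 0.0}
    assert signs == {1.0, -1.0}, f"no sign change for lam={lam}"
    print(f"lam={lam}: g changes sign on (0, pi), so it has a root there")
```

The λ term only moves where the root of g sits inside the interval; the sign change itself shows up for every λ tried.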
 
Hint: f'(x) = -λf(x).
 
Do you mean f is continuously differentiable from R to R? No textbook would say a function is "continuous and differentiable" on a given interval, because on any interval a differentiable function is automatically continuous.
 
dirk_mec1 said:
Hint: f'(x) = -λf(x).
That is not provided as a differential equation. The question could be worded better: it defines a function g(x) = f'(x)+λf(x) and asks you to show that g has a root between each pair of roots of f.
 
Hint 1: Does the form f'+λf remind you of anything?
Hint 2: If h(x) has no roots then f(x)h(x) has the same roots as f(x).
 
You might also imagine what the graph of log(|f|) looks like on the interval and then think about how that might relate to your problem. That might give you some intuition about the problem.
 
haruspex said:
Hint 1: Does the form f'+λf remind you of anything?
Hint 2: If h(x) has no roots then f(x)h(x) has the same roots as f(x).

Is it related to differential equations?
 
utkarshakash said:
Is it related to differential equations?
Yes.
 
haruspex said:
Yes.

OK, then it's time for me to wait a little, because my teacher hasn't started DEs yet. I wonder why he gives questions which involve DEs.
 
  • #10
utkarshakash said:
OK, then it's time for me to wait a little, because my teacher hasn't started DEs yet. I wonder why he gives questions which involve DEs.

You don't have to use differential equations. Draw a sample function and sketch the graph of log(|f|).
 
  • #11
haruspex said:
Yes.

Solving the given differential equation, ##f(x)=ce^{-\lambda x}## (where c is a constant). How will this function have any roots (except ##\infty##)? :confused:
 
  • #12
Pranav-Arora said:
Solving the given differential equation, ##f(x)=ce^{-\lambda x}## (where c is a constant). How will this function have any roots (except ##\infty##)? :confused:

This fooled me too. If g(x) = 0 for all x, then f is that function (more precisely, f is in that family).
 
  • #13
Hint 3: will it suffice to consider only pairs of adjacent roots of f?
 
  • #14
Pranav-Arora said:
Solving the given differential equation, ##f(x)=ce^{-\lambda x}## (where c is a constant). How will this function have any roots (except ##\infty##)? :confused:

As I pointed out to dirk_mec1, they have not provided us with a differential equation for f.
Let me reword the question to make it clearer:
Let f:R→R be a continuous and differentiable function; prove that the function g(x) = f'(x)+λf(x) has at least one real root between any pair of roots of f(x), λ being a real number.
(It's wrong to talk about an equation having roots. Functions have roots, equations have solutions. A root of f(x) is a solution of f(x)=0.)
That said, you have indeed found a function that has no roots, and it is the function h(x) in my second hint.
 
  • #15
I don't know if the proposition given in the OP is true if f is differentiable but not continuously differentiable, but you can use the Intermediate Value Theorem to prove the proposition if f is continuously differentiable.
 
  • #16
I was just noticing that a function like ##f(x) = e^{-\frac{1}{x^2}} \cdot e^{-\frac{1}{(x-1)^2}}##, with ##f(0) = f(1) = 0##, is a problem for this type of argument.
 
  • #17
Is that differentiable at x = 0, x = 1?
 
  • #18
Either I am missing something or (as it seems to me) a big meal is being made of something damned obvious. The question does assume that f has two real roots; otherwise 'between' makes no sense.

So just consider what f'(x)+λf(x) is at one root of f and the next root of f.
 
  • #19
Look at any pair of roots. Consider the case where f'(x)+λf(x) is positive at the first root and negative at the second root. What does the Intermediate Value Theorem tell you about f'(x)+λf(x) between the two roots? This is assuming f is continuously differentiable on R.
 
  • #20
shortydeb said:
Look at any pair of roots. Consider the case where f'(x)+λf(x) is positive at the first root and negative at the second root. What does the Intermediate Value Theorem tell you about f'(x)+λf(x) between the two roots? This is assuming f is continuously differentiable on R.

Since f'(x)+λf(x) has changed sign from positive to negative, there must be at least one point where it is zero. Is this logic correct?
 
  • #21
utkarshakash said:
Since f'(x)+λf(x) has changed sign from positive to negative, there must be at least one point where it is zero. Is this logic correct?

Exactly, it is almost the same thing as your Rolle's theorem of #1 .
 
  • #22
epenguin and shortydeb, how are you proposing to show that f' switches sign at consecutive roots of f? It does, of course, but I don't see this as completely trivial.
 
  • #23
haruspex said:
epenguin and shortydeb, how are you proposing to show that f' switches sign at consecutive roots of f? It does, of course, but I don't see this as completely trivial.

You are asking for a proof of Rolle's theorem?

It seemed the OP was assuming it as something already known.

(Mathematicians and non-mathematicians probably part company about whether they want one. :biggrin:)
 
  • #24
epenguin said:
You are asking for a proof of Rolle's theorem?
Are we using different statements of Rolle's theorem, or are you using it in some clever way I'm not seeing?
As I understand it, it says that
a differentiable function which takes the same value at two distinct points must have a point somewhere between them where the first derivative is zero.
If α and β are consecutive roots of f then the function g(x) = f'(x) + λf(x) does not necessarily take the same value at those two roots, and anyway you're not interested in g'. You can apply Rolle's theorem to f, but that only tells you f' is zero somewhere between; it does not tell you g has a root in between.
 
  • #25
haruspex said:
Are we using different statements of Rolle's theorem, or are you using it in some clever way I'm not seeing?
As I understand it, it says that a differentiable function which takes the same value at two distinct points must have a point somewhere between them where the first derivative is zero. If α and β are consecutive roots of f then the function g(x) = f'(x) + λf(x) does not necessarily take the same value at those two roots, and anyway you're not interested in g'. You can apply Rolle's theorem to f, but that only tells you f' is zero somewhere between; it does not tell you g has a root in between.

If a and b are two consecutive roots then look at log(|f|) on the interval (a,b). It approaches -infinity at a and -infinity at b and it has a finite value in between. Sketch a graph. Think about its slope f'/f and use the MVT (which is a form of Rolle's theorem). Mustn't it take on all real values? I guess I'm not seeing what all the fuss is about.
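To spell out this log-slope argument (a sketch, assuming f is continuously differentiable, so that f'/f is continuous on the open interval):

```latex
Let $a<b$ be consecutive roots of $f$, so $f(x)\neq 0$ on $(a,b)$, and set
$u(x)=\log|f(x)|$, so that $u'(x)=\frac{f'(x)}{f(x)}$. Since $f\to 0$ at both
endpoints, $u(x)\to-\infty$ as $x\to a^{+}$ and as $x\to b^{-}$, while $u$ is
finite on $(a,b)$. By the Mean Value Theorem applied on subintervals near $a$,
the slope $u'$ takes arbitrarily large positive values there, and near $b$ it
takes arbitrarily large negative values. Since $u'=f'/f$ is continuous on
$(a,b)$, the Intermediate Value Theorem gives some $c\in(a,b)$ with
$u'(c)=-\lambda$, i.e. $f'(c)+\lambda f(c)=0$.
```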
 
  • #26
Dick said:
If a and b are two consecutive roots then look at log(|f|) on the interval (a,b). It approaches -infinity at a and -infinity at b and it has a finite value in between. Sketch a graph. Think about its slope f'/f and use the MVT (which is a form of Rolle's theorem). Mustn't it take on all real values? I guess I'm not seeing what all the fuss is about.

Sure, but epenguin and shortydeb seem to be saying it's much easier even than that.
Fwiw, the solution I posted hints for is to consider f(x)eλx. This must have the same set of roots as f, and its derivative must have the same set of roots as f'+λf.
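Written out, that hint becomes a three-line Rolle's theorem argument (a sketch):

```latex
Let $\alpha<\beta$ be roots of $f$ and set $h(x)=f(x)e^{\lambda x}$. Then $h$
is differentiable with $h(\alpha)=h(\beta)=0$, so Rolle's theorem gives some
$c\in(\alpha,\beta)$ with $h'(c)=0$. But
$$h'(x)=\bigl(f'(x)+\lambda f(x)\bigr)e^{\lambda x},$$
and $e^{\lambda c}\neq 0$, so $f'(c)+\lambda f(c)=0$: the function
$g(x)=f'(x)+\lambda f(x)$ has a root strictly between $\alpha$ and $\beta$.
```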
 
  • #27
haruspex said:
Sure, but epenguin and shortydeb seem to be saying it's much easier even than that.
Fwiw, the solution I posted hints for is to consider f(x)eλx. This must have the same set of roots as f, and its derivative must have the same set of roots as f'+λf.

Oh, I see what you are up to. Apply Rolle's to f(x)e^(λx). I kind of like the MVT approach because that doesn't need too much cleverness. But that's still nice.
 
  • #28
haruspex said:
Are we using different statements of Rolle's theorem, or are you using it in some clever way I'm not seeing?
As I understand it, it says that a differentiable function which takes the same value at two distinct points must have a point somewhere between them where the first derivative is zero. If α and β are consecutive roots of f then the function g(x) = f'(x) + λf(x) does not necessarily take the same value at those two roots, and anyway you're not interested in g'. You can apply Rolle's theorem to f, but that only tells you f' is zero somewhere between; it does not tell you g has a root in between.

Between successive roots of f(x) (which, as pointed out by vertex and others, are the pairs we should consider), f'(x) changes sign*. At any root of f(x), g(x) = f'(x). So between successive roots of f, g changes sign, and if g changes sign in an interval, it has a root in that interval. QED.

*(There may be different versions of Rolle's theorem. I often use it but never name it - I often have to look it up when anyone mentions it. The reason is I can never quite believe that such an obvious statement got honoured with the name 'theorem' so I need to check it is what I think it is.)
 
  • #29
epenguin said:
Between successive roots (which we should consider, as pointed out by vertex and others) of f(x), f'(x) changes sign*.
Sorry, but I can't find that. Would you mind providing the number of the post? I did find this from utkarshakash
Consider the case where f'(x)+λf(x) is positive at the first root and negative at the second root.
but that is assuming a change of sign, not proving one. Maybe you're thinking of another theorem.
 
  • #30
haruspex said:
Sorry, but I can't find that. Would you mind providing the number of the post? I did find this from utkarshakash

Oops, verty #13.

Anyway, it looks like the OP has got it in #20, though not quite explicitly.
 
