
Prove that this equation has at least one real root

  1. Jun 11, 2013 #1

    utkarshakash

    Gold Member

    1. The problem statement, all variables and given/known data
    Let f:R→R be a continuous and differentiable function. Prove that the equation f'(x)+λf(x)=0 has at least one real root between any pair of roots of f(x)=0, λ being a real number.

    2. Relevant equations

    3. The attempt at a solution
    All I know from Rolle's Theorem is that between a pair of roots of f(x) there must be at least one root of f'(x), but I can't figure out how to deal with the extra term λf(x).
     
  2. Jun 11, 2013 #2

    dirk_mec1

    Hint: f'(x) = -λf(x).
     
  3. Jun 11, 2013 #3
    Do you mean f is continuously differentiable from R to R? No textbook would say a function is "continuous and differentiable" on a given interval, since a function that is differentiable on an interval is automatically continuous there.
     
  4. Jun 11, 2013 #4

    haruspex

    Science Advisor
    Homework Helper
    Gold Member
    2016 Award

    That is not provided as a differential equation. The question could be worded better: it is defining a function g(x) = f'(x)+λf(x) and asking you to show that g has a root between each pair of roots of f.
     
  5. Jun 11, 2013 #5

    haruspex

    Science Advisor
    Homework Helper
    Gold Member
    2016 Award

    Hint 1: Does the form f'+λf remind you of anything?
    Hint 2: If h(x) has no roots then f(x)h(x) has the same roots as f(x).
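
    To sketch where these hints can lead (one possible route; the specific choice ##h(x) = e^{\lambda x}## below is only gestured at by the hints):

    $$\frac{d}{dx}\Big[e^{\lambda x} f(x)\Big] = e^{\lambda x}\big(f'(x) + \lambda f(x)\big)$$

    Since ##e^{\lambda x}## has no real roots, ##e^{\lambda x}f(x)## has exactly the same roots as ##f(x)##, and Rolle's theorem applied to it between two such roots yields a point where ##f'(x) + \lambda f(x) = 0##.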
     
  6. Jun 11, 2013 #6

    Dick

    Science Advisor
    Homework Helper

    You might also imagine what the graph of log(|f|) looks like on the interval between the roots, and think about how that relates to your problem. It might give you some intuition.
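
    To make that concrete (a sketch of the intuition, on a stretch between adjacent roots where ##f## is nonzero so that ##\log|f|## is defined):

    $$\frac{d}{dx}\log|f(x)| = \frac{f'(x)}{f(x)}, \qquad f'(x)+\lambda f(x)=0 \iff \frac{f'(x)}{f(x)} = -\lambda$$

    Since ##\log|f(x)| \to -\infty## as ##x## approaches either root, its slope is arbitrarily large and positive near the left root and arbitrarily large and negative near the right one, so somewhere in between it must pass through the value ##-\lambda##.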
     
  7. Jun 12, 2013 #7

    utkarshakash

    Gold Member

    Is it related to differential equations?
     
  8. Jun 12, 2013 #8

    haruspex

    Science Advisor
    Homework Helper
    Gold Member
    2016 Award

    Yes.
     
  9. Jun 12, 2013 #9

    utkarshakash

    Gold Member

    OK, then it's time for me to wait a little, because my teacher hasn't started DEs yet. I wonder why he gives questions which involve DEs.
     
  10. Jun 12, 2013 #10

    Dick

    Science Advisor
    Homework Helper

    You don't have to use differential equations. Draw a sample function and sketch the graph of log(|f|).
     
  11. Jun 12, 2013 #11
    Solving the given differential equation gives ##f(x)=ce^{-\lambda x}## (where c is a constant). How can this function have any roots (except in the limit ##x\to\infty##)? :confused:
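
    (Checking by substitution: ##f(x) = ce^{-\lambda x}## gives

    $$f'(x) + \lambda f(x) = -\lambda c e^{-\lambda x} + \lambda c e^{-\lambda x} = 0$$

    and for ##c \neq 0## this function is never zero for real ##x##.)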
     
  12. Jun 12, 2013 #12

    verty

    Homework Helper

    This fooled me too. If g(x) = 0 for all x, then f is that function (is in that family). But the problem only requires g to vanish at a point, not identically.
     
  13. Jun 12, 2013 #13

    verty

    Homework Helper

    Hint 3: will it suffice to consider only pairs of adjacent roots of f?
     
  14. Jun 12, 2013 #14

    haruspex

    Science Advisor
    Homework Helper
    Gold Member
    2016 Award

    As I pointed out to dirk_mec1, we have not been given a differential equation for f.
    Let me reword the question to make it clearer: define g(x) = f'(x)+λf(x), and show that g has a root between each pair of roots of f.
    (It's wrong to talk about an equation having roots. Functions have roots; equations have solutions. A root of f(x) is a solution of f(x)=0.)
    That said, you have indeed found a function that has no roots, and it is essentially the function h(x) in my second hint, with the sign of the exponent flipped.
     
  15. Jun 12, 2013 #15
    I don't know if the proposition given in the OP is true if f is differentiable but not continuously differentiable, but you can use the Intermediate Value Theorem to prove the proposition if f is continuously differentiable.
     
    Last edited: Jun 12, 2013
  16. Jun 13, 2013 #16

    verty

    Homework Helper

    I was just noticing that a function like ##f(x) = e^{-1/x^2}\, e^{-1/(x-1)^2}## for ##x \neq 0, 1##, with ##f(0) = f(1) = 0##, is a problem for this type of argument.
     
  17. Jun 13, 2013 #17

    CompuChip

    Science Advisor
    Homework Helper

    Is that differentiable at x = 0, x = 1?
     
  18. Jun 13, 2013 #18

    epenguin

    Homework Helper
    Gold Member

    Either I am missing something or (as it seems to me) a big meal is being made of something damned obvious. The question does assume that f has two real roots, otherwise 'between' makes no sense.

    So just consider what f'(x)+λf(x) is at one root of f and at the next root of f.
     
  19. Jun 13, 2013 #19
    Look at any pair of roots. Consider the case where f'(x)+λf(x) is positive at the first root and negative at the second. What does the Intermediate Value Theorem tell you about f'(x)+λf(x) between the two roots? (This assumes f is continuously differentiable on R.)
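
    Spelled out under that assumption (a sketch, writing g(x) = f'(x)+λf(x), which is continuous when f' is): if a < b are roots of f, then g(a) = f'(a) and g(b) = f'(b), because f(a) = f(b) = 0. So if g(a) > 0 > g(b), the Intermediate Value Theorem gives some c in (a,b) with g(c) = 0. The wrinkle is the case f'(a) = 0 or f'(b) = 0, where the sign argument degenerates; that is where the Rolle's-theorem route through ##e^{\lambda x}f(x)## is cleaner.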
     
  20. Jun 13, 2013 #20

    utkarshakash

    Gold Member

    Since f'(x)+λf(x) changes sign from positive to negative, there must be at least one point where it equals zero. Is this logic correct?
     