
Rolle's theorem

  1. Oct 16, 2012 #1
    I'm working on a question and getting stuck; I'm not sure where to start or how to structure the proof. We have not done a question like this in class before.

    1. The problem statement, all variables and given/known data

    Question:
    Consider the continuous functions f(x) = 1 - e^(x)*sin(x) and g(x) = 1 + e^(x)*cos(x). Using Rolle's Theorem, prove that between any two roots of f there exists at least one root of g.

    Hint
    Remember that, a root of f is a point x in the domain of f such that f(x) = 0.


    Where would you start with this proof?
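
    (As a sanity check before attempting the proof, here is a quick numerical experiment, not part of any proof: locate two consecutive roots of f and confirm g vanishes between them. The bracketing intervals below were found by eyeballing sign changes, so they are an assumption of this sketch.)

    ```python
    import math

    def f(x):
        # f(x) = 1 - e^x * sin(x), the function whose roots we bracket
        return 1 - math.exp(x) * math.sin(x)

    def g(x):
        # g(x) = 1 + e^x * cos(x), claimed to vanish between consecutive roots of f
        return 1 + math.exp(x) * math.cos(x)

    def bisect(func, a, b, steps=200):
        """Plain bisection; assumes func(a) and func(b) have opposite signs."""
        fa = func(a)
        for _ in range(steps):
            m = 0.5 * (a + b)
            if fa * func(m) <= 0:
                b = m
            else:
                a, fa = m, func(m)
        return 0.5 * (a + b)

    r1 = bisect(f, 0.5, 1.0)   # first root of f (sign change checked by hand)
    r2 = bisect(f, 3.0, 3.2)   # next root of f
    c = bisect(g, r1, r2)      # g(r1) > 0 and g(r2) < 0, so a root lies between
    print(r1, r2, c)           # r1 < c < r2, with g(c) ≈ 0
    ```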
     
  3. Oct 16, 2012 #2

    Zondrina

    Homework Helper

    Okay, first things first. State the Rolle conditions as they pertain to f and what the Rolle conditions imply.
     
  4. Oct 16, 2012 #3
    f must be continuous on the closed interval [a,b] between the two roots and differentiable on the open interval (a,b). If f(a) = f(b) = 0 (both roots), then there exists a c in (a,b) such that f'(c) = 0.
     
  5. Oct 16, 2012 #4

    Zondrina


    What you said here is incorrect.

    [a,b] is not a root, it's an interval of definition. The rest looks good though. So f is continuous on the closed interval, and differentiable on the open interval. My only question is, are you GIVEN that f(a) = f(b) = 0? You cannot automatically assume this.
     
  6. Oct 16, 2012 #5
    That doesn't necessarily need to be true; f(a) need only equal f(b). But I assumed they both equal zero, since the question is asking about roots.
     
  7. Oct 16, 2012 #6

    Zondrina


    Usually you're given a condition on f(a) or f(b) or both. Sometimes you're even given conditions about the derivatives like f'(a) = 0.

    For your two functions, you can't assume that f(a) = f(b).
     
  8. Oct 16, 2012 #7

    Zondrina


    Here's why I'm saying what I'm saying:

    Consider:

    f(x) = (x - a)²(x - b)

    Then we can clearly see that f(a) = f(b), but with your functions, unless stated otherwise, you cannot assume f(a) = f(b).
     
  9. Oct 16, 2012 #8
    I understand what you are saying. This is all that was given in the question and now I am stumped!
     
  10. Oct 16, 2012 #9

    Zondrina


    Hmm okay then... Given the question, I suppose you could then start by assuming that f has roots within the interval [a,b]. Say r1, r2 ∈ [a,b] so that f(r1) = f(r2) = 0.

    Now using the Rolle conditions, what does this tell you?
     
  11. Oct 16, 2012 #10
    That there exists an x0 in (r1, r2) such that f'(x0) = 0.
     
  12. Oct 16, 2012 #11

    Zondrina


    Okay, good. So now you have an interval a < r1 < x0 < r2 < b.

    Now, you have to do a little bit of work showing that the curve of g(x) crosses the x-axis somewhere in this interval (hint).

    Perhaps taking derivatives and analyzing a few things would be appropriate.
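
    One way to make the hint concrete (a sketch of one possible route, not necessarily the one intended above): since e^x is never zero, f(x) = 0 is equivalent to e^(-x) = sin(x), so the roots of f are exactly the roots of the auxiliary function h(x) = e^(-x) - sin(x), which is continuous and differentiable everywhere. Applying Rolle to h between two of its roots:

    ```latex
    h(x) = e^{-x} - \sin x, \qquad h(r_1) = h(r_2) = 0
    \;\Longrightarrow\; \exists\, c \in (r_1, r_2) : \; h'(c) = -e^{-c} - \cos c = 0.
    ```

    Multiplying h'(c) = 0 through by -e^c gives 1 + e^c cos(c) = 0, i.e. g(c) = 0, so g has a root strictly between the two roots of f.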
     



