# Rolle's theorem

1. Oct 16, 2012

### Tomp

I'm doing a question and I am getting stuck and need help. I am not sure where to start or how to do a proof for this question. We have not done a question like this in class before.

1. The problem statement, all variables and given/known data

Question:
Consider the continuous functions f(x) = 1 - e^(x)*sin(x) and g(x) = 1 + e^(x)*cos(x). Using Rolle's Theorem, prove that between any two roots of f there exists at least one root of g.

Hint
Remember that a root of f is a point x in the domain of f such that f(x) = 0.
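As a quick numerical sanity check of the claim (not a substitute for the Rolle argument the question asks for), the sketch below brackets two consecutive roots of f and finds a root of g strictly between them. The bracketing intervals are assumptions read off from the signs of f:

```python
import math

def f(x):
    return 1 - math.exp(x) * math.sin(x)

def g(x):
    return 1 + math.exp(x) * math.cos(x)

def bisect(h, lo, hi, tol=1e-12):
    """Plain bisection; assumes h(lo) and h(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(lo) * h(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Two consecutive roots of f; the brackets were found by checking signs:
r1 = bisect(f, 0.5, 0.7)   # f(0.5) > 0 > f(0.7)
r2 = bisect(f, 3.0, 3.2)   # f(3.0) < 0 < f(3.2)

# g changes sign on [r1, r2], so it has a root strictly between them:
c = bisect(g, r1, r2)
print(r1, c, r2)           # r1 < c < r2, with g(c) ~ 0
```

This only checks one pair of roots, of course; the proof has to handle any two roots of f.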

2. Oct 16, 2012

### Zondrina

Okay, first things first. State the Rolle conditions as they pertain to f and what the Rolle conditions imply.

3. Oct 16, 2012

### Tomp

f must be continuous between two roots [a,b] and also differentiable on (a,b)... f(a) = f(b) = 0 (roots), so there exists c in (a,b) such that f'(c) = 0...

4. Oct 16, 2012

### Zondrina

What you said here is incorrect.

[a,b] is not a root, it's an interval of definition. The rest looks good though. So f is continuous on the closed interval, and differentiable on the open interval. My only question is, are you GIVEN that f(a) = f(b) = 0? You cannot automatically assume this.

5. Oct 16, 2012

### Tomp

This doesn't necessarily need to be true; f(a) can just equal f(b). But I assumed they equaled zero, as the question is asking about the roots.

6. Oct 16, 2012

### Zondrina

Usually you're given a condition on f(a) or f(b) or both. Sometimes you're even given conditions about the derivatives like f'(a) = 0.

For your two functions, you can't assume that f(a) = f(b).

7. Oct 16, 2012

### Zondrina

Here's why I'm saying what I'm saying:

Consider :

f(x) = (x-a)^2(x-b)

Then we can clearly see f(a) = f(b), but with your functions unless stated otherwise you cannot assume f(a) = f(b).
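The polynomial example above can be made concrete with, say, a = 0 and b = 3 (these particular values are just for illustration):

```python
# Concrete instance of the example f(x) = (x-a)^2 (x-b), with a = 0, b = 3:
def f(x):
    return (x - 0) ** 2 * (x - 3)

def fprime(x):
    return 3 * x ** 2 - 6 * x   # d/dx of x^2 * (x - 3) = x^3 - 3x^2

a, b = 0, 3
assert f(a) == f(b) == 0        # Rolle's hypothesis f(a) = f(b) holds here
c = 2.0                         # f'(c) = 3*4 - 12 = 0, with a < c < b
print(fprime(c))                # -> 0.0
```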

8. Oct 16, 2012

### Tomp

I understand what you are saying. This is all that was given in the question and now I am stumped!

9. Oct 16, 2012

### Zondrina

Hmm okay then... Given the question, I suppose you could then start by assuming that f has roots within the interval [a,b]. Say r1, r2 ∈ [a,b] so that f(r1) = f(r2) = 0.

Now using the Rolle conditions, what does this tell you?

10. Oct 16, 2012

### Tomp

That there exists an x0 in (r1, r2) such that f'(x0) = 0

11. Oct 16, 2012

### Zondrina

Okay, good. So now you have an interval a < r1 < x0 < r2 < b.

Now, you have to do a little bit of work to show that the curve of g(x) crosses the x-axis somewhere in this interval (hint).

Perhaps taking derivatives and analyzing a few things would be appropriate.
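One route the hint may be pointing at (an assumption here, not spelled out in the thread) is to apply Rolle not to f itself but to an auxiliary function with the same roots: since f(x) = 0 exactly when e^x sin x = 1, i.e. sin x = e^(-x), one can take h(x) = e^(-x) - sin x. A quick SymPy check confirms the identity g(x) = -e^x · h'(x); since e^x is never zero, h'(c) = 0 then forces g(c) = 0:

```python
import sympy as sp

x = sp.symbols('x')
g = 1 + sp.exp(x) * sp.cos(x)

# Auxiliary function (an assumption of this sketch): h has the same
# roots as f, because f(x) = 0  <=>  e^x sin x = 1  <=>  sin x = e^(-x).
h = sp.exp(-x) - sp.sin(x)

# Rolle applied to h between two roots r1 < r2 of f gives some c with
# h'(c) = 0. Verify the identity g(x) = -e^x * h'(x), so that h'(c) = 0
# forces g(c) = 0:
print(sp.simplify(g + sp.exp(x) * sp.diff(h, x)))  # -> 0
```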