Using Rolle's Theorem: Proving Existence of Roots Between Two Rooted Functions

  • Thread starter: Tomp
  • Tags: Theorem

Homework Help Overview

The discussion revolves around proving the existence of roots of the function g(x) between the roots of the function f(x) using Rolle's Theorem. The functions in question are f(x) = 1 - e^(x)*sin(x) and g(x) = 1 + e^(x)*cos(x), and the original poster expresses uncertainty about how to approach the proof, particularly in relation to the conditions of Rolle's Theorem.

Discussion Character

  • Exploratory, Assumption checking

Approaches and Questions Raised

  • Participants discuss the conditions required for applying Rolle's Theorem, particularly the continuity and differentiability of f on the interval defined by its roots. There is a focus on whether it can be assumed that f(a) = f(b) = 0, with some participants questioning this assumption.

Discussion Status

There is an ongoing exploration of the assumptions necessary for applying Rolle's Theorem. Some participants have provided guidance on the conditions that must be met, while others are still grappling with the implications of the given problem statement. The discussion reflects a mix of interpretations regarding the roots of f and the application of the theorem.

Contextual Notes

Participants note that the problem does not explicitly state that f(a) and f(b) are equal to zero, leading to uncertainty about the assumptions that can be made. The original poster acknowledges the limitations of the information provided in the question.

Tomp
I'm working on a question, getting stuck, and need help. I'm not sure where to start or how to approach the proof. We have not done a question like this in class before.

Homework Statement



Question:
Consider the continuous functions f(x) = 1 - e^(x)*sin(x) and g(x) = 1 + e^(x)*cos(x). Using Rolle's Theorem, prove that between any two roots of f there exists at least one root of g.

Hint
Remember that a root of f is a point x in the domain of f such that f(x) = 0.
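To make the hint concrete, here is a small numerical sketch (my own illustration, not part of the problem): it locates one root of f by bisection, using a sign change of f on (0.5, 1) that I found by inspection.

```python
import math

def f(x):
    # f(x) = 1 - e^x * sin(x); a root is any x with f(x) = 0
    return 1 - math.exp(x) * math.sin(x)

def bisect(fn, lo, hi, tol=1e-12):
    """Simple bisection; assumes fn(lo) and fn(hi) have opposite signs."""
    assert fn(lo) * fn(hi) < 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if fn(lo) * fn(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# f(0.5) > 0 and f(1) < 0, so a root lies in (0.5, 1)
root = bisect(f, 0.5, 1.0)
print(root)  # ≈ 0.59
```

The bracket (0.5, 1) is just one convenient interval; f has infinitely many roots, since e^x sin x = 1 recurs near every multiple of π for x > 0.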


Where would you start with this proof?
 
Okay, first things first. State the Rolle conditions as they pertain to f and what the Rolle conditions imply.
 
Zondrina said:
Okay, first things first. State the Rolle conditions as they pertain to f and what the Rolle conditions imply.

f must be continuous between two roots [a,b] and also differentiable on (a,b)...f(a)=f(b)=0 (roots) there exists c in (a,b) such that f'(c) = 0 ...
 
Tomp said:
f must be continuous between two roots [a,b] and also differentiable on (a,b)...f(a)=f(b)=0 (roots) there exists c in (a,b) such that f'(c) = 0 ...

What you said here is incorrect.

f must be continuous between two roots [a,b]

[a,b] is not a root, it's an interval of definition. The rest looks good though. So f is continuous on the closed interval, and differentiable on the open interval. My only question is, are you GIVEN that f(a) = f(b) = 0? You cannot automatically assume this.
 
Zondrina said:
What you said here is incorrect.



[a,b] is not a root, it's an interval of definition. The rest looks good though. So f is continuous on the closed interval, and differentiable on the open interval. My only question is, are you GIVEN that f(a) = f(b) = 0? You cannot automatically assume this.

This doesn't necessarily need to be true; f(a) can just equal f(b), but I assumed that they equaled zero since it's asking about the roots.
 
Tomp said:
This doesn't necessarily need to be true; f(a) can just equal f(b), but I assumed that they equaled zero since it's asking about the roots.

Usually you're given a condition on f(a) or f(b) or both. Sometimes you're even given conditions about the derivatives like f'(a) = 0.

For your two functions, you can't assume that f(a) = f(b).
 
Here's why I'm saying what I'm saying:

Consider:

f(x) = (x-a)^2 (x-b)

Then we can clearly see f(a) = f(b), but with your functions, unless stated otherwise, you cannot assume f(a) = f(b).
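A quick numerical check of this contrast (the values a = 1, b = 3 and the test points are my own choices, not from the thread): for the polynomial example both endpoints are roots, so f(a) = f(b) holds automatically, while for the thread's f two arbitrary points generally give different values.

```python
import math

def f_poly(x, a=1.0, b=3.0):
    # Zondrina's example: f(x) = (x - a)^2 * (x - b)
    # Both a and b are roots, so f_poly(a) = f_poly(b) = 0 automatically
    return (x - a) ** 2 * (x - b)

def f(x):
    # The thread's f: equal values at two points only happen at special points
    return 1 - math.exp(x) * math.sin(x)

print(f_poly(1.0) == f_poly(3.0))  # True: both are 0
print(f(0.0) == f(1.0))            # False for these arbitrary endpoints
```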
 
Zondrina said:
Here's why I'm saying what I'm saying:

Consider:

f(x) = (x-a)^2 (x-b)

Then we can clearly see f(a) = f(b), but with your functions, unless stated otherwise, you cannot assume f(a) = f(b).

I understand what you are saying. This is all that was given in the question and now I am stumped!
 
Tomp said:
I understand what you are saying. This is all that was given in the question and now I am stumped!

Hmm okay then... Given the question, I suppose you could then start by assuming that f has roots within the interval [a,b]. Say r1, r2 ∈ [a,b] so that f(r1) = f(r2) = 0.

Now using the Rolle conditions, what does this tell you?
 
Zondrina said:
Hmm okay then... Given the question, I suppose you could then start by assuming that f has roots within the interval [a,b]. Say r1, r2 ∈ [a,b] so that f(r1) = f(r2) = 0.

Now using the Rolle conditions, what does this tell you?

That there exists an x0 in (r1, r2) such that f'(x0) = 0.
 
Okay, good. So now you have an interval a < r1 < x0 < r2 < b.

Now, you have to do a little bit of work showing that the curve of g(x) crosses the x-axis somewhere in this interval (hint).

Perhaps taking derivatives and analyzing a few things would be appropriate.
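As a numerical sanity check on this plan (my own sketch, not the intended proof): differentiating gives f'(x) = -e^x(sin x + cos x), and one can verify that this equals f(x) - g(x). Numerically, g turns out to be positive at the first positive root of f and negative at the next one, so by the Intermediate Value Theorem g really does have a root in between, as the problem claims.

```python
import math

def f(x):
    return 1 - math.exp(x) * math.sin(x)

def g(x):
    return 1 + math.exp(x) * math.cos(x)

def bisect(fn, lo, hi, tol=1e-12):
    # assumes a sign change of fn on [lo, hi]
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if fn(lo) * fn(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Two consecutive roots of f, bracketed by sign changes found by inspection:
r1 = bisect(f, 0.5, 1.0)    # f(0.5) > 0, f(1) < 0
r2 = bisect(f, 3.0, 3.14)   # f(3) < 0, f(3.14) > 0

# g changes sign between r1 and r2, so by the IVT g has a root there
print(g(r1) > 0, g(r2) < 0)  # True True
```

The bracketing intervals (0.5, 1) and (3, 3.14) are my own choices, found by checking the sign of f at those endpoints; any other pair of consecutive roots would work the same way.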
 
