Proving the Relationship Between Roots of Two Equations

  • Thread starter: tysonk
  • Tags: Roots

Homework Help Overview

The discussion revolves around proving a relationship between the roots of two functions, f(x) and g(x), under the condition that their first derivatives are continuous on R and that the expression f(x) g'(x) - g(x) f'(x) is never zero. The goal is to establish that there is exactly one root of g(x) between two consecutive roots of f(x).

Discussion Character

  • Exploratory, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the implications of the expression h(x) = f(x) g'(x) - g(x) f'(x) and its non-zero condition. They explore the behavior of g(x) based on the signs of h(x) at the roots of f(x) and consider the implications of the derivative f' at those points.

Discussion Status

The discussion is ongoing, with participants sharing insights and attempting to clarify the relationship between the roots of f(x) and g(x). Some guidance has been offered regarding the implications of the signs of h(x) and f', but a complete understanding of how to conclude the existence of exactly one root of g(x) remains elusive.

Contextual Notes

Participants are working under the assumption that f(x) has consecutive roots and are examining the behavior of g(x) in the intervals defined by these roots. There is an emphasis on the need to analyze the signs of the functions and their derivatives to draw conclusions about the roots.

tysonk
If someone could guide me as to how I can approach this, that would be appreciated. Suppose f(x) and g(x) have continuous first derivatives on R and that
f(x) g'(x) - g(x) f'(x) \neq 0 for all x. Prove that between two consecutive roots of f(x) there is exactly one root of g(x).
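
A quick sanity check with a concrete pair (my own illustration, not from the thread): take f(x) = \sin x and g(x) = \cos x. Then

h(x) = \sin x \cdot (-\sin x) - \cos x \cdot \cos x = -(\sin^2 x + \cos^2 x) = -1 \neq 0,

and indeed \cos x has exactly one root, \pi/2, between the consecutive roots 0 and \pi of \sin x.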
 
Let

h(x) = f(x) g'(x) - g(x) f'(x) \neq 0

and let x_1,x_2 be consecutive roots of f(x). Consider h(x_1),h(x_2) and draw conclusions about the relative signs of g(x) and f'(x). You'll be able to determine something about the behavior of g(x) on the interval (x_1,x_2).
 
Thanks for the reply.
So I concluded:
h(x_1) = -g(x_1) f'(x_1) \neq 0
h(x_2) = -g(x_2) f'(x_2) \neq 0
This tells us that f' must be nonzero (strictly positive or strictly negative) at each root of f(x). Also, g(x) is nonzero at those two points, so g(x) has no root there. However, I'm still having trouble seeing how to conclude that there is exactly one root of g(x) between the consecutive roots of f(x).
 
I would choose a definite sign for h(x); then you'll have to make similar choices for f' at the roots. This will lead to conclusions about the sign of g at the roots. Having no root of f between x_1 and x_2 puts strong requirements on g and g'. You might also want to note that there will be a point between x_1 and x_2 where f' vanishes (by Rolle's theorem, since f(x_1) = f(x_2) = 0). That will lead to some more information on g.
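
To see how these hints can assemble (a sketch under explicit WLOG sign choices, not a verbatim solution from the thread): since h is continuous and never zero, it has a single sign; say h > 0. Since x_1 and x_2 are consecutive roots, f has a single sign on (x_1, x_2); say f > 0 there. Then f'(x_1) \geq 0 and f'(x_2) \leq 0, and both are nonzero because

h(x_i) = -g(x_i) f'(x_i) \neq 0,

so f'(x_1) > 0 and f'(x_2) < 0. Solving for g gives g(x_1) = -h(x_1)/f'(x_1) < 0 and g(x_2) = -h(x_2)/f'(x_2) > 0, so by the intermediate value theorem g has at least one root in (x_1, x_2). For uniqueness, note that at any root y of g we have h(y) = f(y) g'(y) \neq 0, so f(y) \neq 0 and g'(y) \neq 0; the same argument with f and g interchanged (replacing h by -h) places a root of f strictly between any two consecutive roots of g. Two roots of g inside (x_1, x_2) would therefore force a root of f inside (x_1, x_2), contradicting that x_1 and x_2 are consecutive.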
 
Thanks!
 
