Uniqueness of Zeros in Functions Related by Derivatives

drawar

Homework Statement


Given two functions f(x) and g(x) that are differentiable everywhere on R, with f′(x) = g(x) and g′(x) = −f(x), prove that:
1. between any two consecutive zeros of f(x) = 0 there is exactly one zero of g(x) = 0;
2. between any two consecutive zeros of g(x) = 0 there is exactly one zero of f(x) = 0.


Homework Equations





The Attempt at a Solution


I guess the first question has something to do with Rolle's Theorem, but the theorem only states that there exists a zero of f'(x) = 0 between two zeros of f(x), without saying anything about the uniqueness of that zero. I also have trouble tackling the second question. Any help is appreciated, thanks!
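
For reference, this is the standard statement of Rolle's Theorem I'm relying on (h is just a generic placeholder function):

$$h \in C[a,b],\quad h \text{ differentiable on } (a,b),\quad h(a)=h(b) \;\Longrightarrow\; \exists\, c\in(a,b):\ h'(c)=0.$$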
 
drawar said:

I guess the first question has something to do with Rolle's Theorem, but the theorem only states that there exists a zero of f'(x) = 0 between two zeros of f(x), without saying anything about the uniqueness of that zero. I also have trouble tackling the second question. Any help is appreciated, thanks!

Suppose there were two zeros of f'(x). What does that tell you about g(x)?
 
Dick said:
Suppose there were two zeros of f'(x). What does that tell you about g(x)?

A contradiction! I think I get what you're saying...

Let a and b (a < b) be two consecutive zeros of f(x) = 0, i.e. f(a) = f(b) = 0. By Rolle's Theorem, there exists c ∈ (a, b) such that f'(c) = 0, which means g(c) = 0. So there is at least one zero of g(x) = 0 between a and b.

Now suppose g(x) had two zeros between a and b, say c1 and c2 with a < c1 < c2 < b, so that g(c1) = g(c2) = 0 (equivalently, f'(c1) = f'(c2) = 0). Applying Rolle's Theorem to g on [c1, c2], there exists d ∈ (c1, c2) ⊂ (a, b) such that g'(d) = 0. It follows that −f(d) = 0, and thus f(d) = 0. This is a contradiction, since a < d < b and a, b are consecutive zeros of f(x) = 0.
Therefore there is exactly one zero of g(x) = 0 between any two consecutive zeros of f(x) = 0.
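
I think part 2 should follow by the same argument with the roles of f and g swapped. A rough sketch, with a < b now denoting two consecutive zeros of g:

$$g(a)=g(b)=0 \;\Longrightarrow\; \exists\, c\in(a,b):\ g'(c) = -f(c) = 0,$$

so f has at least one zero in (a, b); and if f had two zeros there, Rolle applied to f between them would give a point where f' = g vanishes strictly between a and b, contradicting that a and b are consecutive zeros of g(x) = 0.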
 
drawar said:

Therefore there is exactly one zero of g(x) = 0 between any two consecutive zeros of f(x) = 0.

Very nice! BTW sin(x) and cos(x) are examples of a pair of functions that have this property.
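
Concretely, taking f(x) = sin x and g(x) = cos x, we have f′ = cos = g and g′ = −sin = −f, and their zero sets interlace exactly as the problem describes:

$$\sin x = 0 \iff x = k\pi, \qquad \cos x = 0 \iff x = \tfrac{\pi}{2} + k\pi, \qquad k\in\mathbb{Z},$$

so each interval between consecutive zeros of sin contains exactly one zero of cos, and vice versa.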
 
Dick said:
Very nice! BTW sin(x) and cos(x) are examples of a pair of functions that have this property.

Yeah thank you so much!
I wonder if there are any non-periodic functions having this property?
 