razmtaz
Homework Statement
Let f(x) = tan(x)/2. Show that the iterates of x under f converge to 0 whenever x is in (-pi/3, pi/3).
(note this is for a course in chaos/fractals)
Homework Equations
A similar problem in the book used the Mean Value Theorem, which says that for a function continuous on [a,b] and differentiable on (a,b), there exists c in (a,b) such that f'(c) = (f(b) - f(a))/(b - a).
The Attempt at a Solution
First of all, tan(-x) = -tan(x), so by symmetry I can restrict the problem to [0, pi/3).
Also, f(0) = tan(0)/2 = 0, so 0 is a fixed point and I restrict the interval further to (0, pi/3).
Now we want to show that the iterates of the function (i.e. the sequence {f(x), f(f(x)), f(f(f(x))), ...}) converge to zero for every x in this interval.
I used the Mean Value Theorem, as was done in the textbook, in the hope of getting tan(x)/2 < x, so that I can say f[n+1](x) = f(f[n](x)) = tan(f[n](x))/2 < f[n](x). Then every iterate is less than the previous one, and since the sequence is bounded below by 0 (on our interval) it converges; its limit L satisfies L = tan(L)/2 by continuity, and 0 is the only such point in [0, pi/3), so the iterates converge to 0.
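As a quick numerical sanity check (not part of a proof), here is a short Python sketch that iterates f(x) = tan(x)/2 from a few starting points in (0, pi/3); the particular starting points and iteration count are arbitrary choices of mine:
Code:
import math

def f(x):
    # one application of the map f(x) = tan(x)/2
    return math.tan(x) / 2

# a few arbitrary starting points inside (0, pi/3)
for x0 in (0.1, 0.5, math.pi / 3 - 0.001):
    x = x0
    always_decreasing = True
    for _ in range(50):
        x_next = f(x)
        always_decreasing = always_decreasing and (x_next < x)
        x = x_next
    print(f"x0 = {x0:.4f}: after 50 iterations x = {x:.3e}, "
          f"decreasing at every step: {always_decreasing}")
The iterates do appear to shrink toward 0 at every step, which is what I'm trying to prove.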
First question: would you fine people be convinced that, since tan(x) is strictly increasing on the given interval and tan(pi/3)/2 < pi/3, it follows that tan(x)/2 < x on the given interval? If so, I think I can proceed as I've outlined above.
If not, here is how I've proceeded with the MVT, applied on the interval [0, x]:
f'(c)(b-a) = f(b)-f(a)
-> ((sec^2(c))/2) * (x-0) = tan(x)/2 - tan(0)/2 = tan(x)/2
then, since c lies in (0, x), which is contained in (0, pi/3), I get:
1/2 < sec^2(c)/2 < 2
-> x/2 < x*sec^2(c)/2 = tan(x)/2 < 2x
And now I am stuck because I can't get tan(x)/2 < x
I can, however, get x/4 < tan(x)/4 < x
which, using the method I wanted to use above, only gives me:
(1/2) f[n+1](x) = (1/2) tan(f[n](x))/2 = tan(f[n](x))/4 < f[n](x)
but I am not sure that tan(f[n](x))/4 < f[n](x) is a strong enough statement to conclude that the iterates of f will always be decreasing.
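For what it's worth, a quick numerical scan (again only a sanity check with an arbitrary grid size, not a proof) suggests that tan(x)/2 < x does hold everywhere on (0, pi/3); I just can't seem to extract it from the MVT bound above:
Code:
import math

# sample (0, pi/3) on a fine grid and test tan(x)/2 < x at every point
a = math.pi / 3
n = 100000  # arbitrary grid size
holds = all(math.tan(k * a / n) / 2 < k * a / n for k in range(1, n))
print("tan(x)/2 < x at every sample point:", holds)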
Any hints, help, insight, comments, corrections, etc. are welcome.
Thanks for your help.
EDIT: Is this (showing f(x) < x) a good method to use when attempting to show that the iterates of an arbitrary function converge to a specific value in a specific interval? Is there a standard way to attempt such a problem?