# A Simple Convergence Question

1. Nov 27, 2013

### DespotDespond

Hi,

I have a basic question about convergence.

I have two sequences, x1, x2, ... and y1, y2, ..., where yn = f(xn) for some function f : ℝᴺ → ℝ.

I have shown that the sequence y1, y2, ... converges. What conditions do I need on the function f to ensure that the sequence x1, x2, ... also converges?

2. Nov 27, 2013

### lurflurf

You need the function f to be continuous.

3. Nov 27, 2013

### Office_Shredder

Staff Emeritus
In fact this is an equivalent definition of being continuous: f is continuous if and only if whenever $x_n \to x$, $f(x_n) \to f(x)$.

4. Nov 27, 2013

### R136a1

But the poster is asking the converse. He's asking whether $f(x_n)\rightarrow f(x)$ implies $x_n\rightarrow x$. So continuity of $f$ is not the relevant condition here.

5. Nov 28, 2013

### DespotDespond

Indeed, R136a1 is correct. I want the convergence of the xn → x given the convergence of f(xn) → f(x).

In this case, continuity of f is not enough. For example, f(x) = x² is continuous, but if we define the two sequences as follows

x1 = √2, x2 = -√2, x3 = √2, x4 = -√2, ...

and

y1 = 2, y2 = 2, y3 = 2, y4 = 2, ...

then yn is convergent and yn = f(xn), but xn is not convergent.
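A quick numerical sketch of this counterexample (plain Python; the variable names are mine):

```python
import math

# Counterexample sketch: f(x) = x^2 is continuous, yet convergence of
# f(x_n) does not force convergence of x_n.
def f(x):
    return x * x

# x_n alternates between +sqrt(2) and -sqrt(2) and never settles down...
x = [math.sqrt(2) * (-1) ** n for n in range(10)]
y = [f(xn) for xn in x]

# ...but y_n = f(x_n) is the constant sequence 2, 2, 2, ...
print(max(y) - min(y) == 0.0)   # True: y_n is constant
print(max(x) - min(x))          # ≈ 2*sqrt(2): x_n keeps oscillating
```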

If the function f were invertible and the inverse were continuous, then that would be enough. Right?

The problem is that in my case I don't think that f is invertible.

I don't necessarily need the result to hold globally, i.e. for all x. A local result might suffice.

Any suggestions would be appreciated.

6. Nov 28, 2013

### R136a1

You don't even need fully invertible. Just a continuous left-inverse suffices.
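A sketch of why a continuous left-inverse is enough (the symbols $g$ and $y$ are mine, not from the thread):

```latex
Suppose $g$ is continuous and $g(f(x)) = x$ for all $x$ (a left-inverse).
If $f(x_n) \to y$, then by continuity of $g$,
\[
  x_n = g\bigl(f(x_n)\bigr) \longrightarrow g(y),
\]
so $(x_n)$ converges, with limit $g(y)$.
```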

7. Nov 28, 2013

### DespotDespond

Hi,

Thanks for the response.

The problem is related to an algorithm that I've constructed for optimising Markov decision processes. I'm trying to prove the global convergence of the algorithm. I don't want to go into any unnecessary detail, so I will try to explain as briefly as I can.

I have an objective function, f, of which the x's are the parameters I wish to optimise.

My algorithm generates a sequence of parameter vectors, x1, x2, ...

I've managed to show that the objective is strictly monotonically increasing along this sequence, i.e. that

f(x1) < f(x2) < ...

I know that f is bounded from above, so I used the monotone convergence theorem to conclude that the sequence

f(x1), f(x2), ...

converges.
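To illustrate the worry numerically: here is a toy objective (entirely hypothetical, not the poster's MDP objective) where f(x_n) strictly increases to its supremum while the iterates x_n keep hopping between two maximisers:

```python
# Toy sketch (hypothetical iterates, not the actual algorithm):
# f is bounded above by 0, with two maximisers at x = +1 and x = -1.
def f(x):
    return -(abs(x) - 1.0) ** 2

# Iterates alternate sign while creeping toward the maximisers,
# so f(x_n) = -(1/n)^2 is strictly increasing and bounded above...
x = [(-1) ** n * (1.0 - 1.0 / n) for n in range(1, 40)]
values = [f(xn) for xn in x]

print(all(a < b for a, b in zip(values, values[1:])))  # True
# ...while x_n itself oscillates between neighbourhoods of +1 and -1:
print(x[-2], x[-1])
```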

Generally, f is not injective and so doesn't have an inverse.

8. Nov 28, 2013

### R136a1

Thanks for your response. But what I really wanted to know is whether $f$ has some special properties that we could use. We know it's not injective, but is there perhaps something else? Obviously, if $f$ can be any continuous function, then what you're trying to do is false. So there must be something special going on.