What Conditions Ensure Convergence of Sequences?

DespotDespond
Hi,

I have a basic question about convergence.

I have two sequences, x1, x2, ... and y1, y2, ..., where yn = f(xn) for some function f : ℝ^N → ℝ.

I have shown that the sequence y1, y2, ... converges. What conditions do I need on the function f to ensure that the sequence x1, x2, ... also converges?

Thanks in advance
 
You need the function f to be continuous.
 
In fact this is an equivalent definition of being continuous: ##f## is continuous if and only if whenever ##x_n \to x##, ##f(x_n) \to f(x)##.
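
To state that precisely for ##f : \mathbb{R}^N \to \mathbb{R}## (a sketch of the standard fact, not a full proof):

$$f \text{ is continuous at } x \iff \big(\, x_n \to x \implies f(x_n) \to f(x) \text{ for every sequence } (x_n) \,\big).$$

The forward direction is immediate from the ##\varepsilon##-##\delta## definition: given ##\varepsilon > 0##, choose ##\delta > 0## so that ##\|z - x\| < \delta## implies ##|f(z) - f(x)| < \varepsilon##, and note that ##\|x_n - x\| < \delta## for all large ##n##.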
 
But the poster is asking about the converse. He's asking whether the convergence of ##f(x_n)## implies the convergence of ##x_n##. So continuity is irrelevant here.
 
Indeed, R136a1 is correct. I want to conclude that the sequence xn converges given that f(xn) converges.

In this case, continuity of f is not enough. For example, f(x) = x² is continuous, but if we define the two sequences as follows

x1 = √2, x2 = -√2, x3 = √2, x4 = -√2, ...

and

y1 = 2, y2 = 2, y3 = 2, y4 = 2, ...

then yn is convergent and yn = f(xn), but xn is not convergent.

If the function f were invertible and the inverse were continuous, then that would be enough. Right?

The problem is that in my case I don't think that f is invertible.

I don't necessarily need the result to hold globally, i.e. for all x. A local result might suffice.

Any suggestions would be appreciated.
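
To spell out the reasoning I have in mind (a sketch, assuming ##f^{-1}## exists and is continuous at the limit ##y## of the ##y_n##, and in particular that ##y## lies in its domain):

$$x_n = f^{-1}(y_n) \;\longrightarrow\; f^{-1}(y) \quad \text{as } n \to \infty,$$

so the ##x_n## would converge, with limit ##f^{-1}(y)##.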
 
DespotDespond said:
If the function f were invertible and the inverse were continuous, then that would be enough. Right?

The problem is that in my case I don't think that f is invertible.

You don't even need f to be fully invertible; a continuous left inverse suffices.
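
In more detail (a sketch, writing ##g## for a left inverse, i.e. ##g(f(x)) = x## for all ##x##, and assuming ##g## is continuous at the limit ##y## of ##y_n = f(x_n)##):

$$x_n = g(f(x_n)) = g(y_n) \;\longrightarrow\; g(y),$$

so ##x_n## converges. Of course, a left inverse can only exist when ##f## is injective, so this is still a genuine restriction on ##f##.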

I kind of doubt there is a simple condition. Could you perhaps give some more information about your specific situation?
 
Hi,

Thanks for the response.

The problem is related to an algorithm that I've constructed for optimising Markov decision processes. I'm trying to prove the global convergence of the algorithm. I don't want to go into any unnecessary detail, so I will try to explain as briefly as I can.

I have an objective function f whose arguments, the x's, are the parameters I wish to optimise.

My algorithm generates a sequence of parameter vectors, x1, x2, ...

I've managed to show that the objective is strictly monotonically increasing along this sequence, i.e. that

f(x1) < f(x2) < ...

I know that f is bounded from above, so I used the monotone convergence theorem to conclude that the sequence

f(x1), f(x2), ...

converges.
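
For completeness, the step I'm relying on here is just the standard monotone convergence argument (with ##M## denoting any upper bound for ##f##):

$$f(x_1) < f(x_2) < \cdots \le M \quad\Longrightarrow\quad f(x_n) \;\nearrow\; \sup_n f(x_n) \le M,$$

so the sequence f(x1), f(x2), ... converges to its supremum.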

Generally, f is not injective and so doesn't have an inverse.
 
Thanks for your response. But what I really wanted to know is whether ##f## has some special properties that we could use. We know it's not injective, but is there perhaps something else? Obviously, if ##f## can be any continuous function, then what you're trying to do is false. So there must be something special going on.
 