Aerostd

## Homework Statement

Hello. I am trying to prove a result that I have been making use of, but never really proved. Consider the recurrence equation

[itex]x(k+1) = 0.5 x(k) + u(k)[/itex],

where u(k) is a bounded sequence. For this problem, assume that u(k) goes to zero. I want to prove that x(k) goes to zero.

## Homework Equations

If I use recursive substitution, I get, for any k ≥ 1,

[itex]x(k) = 0.5^{k}x(0) + \sum_{i=0}^{k-1}0.5^{i}u(k-1-i)[/itex]
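
As a quick sanity check on this closed form (a numerical sketch only, not part of any proof; the initial condition and the input u(k) = 1/(k+1) here are arbitrary illustrative choices), direct iteration and the substituted formula should agree up to floating-point roundoff:

```python
# Compare direct iteration of x(k+1) = 0.5*x(k) + u(k)
# against the closed form obtained by recursive substitution,
# where the i-th term of the sum weights the input u(k-1-i).

def u(k):
    return 1.0 / (k + 1)  # arbitrary bounded input that goes to zero

x0 = 3.0  # arbitrary initial condition
K = 20

# Direct iteration up to step K
x = x0
for k in range(K):
    x = 0.5 * x + u(k)

# Closed form evaluated at k = K
closed = 0.5**K * x0 + sum(0.5**i * u(K - 1 - i) for i in range(K))

print(abs(x - closed))  # at floating-point roundoff level
```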

## The Attempt at a Solution

The first thing that came to my mind was to sandwich x(k) between two sequences that both go to zero, but I'm having trouble bounding this relation. To get an idea of how to proceed, I first considered the simple input u(k) = 1/(k+1). In this case, the closed form becomes

[itex]x(k) = 0.5^{k}x(0) + \sum_{i=0}^{k-1}\frac{0.5^{i}}{k-i}[/itex]

The first term is easy to bound, but what can I do with the second term? Can I make use of u_min = min(u(k)) and u_max = max(u(k)) in some special way? Can someone suggest techniques or theorems I can use to prove that x(k) goes to zero?
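
Not a proof, but a quick numerical experiment (again a sketch; u(k) = 1/(k+1) and the initial condition are just illustrative choices) suggests that x(k) does shrink: the geometric term dies off quickly, and the sum ends up dominated by the most recent, small values of u.

```python
# Iterate x(k+1) = 0.5*x(k) + u(k) with u(k) = 1/(k+1)
# and observe that x(k) becomes small as k grows.

x = 5.0  # arbitrary initial condition
for k in range(2000):
    x = 0.5 * x + 1.0 / (k + 1)

print(x)  # small: roughly the size of the recent inputs (order 1e-3 here)
```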