# Convergence of a recurrence equation: x(k+1) = 0.5x(k) + u(k)

1. Mar 25, 2012

### Aerostd

1. The problem statement, all variables and given/known data

Hello. I am trying to prove a result that I have been making use of, but never really proved. Consider the recurrence equation

$x(k+1) = 0.5 x(k) + u(k)$,

where u(k) is a bounded sequence. For this problem, assume that u(k) goes to zero. I want to prove that x(k) goes to zero.

2. Relevant equations

If I use recursive substitution, I get for any k

$x(k) = 0.5^{k}x(0) + \sum_{i=0}^{k-1}0.5^{i}u(k-i)$

3. The attempt at a solution

The first thing that came to my mind was to sandwich $x(k)$ between two sequences that both go to zero, but I'm having some trouble bounding this relation. To get an idea of how to proceed, I like to start with a simple input, say $u(k) = 1/k$. In this case, the recursive relation becomes

$x(k) = 0.5^{k}x(0) + \sum_{i=0}^{k-1}\frac{0.5^{i}}{k-i}$

The first term is easy to bound, but what can I do with the second term? Can I make use of $u_{\min} = \min_{k} u(k)$ and $u_{\max} = \max_{k} u(k)$ in some special way?

Can someone suggest some techniques or theorems that I can use to prove that x(k) goes to zero?
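To build intuition, here is a quick numerical check (a sketch, not part of the proof; the function and variable names are my own) that iterates the recurrence with the sample input $u(k) = 1/k$:

```python
# Iterate x(k+1) = 0.5*x(k) + u(k) and watch the trajectory shrink.
# The input is started at k = 1 so that u(k) = 1/k is defined.

def simulate(x0, u, steps):
    """Return [x(0), x(1), ..., x(steps)] for x(k) = 0.5*x(k-1) + u(k)."""
    xs = [x0]
    for k in range(1, steps + 1):
        xs.append(0.5 * xs[-1] + u(k))
    return xs

xs = simulate(x0=10.0, u=lambda k: 1.0 / k, steps=2000)
print(xs[10], xs[100], xs[2000])  # successively smaller, heading toward 0
```

The iterates decay even though the geometric part $0.5^{k}x(0)$ dies off much faster than the driven part, which is the behavior the proof should capture.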

2. Mar 25, 2012

### ╔(σ_σ)╝

Have you tried replacing $u(k-i)$ by $M$, where $M = \max_{k} u(k)$? $M$ exists because, in the problem statement, $u(k)$ is bounded.

Then use the sum of a geometric series formula.

3. Mar 26, 2012

### Aerostd

Hi. Thanks a lot for replying.

If I replace $0.5^{i}u(k-i)$ by $M$, where $M$ is its max value, then I can get an upper bound on $x(k)$. But this upper bound might blow up as $k$ tends to infinity (for the case where $M > 1$), so I won't be able to conclude what happens to $x(k)$ as $k$ tends to infinity.

4. Mar 26, 2012

### ╔(σ_σ)╝

Huh?

Replace $u(k-i)$ with $M$; the terms become $0.5^{i}M$. The sum of the $0.5^{i}$ is known from the geometric series formula.

Anyway, forget that for now.

What you actually need to use is the fact that if $u(k) \to 0$, then there exists a $k_{0}$ such that $|u(k)| < \epsilon$ for all $k > k_{0}$.

It is easy to show that $0.5^{k}$ also tends to zero. So you could probably use the triangle inequality, with an epsilon for each piece, to show that $|x(k)|$ is below some combined $\epsilon$.
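To see why the geometric factor is harmless here, note that the partial sums of $0.5^{i}$ never exceed $1/(1-0.5) = 2$; a one-line check (illustrative only):

```python
# Partial sums of the geometric series with ratio 0.5 stay below the limit 2.
partial = [sum(0.5**i for i in range(k)) for k in (1, 5, 10, 30)]
print(partial)  # increasing toward 2
```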

Last edited: Mar 26, 2012
5. Mar 26, 2012

### Aerostd

Sorry about that. I understand and can solve it now. Thanks.

6. Mar 26, 2012

### ╔(σ_σ)╝

Can you post your solution attempt ?

7. Mar 26, 2012

### Aerostd

-- Show that x(k) goes to zero in the limit.

Let $\epsilon > 0$. Now, we know that $0.5^{k}$ and $u(k)$ both go to zero, therefore, for all $\epsilon_{1} > 0$, there exists $k_{1}>0$, such that if $k > k_{1}$, then $| 0.5^{k} | < \epsilon_{1}$ and $| u(k) | < \epsilon_{1}$. Now

$|x(k)| = | 0.5^{k} x(0) + \sum_{i=0}^{k-1} 0.5^{i} u(k-i) |$

$< | \epsilon_{1} x(0) + \sum_{i=0}^{k-1} 0.5^{i} \epsilon_{1} |$ for all $k > k_{1}$

$= | \epsilon_{1} x(0) + \frac{\epsilon_{1}(1-0.5^k)}{0.5} |$

$= | \frac{0.5\epsilon_{1} x(0) + \epsilon_{1}(1-0.5^k)}{0.5} |$

$= \epsilon_{1}| \frac{0.5 x(0) + (1-0.5^k)}{0.5} |$

Choose $k_{1}$ such that

$\epsilon_{1}| \frac{0.5 x(0) + (1-0.5^k)}{0.5} | < \epsilon$

Now, for all $\epsilon > 0$, there exists $k_{1}$ such that $k > k_{1}$ implies
$|x(k)| < \epsilon$

I hope it's right.
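As a numerical probe of the step where every $u(k-i)$ is replaced by $\epsilon_{1}$ (a sketch with the illustrative input $u(k) = 1/k$; the variable names are mine): for $i$ close to $k$, the index $k-i$ is small, so $|u(k-i)| < \epsilon_{1}$ need not hold.

```python
# For u(k) = 1/k, |u(j)| < eps1 = 1/k1 holds only for j > k1,
# so the late terms of the sum (small k - i) violate the blanket bound.
u = lambda k: 1.0 / k
k, k1 = 50, 20
eps1 = 1.0 / k1

violations = [i for i in range(k) if abs(u(k - i)) >= eps1]
print(violations)  # the indices i where the replacement is not justified
```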

Last edited: Mar 26, 2012
8. Mar 26, 2012

### ╔(σ_σ)╝

Edit:

Actually, you should probably use $k_{2}$ to establish the final epsilon, and then use $k_{0} = \max(k_{1}, k_{2})$.

Last edited: Mar 26, 2012
9. Mar 29, 2012

### Aerostd

Something seemed fishy to me, so I went back to this and found a mistake. I need your help to check whether what I have done this time is correct and valid.

I'll start from the top again.

-- Show that $x(k)$ goes to zero in the limit.

Let $\epsilon > 0$. Now, we know that $0.5^{k}$ and $u(k)$ both go to zero (by assumption), therefore, for all $\epsilon_{1} > 0$, there exists $k_{1} > 0$, such that if $k > k_{1}$, then $|0.5^{k}| < \epsilon_{1}$ and $|u(k)| < \epsilon_{1}$. Now, let $k > 2k_{1} + 1$...

$| x(k) | = | 0.5^{k} x(0) + \sum_{i=0}^{k-1} 0.5^{i} u(k-i)|$
$= | 0.5^{k} x(0) + \sum_{i=0}^{k_{1}} 0.5^{i} u(k-i) + \sum_{i=k_{1}+1}^{k-1} 0.5^{i}u(k-i) |$

The reason I did this is that, in the sums, the index of $u(k-i)$ runs backwards. For $i \le k_{1}$ we have $k-i \ge k-k_{1} > k_{1}$ (using $k > 2k_{1}+1$), so $|u(k-i)| < \epsilon_{1}$ holds there; for larger $i$ this is no longer guaranteed. Therefore

$|x(k)| < | \epsilon_{1} x(0) + \sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1} + \sum_{i=k_{1}+1}^{k-1} 0.5^{i} u(k-i) |$

Now, over the remaining indices $i > k_{1}$, $|u(k-i)|$ has a max value. Let $M$ be that max value (it exists because $u(k)$ is bounded). Then

$|x(k)| < | \epsilon_{1} x(0) + \sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1} + \sum_{i=k_{1}+1}^{k-1} 0.5^{i} M |$

Now, using the geometric series formula, I get

$|x(k)| < | \epsilon_{1} x(0) + \sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1} + \frac{M 0.5^{k_{1}+1}}{0.5} - \frac{M 0.5^{k}}{0.5}|$

Applying the triangle inequality and using $0.5^{k_{1}+1} < \epsilon_{1}$ (which holds since $k_{1}+1 > k_{1}$),

$|x(k)| < | \epsilon_{1} x(0) | + |\sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1}| + |\frac{M \epsilon_{1}}{0.5}| + |\frac{M 0.5^{k}}{0.5}|$

Now, for any $\epsilon > 0$, I need to find a $k_{2}$ such that $k > k_{2}$ gives

$| \epsilon_{1} x(0) | + |\sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1}| + |\frac{M \epsilon_{1}}{0.5}| + |\frac{M 0.5^{k}}{0.5}| < \epsilon$

$0.5^{k} < \frac{0.5}{|M|}\left(\epsilon - | \epsilon_{1} x(0) | - |\sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1}| - |\frac{M \epsilon_{1}}{0.5}|\right)$

$k = k_{2} > \log\left(\frac{0.5}{|M|}\left(\epsilon - | \epsilon_{1} x(0) | - |\sum_{i=0}^{k_{1}} 0.5^{i} \epsilon_{1}| - |\frac{M \epsilon_{1}}{0.5}|\right)\right)/\log(0.5)$ (since $\log(0.5)$ is negative; this also requires $\epsilon_{1}$ small enough that the argument of the log is positive)

Now, for any $\epsilon > 0$, pick $k_{2}$ as above and take any $k > \max(2k_{1}+1, k_{2})$. Then

$|x(k)| < \epsilon$

If someone could check this and give some feedback, I would appreciate it a lot. Thanks!
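A numerical sanity check of the split-sum bound above (illustrative, with the sample input $u(k) = 1/k$ and names of my own choosing):

```python
# Head: indices i <= k1, where k - i is large and |u(k-i)| is below eps1.
# Tail: the remaining indices, bounded via the geometric decay of 0.5**i.
u = lambda k: 1.0 / k
k1, k = 20, 100                      # satisfies k > 2*k1 + 1

eps1 = max(abs(u(j)) for j in range(k - k1, k + 1))  # bound on the head |u|'s
M = max(abs(u(j)) for j in range(1, k + 1))          # global bound on |u|

head = sum(0.5**i * abs(u(k - i)) for i in range(0, k1 + 1))
tail = sum(0.5**i * abs(u(k - i)) for i in range(k1 + 1, k))

head_bound = sum(0.5**i * eps1 for i in range(0, k1 + 1))
tail_bound = M * 0.5**(k1 + 1) / 0.5   # full geometric tail

print(head <= head_bound, tail <= tail_bound)  # both bounds hold
```

Both pieces behave as the proof requires: the head is controlled by $\epsilon_{1}$ and the tail by the geometric factor, which is the heart of the argument.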

Last edited: Mar 29, 2012