# Approximation of Functions using the Sign Function

1. Mar 25, 2012

### ╔(σ_σ)╝

1. The problem statement, all variables and given/known data

Prove that any function $f(x)$ can be approximated to any accuracy by a linear combination of sign functions as:

$f(x) \approx f(x_0) + \sum_i \left[f(x_{i+1}) - f(x_i)\right] \dfrac{1 + \operatorname{sgn}(x - x_i)}{2}$

2. Relevant equations

3. The attempt at a solution

This looks like Taylor's theorem with the derivative replaced by a forward difference. It also seems that only the grid points $x_i$ lying to the left of $x$ contribute to the sum. That's about all I can see. Does anyone have any ideas?
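To get a feel for what the sum does, here is a quick numerical check (a sketch, assuming a uniform grid $x_0 < x_1 < \dots < x_N$; the name `staircase_approx` is mine, not from the problem):

```python
import numpy as np

def staircase_approx(f, grid, x):
    """Evaluate f(x_0) + sum_i [f(x_{i+1}) - f(x_i)] * (1 + sgn(x - x_i)) / 2
    over the partition `grid`. Each term is a unit step at x_i, so the
    result is a staircase function built from values of f on the grid."""
    approx = f(grid[0])
    for xi, xnext in zip(grid[:-1], grid[1:]):
        approx += (f(xnext) - f(xi)) * (1 + np.sign(x - xi)) / 2
    return approx

# The error should shrink roughly like the mesh width as the grid refines.
x = 1.0
for n in (10, 100, 1000):
    grid = np.linspace(0.0, np.pi, n + 1)
    err = abs(staircase_approx(np.sin, grid, x) - np.sin(x))
    print(n, err)
```

Running this, the error decreases as the grid is refined, which is at least consistent with the claim that the sum approximates $f(x)$ to any accuracy for a fine enough partition.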

Last edited: Mar 25, 2012
2. Mar 26, 2012

### sunjin09

It doesn't look like a Taylor series to me; nothing has been said about the spacing between $x_i$ and $x_{i+1}$. All I can see is that if $\{x_i\}$ converges to $x$ from the left, and if $f$ is left-continuous, then the series converges to $f(x)$. I'm not sure whether this is a necessary condition.
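One way to see this (my reading, not stated in the problem): each factor $\frac{1 + \operatorname{sgn}(x - x_i)}{2}$ equals $1$ when $x_i < x$ and $0$ when $x_i > x$, so for $x_j < x < x_{j+1}$ the sum telescopes:

$$f(x_0) + \sum_{i \le j} \left[f(x_{i+1}) - f(x_i)\right] = f(x_{j+1}).$$

The right-hand side is therefore a staircase taking the grid value $f(x_{j+1})$ on each subinterval, and as the partition refines near $x$ this approaches $f(x)$ whenever $f$ is continuous there.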