## Main Question or Discussion Point

I have a time series of data that I want to interpolate using a piecewise linear function. The lengths of the linear pieces should be adaptive, so that the maximum error over the points in each interval does not exceed a predetermined value Max_R.

I guess the specification boils down to: Find a set of intervals such that the time series is approximately linear, within a tolerance Max_R, for each interval.
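
To make "within a tolerance Max_R" concrete, here is one possible error measure (a sketch, not necessarily what I will end up with: it assumes the fitted line is simply the chord through the interval's two endpoints, and measures vertical distance):

```python
def max_residual(t, y, i, j):
    """Maximum vertical distance from the points t[i..j], y[i..j] to the
    straight line through the interval's endpoints (t[i], y[i]) and (t[j], y[j])."""
    slope = (y[j] - y[i]) / (t[j] - t[i])
    return max(abs(y[k] - (y[i] + slope * (t[k] - t[i]))) for k in range(i, j + 1))
```

A least-squares line would be the obvious alternative to the endpoint chord.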

At the same time, I am willing to bastardize my specification enough so that the time consumption is linear, or at least strictly less than quadratic, in the number of points. I also don't want to use third-party software libraries: This should be a standalone program.

This is my first stab at the problem.

My algorithm has the drawback that it does not produce optimal intervals, only intervals whose lengths are powers of two (2^n), up to a maximum interval length of L_0. As for time consumption, I think a worst-case scenario is something like N*L_0*log(L_0), but you are welcome to correct me if I am wrong.

Code:

```
define L_0 = a suitable power of 2
let i = start of array
while i has not reached the end of the array:
    let L = L_0 (clamped to the number of points remaining)
    repeat:
        fit a line to the next L points
        compute error = maximum distance from any of these points to the line
        if error > Max_R and L > 1, let L := L / 2
    until error <= Max_R or L = 1
    the current L points have been fitted; advance i by L points
end while
```
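
A minimal runnable sketch of the loop above, under the same assumptions as before (the fitted line is the chord through the interval's endpoints; L_0 is passed in as l0). Segments are returned as (start, end) index pairs:

```python
def max_residual(t, y, i, j):
    """Maximum vertical distance from points i..j to the straight line
    through the interval's endpoints."""
    slope = (y[j] - y[i]) / (t[j] - t[i])
    return max(abs(y[k] - (y[i] + slope * (t[k] - t[i]))) for k in range(i, j + 1))

def adaptive_segments(t, y, max_r, l0=64):
    """Greedy halving segmentation: start each segment at the current point,
    try a span of l0 index steps, and halve the span until the fit error is
    at most max_r.  Spans are powers of two except possibly the tail segment."""
    n = len(t)
    segments = []
    i = 0
    while i < n - 1:
        span = min(l0, n - 1 - i)   # index distance to the candidate end point
        while span > 1 and max_residual(t, y, i, i + span) > max_r:
            span //= 2              # halve on failure, as in the pseudocode
        segments.append((i, i + span))  # a two-point segment always fits exactly
        i += span
    return segments
```

For example, on a series that rises linearly for five points and then falls linearly, the sketch returns one segment per linear run (provided l0 is large enough to cover each run).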

And you are invited to suggest to me a better algorithm!