I guess the specification boils down to this: partition the time series into a set of intervals such that, within each interval, the series is approximately linear to within a tolerance Max_R.

At the same time, I am willing to bend the specification enough that the running time is linear, or at least strictly subquadratic, in the number of points. I also don't want to use third-party libraries: this should be a standalone program.

This is my first stab at the problem.

Code:

```
define L_0 = a suitable power of 2
while we have not reached the end of the array:
    let L = L_0
    repeat:
        fit a line through the next L points (the linear interpolation function)
        compute error = max distance from any of those points to the line
        if error > Max_R, let L := L/2
    until error <= Max_R or L = 2
    the current L points have been fitted; advance to the start of the next interval
end while
```
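To make the idea concrete, here is a minimal sketch of that halving scheme in pure Python (no third-party libraries). It makes a few assumptions beyond the pseudocode: the "linear interpolation function" is taken to be the chord through the window's endpoints, "distance" is vertical distance, and "move to the next point" is read as starting the next interval at the last fitted point, so adjacent segments share an endpoint. The names `fit_intervals`, `t`, `y`, and `l0` are mine, not from the original.

```python
def fit_intervals(t, y, max_r, l0=64):
    """Greedy piecewise-linear segmentation by window halving.

    Tries a window of l0 points (a power of 2); if the maximum vertical
    deviation from the chord through the window's endpoints exceeds
    max_r, the window is halved and retried. A 2-point window is always
    accepted, since its chord passes through both points exactly.

    Returns a list of (start, end) index pairs, end inclusive, where
    each segment shares its endpoint with the next.
    """
    n = len(t)
    segments = []
    i = 0
    while i < n - 1:
        l = min(l0, n - i)              # window length in points
        while l > 2:
            j = i + l - 1               # index of last point in window
            # chord through the window's endpoints
            slope = (y[j] - y[i]) / (t[j] - t[i])
            err = max(abs(y[k] - (y[i] + slope * (t[k] - t[i])))
                      for k in range(i, j + 1))
            if err <= max_r:
                break                   # window fits within tolerance
            l = max(2, l // 2)          # halve the window and retry
        segments.append((i, i + l - 1))
        i += l - 1                      # next interval starts at this endpoint
    return segments
```

Note this is a sketch, not a full answer to the complexity goal: each candidate window is scanned in full, so a pathological series can still cost more than linear time, though the halving keeps the per-interval work bounded by O(L_0) retries of shrinking windows.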

You are invited to suggest a better algorithm!