Best piecewise linear interpolation

by EmpaDoc
Tags: interpolation, linear, piecewise
arildno
#19
Oct4-13, 10:23 AM
Sci Advisor
HW Helper
PF Gold
P: 12,016
" Using least squares almost inevitably means you are going to have to have more segments than are needed by a minimax approach."
In full agreement with that. And?
If you can come up with a minimax approach that is also of the same order of computations as my approach, I'm sure he'll appreciate it.
I certainly will!
arildno
#20
Oct4-13, 02:23 PM
Sci Advisor
HW Helper
PF Gold
P: 12,016
Empadoc:
DH gives you very good advice if you really want to minimize the number of line segments to use, but are willing to increase computation time somewhat.

Thus, you really need to prioritize whether "shortest possible computation time" should override "fewest line segments to be used".
D H
#21
Oct4-13, 02:31 PM
Mentor
P: 15,170
An alternative to least squares is to use orthogonal polynomials, preferably Chebyshev polynomials. For most well-behaved functions, a Chebyshev polynomial approximation comes very close to that magical function that minimizes the maximum absolute deviation, much closer than does a least squares regression. For this reason, a Chebyshev approximation is typically used as the initial step in a Remez exchange type algorithm. A Chebyshev approximation is computationally of the same order as a least squares regression, so it's not an undue burden.
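As a small illustration of DH's point (my own sketch, not from the thread): fit e^x on [-1, 1] with a degree-5 polynomial two ways, once by least squares on a uniform grid and once by interpolation at the Chebyshev nodes, which is near-minimax for smooth functions, and compare the maximum absolute deviation of each.

```python
import numpy as np

f = np.exp
deg = 5

# (a) least-squares polynomial fit on a uniform grid
x_uni = np.linspace(-1.0, 1.0, 200)
ls_poly = np.polynomial.Polynomial.fit(x_uni, f(x_uni), deg)

# (b) interpolation at the Chebyshev nodes -- near-minimax for smooth f
k = np.arange(deg + 1)
x_cheb = np.cos((2 * k + 1) * np.pi / (2 * (deg + 1)))
cheb_poly = np.polynomial.Chebyshev.fit(x_cheb, f(x_cheb), deg)

# compare the maximum absolute deviation on a fine grid
x_fine = np.linspace(-1.0, 1.0, 10001)
err_ls = np.max(np.abs(ls_poly(x_fine) - f(x_fine)))
err_cheb = np.max(np.abs(cheb_poly(x_fine) - f(x_fine)))
print(err_ls, err_cheb)
```

Both errors are tiny for a function this smooth, but the Chebyshev interpolant spreads its error evenly (near-equioscillation) instead of concentrating it at the interval ends the way a least-squares fit does.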
arildno
#22
Oct5-13, 03:21 AM
Sci Advisor
HW Helper
PF Gold
P: 12,016
Quote Quote by D H View Post
An alternative to least squares is to use orthogonal polynomials, preferably Chebyshev polynomials. For most well-behaved functions, a Chebyshev polynomial approximation comes very close to that magical function that minimizes the maximum absolute deviation, much closer than does a least squares regression. For this reason, a Chebyshev approximation is typically used as the initial step in a Remez exchange type algorithm. A Chebyshev approximation is computationally of the same order as a least squares regression, so it's not an undue burden.
So, if I understand correctly:
First, you make a higher-order polynomial interpolation of the data points with a nice class of polynomials, and then reapproximate that polynomial with a piecewise linear interpolation? Sounds cool!
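A sketch of that second stage (my own illustration, not from the thread): once you have a smooth approximant `f` with a computable second derivative `d2f`, the classical bound h²·max|f''|/8 on linear-interpolation error lets you pick breakpoints so each segment stays within a tolerance.

```python
import numpy as np

def linear_breakpoints(f, d2f, a, b, tol):
    """Choose breakpoints on [a, b] so that linear interpolation of f on
    each segment has error at most about tol, using the classical bound
    h**2 * max|f''| / 8 for a segment of width h."""
    xs = [a]
    x = a
    while x < b:
        m = max(abs(d2f(x)), 1e-12)          # curvature at the left end
        h = np.sqrt(8.0 * tol / m)
        m = max(m, abs(d2f(min(x + h, b))))  # refine with the right end
        h = np.sqrt(8.0 * tol / m)
        x = min(x + h, b)
        xs.append(x)
    return np.array(xs)

# e.g. breakpoints for exp on [0, 1] with tolerance 1e-3
xs = linear_breakpoints(np.exp, np.exp, 0.0, 1.0, 1e-3)
```

Segments come out shorter where the curvature is large and longer where the function is nearly straight, which is exactly the adaptivity a piecewise-linear reapproximation wants.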
EmpaDoc
#23
Oct8-13, 01:15 AM
P: 28
Quote Quote by arildno View Post
You ARE using Lagrange multipliers here on your linear regression technique, aren't you, in order to fuse together your line segments?
Well, now I was actually thinking: "I can get the time consumption down to O(N) if I draw a line that passes close enough to all the points, without optimizing it at all." I don't think that can be done if I make an actual fit...
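That one-pass idea can be sketched as a greedy "corridor" method (my own illustration, assuming strictly increasing x values): keep the interval of slopes for a line through the segment's first point that stays within eps of every point seen so far, and start a new segment when that interval becomes empty. One sweep, O(N):

```python
import numpy as np

def greedy_segments(x, y, eps):
    """One pass over points with strictly increasing x (len >= 2):
    maintain the interval [lo, hi] of slopes for a line through the
    current anchor that stays within eps of every point seen so far;
    when it empties, close the segment at the previous point and
    restart there. Returns (start, end, slope) index triples, O(N).
    Note: adjacent segments share a data point but the lines need not
    join exactly -- "close enough", not optimized."""
    segs = []
    i0 = 0
    lo, hi = -np.inf, np.inf
    prev_lo, prev_hi = lo, hi
    for i in range(1, len(x)):
        dx = x[i] - x[i0]
        lo = max(lo, (y[i] - eps - y[i0]) / dx)
        hi = min(hi, (y[i] + eps - y[i0]) / dx)
        if lo > hi:                       # no feasible slope any more
            segs.append((i0, i - 1, 0.5 * (prev_lo + prev_hi)))
            i0 = i - 1                    # new anchor at the shared point
            dx = x[i] - x[i0]
            lo = (y[i] - eps - y[i0]) / dx
            hi = (y[i] + eps - y[i0]) / dx
        prev_lo, prev_hi = lo, hi
    segs.append((i0, len(x) - 1, 0.5 * (lo + hi)))
    return segs
```

Every point in a segment is then within eps of that segment's line by construction, which matches the "looks good for an animation" requirement without any least-squares solve.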

The cost-benefit analysis here depends, obviously, on my specific needs. In this case I am in fact making an animation; so no stringency is needed if it just looks good...

At the same time I am scientifically minded enough that I will probably try the more stringent fit that you so helpfully put up, arildno!
arildno
#24
Oct8-13, 06:44 AM
Sci Advisor
HW Helper
PF Gold
P: 12,016
Note the comments from DH:
Caring about the least-squares mean error, as I do in my approach, will make for an extremely jagged line, i.e. far more segments than what can be gained through other techniques.
DH also says, at one point, that the computational burden of a Chebyshev approximation need not be much heavier than a least squares approach, so that might be an alternative to look into.

