Can linear least squares be used for inverse function approximation?

xactmetric
Hi,

Forgive me if the subject of this post is not accurate; I'm not quite sure what the correct terminology is for what I'm trying to figure out.

Currently I am using linear least squares via SVD to find the coefficients of a ten-term polynomial, say f. This model allows me to predict some output y given some input x. That's straightforward.

What I would like to be able to do is turn this around and find a function g that takes an input x and outputs some y' such that f( y' ) = x.

Is this familiar to anyone? I'm totally stumped.
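
For concreteness, here is a minimal sketch of the kind of forward fit I mean, assuming NumPy and one-dimensional data; the helper names are just illustrative, not my actual code. np.linalg.lstsq solves the least-squares problem through an SVD of the design matrix.

```python
# Minimal sketch of the forward fit, assuming NumPy and 1-D data.
import numpy as np

def fit_polynomial(x, y, degree=9):
    """Fit a ten-term (degree-9) polynomial y ~ f(x) by linear least squares."""
    # Design matrix with columns x^9, x^8, ..., x, 1 (np.polyval ordering).
    A = np.vander(x, N=degree + 1)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)  # solved via an SVD internally
    return coeffs

def f(x, coeffs):
    """Evaluate the fitted polynomial at x."""
    return np.polyval(coeffs, x)
```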

xactmetric
 
Why can't you do that now? In linear least squares, you are given a list of points, $\{(x_1, y_1), (x_2, y_2), \dots, (x_n, y_n)\}$, and you construct an equation $y = ax + b$ whose graph passes close (in the least squares sense) to each of those points. Given any other x, the "predicted" y value is $ax + b$. But linear equations are easily solved. Given any y, the "predicted" x value is just $(y - b)/a$. All the work has already been done in using linear least squares to find a and b.
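
In code, the linear case might look like the following sketch (assuming NumPy; the data points are made up):

```python
# Sketch of the linear case: fit y = a*x + b, then invert it algebraically.
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([0.1, 1.9, 4.2, 5.8, 8.1])        # made-up data

A = np.column_stack([xs, np.ones_like(xs)])     # columns for a and b
(a, b), *_ = np.linalg.lstsq(A, ys, rcond=None)

y_pred = a * 2.5 + b        # predict y from a given x
x_pred = (7.0 - b) / a      # "inverse" prediction: solve y = a*x + b for x
```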
 
Yes, but my equation is nonlinear. Let me try to explain better.

Originally I start with two lists of points. Let's put them all on the x-axis for simplicity. The first list I will call actual coordinates. The second list I will call deviated coordinates. The deviated coordinates are simply the actual coordinates plus some offset. My prediction function f just predicts the deviation of an inputted actual coordinate: if I input any actual coordinate x into f, f gives me a predicted deviation, so my predicted deviated point is x + f(x).

Now say I know a point x in advance. I need to find a way to determine a point x' such that f( x' ) gives me the deviation I need so that x' + f( x' ) = x.

Does this make sense? Or am I missing something totally obvious?
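
One way I could imagine attacking this, sketched below assuming SciPy is available and the fitted deviation is small: treat x' + f(x') = x as a one-dimensional root-finding problem. Here coeffs stands for the deviation-polynomial coefficients from a fit like the one sketched earlier (highest power first).

```python
# Sketch: recover x' with x' + f(x') = x by 1-D root finding, assuming SciPy.
# brentq needs the bracket [x - d, x + d] to contain the solution, which holds
# when the predicted deviation f is small compared to d.
import numpy as np
from scipy.optimize import brentq

def invert(x_target, coeffs, d=1.0):
    """Find x' such that x' + f(x') = x_target, with f the fitted deviation."""
    residual = lambda xp: xp + np.polyval(coeffs, xp) - x_target
    return brentq(residual, x_target - d, x_target + d)
```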

xactmetric
 
I don't think you are using the right terminology. For inverse least squares, I would think you might look at something like this:

http://signals.auditblogs.com/2007/07/05/multivariate-calibration/

Anyway, basically it sounds like you are fitting your data with a fairly low-order polynomial using least squares. You have two choices: you can either find the roots of that polynomial, or you can do a new least-squares fit for the inverse function.
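
A sketch of the second option, assuming NumPy: refit with the roles of the two coordinate lists swapped, so the new polynomial maps a deviated coordinate straight back toward the actual one. The data and the degree here are made up for illustration.

```python
# Sketch of fitting the inverse relation directly, assuming NumPy.
import numpy as np

actual = np.linspace(0.0, 10.0, 50)
deviated = actual + 0.02 * actual**2              # actual + a made-up smooth offset

# degree 5 chosen only for illustration; the thread's model is ten terms
fwd = np.polyfit(actual, deviated - actual, 5)    # f: deviation vs. actual coordinate
inv = np.polyfit(deviated, actual - deviated, 5)  # g: correction vs. deviated coordinate

x = 7.3                                  # a known deviated point
x_prime = x + np.polyval(inv, x)         # estimate of x' with x' + f(x') ≈ x
```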
 
Thanks, but can you please elaborate a bit? I'm not following you.

xactmetric
 
xactmetric said:
Thanks, but can you please elaborate a bit? I'm not following you.

xactmetric

Can you first try to write down clearly the exact problem you are trying to solve?
 
The problem is exactly as described in my second post. If you can say what's not clear, I'll try and explain a bit more.
 