
Inverse linear least squares?

  1. Feb 9, 2008 #1
    Hi,

    Forgive me if the subject of this post is not accurate, I'm not quite sure what the correct terminology would be for what I'm trying to figure out.

    Currently I am using linear least squares via SVD to find the coefficients of a ten-term polynomial, say f. This model allows me to predict some output y given some input x. That's straightforward.
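    As a concrete reference point, the forward fit described above might look like this (a minimal sketch assuming NumPy; the sample data and the underlying polynomial here are made up for illustration):

```python
import numpy as np

# Hypothetical sample data: y is a low-order polynomial of x plus noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 0.5 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.01, x.size)

degree = 9  # ten coefficients, as in the post
A = np.vander(x, degree + 1, increasing=True)  # design matrix [1, x, x^2, ...]

# Solve the least-squares problem; lstsq uses the SVD internally
coeffs, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)

def f(t):
    """Predict y for input t using the fitted coefficients."""
    return np.polynomial.polynomial.polyval(t, coeffs)
```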

    What I would like to be able to do is turn this around and find a function g where g would take an input x, and output some y' such that f( y' ) = x.

    Is this familiar to anyone? I'm totally stumped.

    xactmetric
     
  3. Feb 9, 2008 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    Why can't you do that now? In linear least squares, you are given a list of points, [itex](x_1, y_1), (x_2, y_2), \cdots, (x_n, y_n)[/itex], and you construct an equation y = ax + b whose graph passes close (in the least squares sense) to each of those points. Given any other x, the "predicted" y value is ax + b. But linear equations are easily solved. Given any y, the "predicted" x value is just (y - b)/a. All the work has already been done in using linear least squares to find a and b.
     
  4. Feb 9, 2008 #3
    Yes, but my equation is nonlinear. Let me try to explain better.

    Originally I start with two lists of points. Let's put them all on the x-axis for simplicity. The first list I will call actual coordinates. The second list I will call deviated coordinates. The deviated coordinates are simply the actual coordinates plus some offset. My prediction function f just predicts the deviation of an input actual coordinate.
    So if I input any actual coordinate x into f, f will give me a predicted deviation. So my predicted deviated point is x + f(x).

    Now say I know in advance a point x. I need to find a way to determine a point x' such that f( x' ) gives me the deviation I need such that x' + f( x' ) = x.
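    One way to state this: x' is a root of h(t) = t + f(t) - x, so it can be found numerically for each given x. A minimal sketch (the deviation function f here is a made-up stand-in for the fitted polynomial, and the bracket [lo, hi] is assumed to contain the root):

```python
import numpy as np

def f(t):
    # Hypothetical deviation model standing in for the fitted polynomial
    return 0.1 * np.sin(t)

def invert(x, lo=-10.0, hi=10.0, tol=1e-10):
    """Find x' with x' + f(x') = x by bisection on h(t) = t + f(t) - x.

    Assumes h is increasing on [lo, hi], which holds whenever |f'(t)| < 1
    there; check this for your own fitted polynomial.
    """
    h = lambda t: t + f(t) - x
    assert h(lo) < 0 < h(hi), "bracket does not contain the root"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if h(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

xp = invert(2.0)  # xp + f(xp) should reproduce x = 2.0
```

    Bisection is used here only because it is simple and robust; Newton's method on h would converge much faster if you can evaluate the polynomial's derivative.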

    Does this make sense? Or am I missing something totally obvious?

    xactmetric
     
  5. Feb 9, 2008 #4
    I don't think you are using the right terminology. I would think for inverse least squares you might look at something like this:

    http://signals.auditblogs.com/2007/07/05/multivariate-calibration/

    Anyway, basically it sounds like you have fitted a lower-order polynomial to your data using least squares. You have two choices. You can either find, for each given x, the root of the polynomial equation x' + f( x' ) = x, or you can do a new least squares fit for the inverse function itself.
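    The second choice amounts to swapping the roles of the two coordinate lists: fit a new polynomial g that maps a deviated coordinate back to its actual coordinate. A minimal sketch (the forward deviation f, the coordinate range, and the inverse degree are all assumptions for illustration; in practice you would pick the degree by validation):

```python
import numpy as np

def f(t):
    # Hypothetical forward deviation model, as in the thread
    return 0.05 * t**2

# Generate (deviated, actual) pairs from the forward model
actual = np.linspace(0.0, 2.0, 100)
deviated = actual + f(actual)

# Least squares fit of the inverse map: actual as a polynomial of deviated
degree = 5  # assumed; choose by cross-validation in practice
A = np.vander(deviated, degree + 1, increasing=True)
g_coeffs, *_ = np.linalg.lstsq(A, actual, rcond=None)

def g(x):
    """Approximate x' such that x' + f(x') = x."""
    return np.polynomial.polynomial.polyval(x, g_coeffs)
```

    The fitted g is only approximate (the true inverse is generally not a polynomial), but it gives x' in one evaluation instead of a root-finding loop per point.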
     
  6. Feb 9, 2008 #5
    Thanks, but can you please elaborate a bit. I'm not following you.

    xactmetric
     
  7. Feb 9, 2008 #6
    Can you first try to write down, clearly, the exact problem you are trying to solve?
     
  8. Feb 9, 2008 #7
    The problem is exactly as described in my second post. If you can say what's not clear, I'll try to explain a bit more.
     