
Plane Fitting with Linear Least Squares

  1. Sep 30, 2009 #1
    Hello,
    I am trying to figure out how to fit a plane passing through the origin in [tex]\mathbf{R}^3[/tex], given a cloud of N points.
    The points are vectors of the form [tex](x_1^{(k)}, x_2^{(k)}, x_3^{(k)})[/tex], where k stands for the k-th point. I want to minimize the sum of squared point-to-plane distances.

    What I came up with was the following:
    - solve a 3x3 homogeneous linear system AX=0 in which the (i,j) element of A is:

    [tex]\sum_{k=1}^N x_i^{(k)} x_j^{(k)}[/tex]

    Now I have two questions:

    1) I have read somewhere that this turns out to be an eigenvalue problem: basically I need to find the eigenvalues/eigenvectors of A in order to solve the system. Why?

    2) I found another method which instead builds a rectangular Nx4 matrix whose first column holds all the x's of the points, the second column all the y's, the third all the z's, and the fourth all 1's. Then they compute the SVD and extract (I think) the last column of the rightmost output matrix. Why does that work? What's the difference?
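
    For concreteness, here is a small NumPy sketch of the first method as I understand it (the data and variable names are my own illustration, not from any reference): build the 3x3 matrix A of sums of coordinate products, then take the eigenvector with the smallest eigenvalue as the plane normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cloud of N points near a plane through the origin
# with (hypothetical) unit normal n_true.
n_true = np.array([1.0, 2.0, -1.0])
n_true /= np.linalg.norm(n_true)
P = rng.normal(size=(200, 3))
P -= np.outer(P @ n_true, n_true)      # project points onto the plane
P += 0.01 * rng.normal(size=P.shape)   # add small noise

# A[i, j] = sum over k of x_i^(k) * x_j^(k)  -- the 3x3 scatter matrix.
A = P.T @ P

# The best-fit normal is the eigenvector of A with the smallest eigenvalue
# (eigh returns eigenvalues in ascending order).
w, V = np.linalg.eigh(A)
n_est = V[:, 0]

print(np.abs(n_est @ n_true))  # close to 1: recovered normal matches
```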
     
  3. Sep 30, 2009 #2
    It sounds like you are describing a multiple linear regression problem.

    [tex]f_i = a_1 x_i + a_2 y_i + a_3 z_i[/tex]

    where [tex]x_i[/tex], [tex]y_i[/tex], and [tex]z_i[/tex] are the point coordinates and [tex]f_i[/tex] are the observed function values.

    To solve this, you can create a matrix Z with the following values:

    [tex]Z = \begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_n & y_n & z_n \end{bmatrix}[/tex]

    You can also create a vector Y with the following values:

    [tex]Y = \begin{bmatrix} f_1 \\ f_2 \\ \vdots \\ f_n \end{bmatrix}[/tex]

    After creating the Z matrix and Y column vector, the regressed A vector can be found by solving:

    [tex]Z^T Z A = Z^T Y[/tex]

    I have a longer post on this, with a derivation of that equation, and source code that I released to the public domain at http://www.trentfguidry.net/post/2009/07/19/Linear-and-Multiple-Linear-Regression-in-C.aspx
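
    A minimal NumPy sketch of those normal equations (the synthetic data and coefficient values are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: f = 2x - y + 3z plus small noise (coefficients illustrative).
Z = rng.normal(size=(100, 3))            # rows are (x_i, y_i, z_i)
a_true = np.array([2.0, -1.0, 3.0])
Y = Z @ a_true + 0.01 * rng.normal(size=100)

# Solve the normal equations  Z^T Z A = Z^T Y  for the coefficient vector A.
A = np.linalg.solve(Z.T @ Z, Z.T @ Y)

print(A)  # close to [2, -1, 3]
```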
     
  4. Sep 30, 2009 #3
    (continued from #2)
    As for the eigenvalues/eigenvectors, I believe you need those for an SVD, which is one way of solving the above matrix equation.

    Another way to solve the above equation is an LU decomposition. The potential problem with an LU decomposition is that the matrix could be singular. If that occurs, the problem cannot be solved with an LU decomposition, and something else, like an SVD, has to be used.

    Fortunately, in many problems singular matrices are rare, and the simpler, faster LU decomposition works. An LU decomposition does not require eigenvalues/eigenvectors.
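
    A small NumPy illustration of that failure mode (the rank-deficient data is my own construction): numpy's `solve` uses an LU factorization internally and raises on a singular system, while `lstsq` uses an SVD and still returns the minimum-norm solution.

```python
import numpy as np

# Rank-deficient design: the third column duplicates the first,
# so Z^T Z is exactly singular and an LU-based solve fails.
Z = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [2.0, 1.0, 2.0]])
Y = np.array([1.0, 1.0, 2.0, 3.0])

try:
    A = np.linalg.solve(Z.T @ Z, Z.T @ Y)   # LU-based (LAPACK gesv)
except np.linalg.LinAlgError:
    print("LU-based solve failed: singular matrix")

# lstsq uses an SVD internally and copes with the rank deficiency.
A, residuals, rank, sv = np.linalg.lstsq(Z, Y, rcond=None)
print(rank)  # 2, not 3
```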

    For part 2, you describe a Z matrix with a column of ones in it. That column adds a constant term, which lets the plane be offset from the origin. In that case you are regressing the function shown below.

    [tex]f_i = a_0 + a_1 x_i + a_2 y_i + a_3 z_i[/tex]

    If you want to force the plane through the origin, get rid of the column of ones.
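
    To connect this back to question 1, here is a sketch (my own, under the same synthetic-data assumptions as before) showing that the two methods agree: the last right singular vector of the Nx3 point matrix is exactly the eigenvector of [tex]Z^T Z[/tex] with the smallest eigenvalue, and both give the normal of the best-fit plane through the origin.

```python
import numpy as np

rng = np.random.default_rng(2)

# Points near a plane through the origin with unit normal n.
n = np.array([1.0, -1.0, 2.0]) / np.sqrt(6.0)
P = rng.normal(size=(150, 3))
P -= np.outer(P @ n, n)
P += 0.01 * rng.normal(size=P.shape)

# SVD of the N x 3 point matrix (no column of ones -> plane through origin).
# The last row of Vt (last right singular vector) is the best-fit normal.
U, S, Vt = np.linalg.svd(P, full_matrices=False)
normal_svd = Vt[-1]

# Eigenvector of P^T P with the smallest eigenvalue: the same direction.
w, V = np.linalg.eigh(P.T @ P)
normal_eig = V[:, 0]

# The two normals agree up to an arbitrary overall sign.
print(min(np.linalg.norm(normal_svd - normal_eig),
          np.linalg.norm(normal_svd + normal_eig)))
```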
     