
3D Least Squares Fit

  1. Mar 4, 2010 #1

    I am trying to write an algorithm to calculate the least squares fit line of a 3D data set. After doing some research on Google, I came across this document, http://www.udel.edu/HNES/HESC427/Sphere%20Fitting/LeastSquares.pdf [Broken] (section 2, page 8), that explains the algorithm.
    It uses something from linear algebra I have never seen, called singular value decomposition (SVD), to find the direction cosines of the line of best fit. What is SVD? And what is a direction cosine? The cosine of the angle between the line and each of the x, y, z axes?

    For simplicity's sake, I'm starting with the points (0.5, 1, 2) ; (1, 2, 6) ; (2, 4, 7). So the A matrix, as denoted by the document is (skipping the mean and subtractions)
    [tex]A = \left[ \begin{array}{ccc}
    -1.6667 & -1.1667 & -2.8333 \\
    -2.0000 & -1.0000 & 3.0000 \\
    -2.3333 & -0.3333 & 2.6667 \end{array} \right][/tex]

    and the singular values of A are
    [tex]\left[ \begin{array}{c}
    6.1816 \\
    0.7884 \\
    0.0000 \end{array} \right][/tex]
    but the document says "This matrix A is solved by singular value decomposition. The smallest singular value of A is selected from the matrix and the corresponding singular vector is chosen, which gives the direction cosines (a, b, c)." What does that mean?

    Any help will be greatly appreciated. Note: I am working in MATLAB R2009a.

    Thank you in advance!
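    For concreteness, here is a NumPy sketch of what I think the recipe is (MATLAB's mean and svd do the same job). One assumption I'm making, not taken from the paper: the line's direction comes from the singular vector paired with the largest singular value of the centered data, since the smallest one gives a plane's normal instead.

    ```python
    import numpy as np

    # Sketch of the line fit, using the three points from the post.
    points = np.array([[0.5, 1.0, 2.0],
                       [1.0, 2.0, 6.0],
                       [2.0, 4.0, 7.0]])

    centroid = points.mean(axis=0)   # the best-fit line passes through this
    A = points - centroid            # mean-centered data matrix

    # The right singular vectors of A (rows of Vt) are unit vectors; the one
    # paired with the LARGEST singular value points along the direction of
    # greatest spread. Its components are the direction cosines (a, b, c):
    # the cosines of the angles the line makes with the x, y, and z axes.
    U, s, Vt = np.linalg.svd(A)
    direction = Vt[0]

    print("centroid =", centroid)
    print("direction cosines =", direction)
    ```
    
    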
  3. Mar 5, 2010 #2



    The document in the link explains how to use SVD. It's just a method to factor a matrix A into a product of three matrices, A = USV^T, where U and V are orthogonal matrices and S is diagonal. It is useful for solving the least squares normal equation A^TAx = A^Tb for x while avoiding the matrix multiplications A^TA and A^Tb. The steps are listed in the document. In the end, you can solve for x as follows:

    x = VS^-1U^Tb
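    A minimal NumPy sketch of that formula (the data here is made up for illustration; MATLAB's svd works the same way):

    ```python
    import numpy as np

    # Solve min ||Ax - b|| via the thin SVD, without ever forming A^T A.
    # A and b are made-up example data.
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    U, s, Vt = np.linalg.svd(A, full_matrices=False)  # A = U S V^T
    x = Vt.T @ ((U.T @ b) / s)                        # x = V S^-1 U^T b

    print(x)
    ```
    
    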

    QR is another matrix factorization that also allows least squares problems to be readily solved. In this case, A = QR, where Q is orthogonal and R is upper triangular. The solution to the least squares problem then becomes:

    Rx = Q^Tb

    which is easy to solve for x because R is triangular. The following link explains the use of both SVD and QR for solving least squares problems:
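    A matching sketch of the QR route, on the same made-up data (in MATLAB, qr and backslash do the equivalent):

    ```python
    import numpy as np

    # Same least squares problem solved via QR: factor A = QR, then
    # back-substitute R x = Q^T b. A and b are made-up example data.
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    Q, R = np.linalg.qr(A)            # thin QR: Q has orthonormal columns, R is upper triangular
    x = np.linalg.solve(R, Q.T @ b)   # back-substitution on R x = Q^T b

    print(x)
    ```
    
    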


    Besides SVD and QR, you can also use standard Gauss elimination, LU decomposition (based on Gauss elimination), or Cholesky decomposition (because A^TA is symmetric positive definite).
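    For completeness, a sketch of the Cholesky route on the normal equations (same made-up data as above). The caveat: forming A^TA squares the condition number, so QR or SVD is preferred when A is ill-conditioned.

    ```python
    import numpy as np

    # Normal-equations route: A^T A is symmetric positive definite when A
    # has full column rank, so Cholesky (A^T A = L L^T) applies.
    # A and b are made-up example data.
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    L = np.linalg.cholesky(A.T @ A)   # lower-triangular factor L
    y = np.linalg.solve(L, A.T @ b)   # forward solve  L y   = A^T b
    x = np.linalg.solve(L.T, y)       # back solve     L^T x = y

    print(x)
    ```
    
    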

  4. Jun 14, 2010 #3

    In the paper referenced below, the exact analytical solution is developed for two cases:
    - least squares fitting of a straight line in 3D (orthogonal distances between each point and the line)
    - least squares fitting of a plane in 3D (orthogonal distances between each point and the plane)
    The method isn't iterative (the definitive result is obtained in a single run of computation).
    A compendium of formulas is provided for practical use on page 7 (fitting a straight line) and page 18 (fitting a plane).
    Numerical examples are provided for tests.
    Link to the document :
    Then select "Regression & trajectoires 3d."