# 3D Least Squares Fit

## Main Question or Discussion Point

Hello,

I am trying to write an algorithm to calculate the least squares fit line of a 3D data set. After doing some research on Google, I came across this document, http://www.udel.edu/HNES/HESC427/Sphere%20Fitting/LeastSquares.pdf (link now broken; section 2 on page 8), which explains the algorithm for fitting a line to 3D data.
It uses something from linear algebra I have never seen, called singular value decomposition (SVD), to find the direction cosines of the line of best fit. What is SVD? And what is a direction cosine — literally the cosine of the angle between the line and each of the x, y, z axes?

For simplicity's sake, I'm starting with the points (0.5, 1, 2); (1, 2, 6); (2, 4, 7). So the A matrix, as denoted by the document, is (skipping the mean-subtraction steps)
$$A = \left[ \begin{array}{ccc} -1.6667 & -1.1667 & -2.8333 \\ -2.0000 & -1.0000 & 3.0000 \\ -2.3333 & -0.3333 & 2.6667 \end{array} \right]$$

and the singular values of A (what MATLAB's single-output svd(A) returns) are
$$\operatorname{svd}(A) = \left[ \begin{array}{c} 6.1816 \\ 0.7884 \\ 0.0000 \end{array} \right]$$
but the document says: "This matrix A is solved by singular value decomposition. The smallest singular value of A is selected from the matrix and the corresponding singular vector is chosen, which [gives] the direction cosines (a, b, c)." What does that mean?

Any help will greatly be appreciated. Note: I am working in MATLAB R2009a
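In case it helps later readers, here is a sketch of the usual orthogonal-distance line fit, written in Python/NumPy rather than MATLAB (the MATLAB calls `mean` and `svd` are analogous). In the common formulation where the rows of A are the centered points, the line direction is the right singular vector belonging to the *largest* singular value, and the components of that unit vector are the direction cosines, i.e. the cosines of the angles the line makes with the x, y, z axes. Papers that build A differently may instead select the smallest singular value, as the quoted document does.

```python
import numpy as np

# The three sample points from the question, one per row.
P = np.array([[0.5, 1.0, 2.0],
              [1.0, 2.0, 6.0],
              [2.0, 4.0, 7.0]])

centroid = P.mean(axis=0)   # a point on the best-fit line
A = P - centroid            # centered data: subtract each column's mean

# SVD of the centered data; rows of Vt are the right singular vectors,
# ordered by decreasing singular value.
U, s, Vt = np.linalg.svd(A)

direction = Vt[0]           # right singular vector of the LARGEST singular value
# direction is a unit vector, so its components are the direction cosines:
# direction[0] = cos(angle with x-axis), and so on.

print("centroid:", centroid)
print("direction cosines:", direction)
print("angles (deg):", np.degrees(np.arccos(direction)))
```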


hotvette (Homework Helper)
The document in the link explains how to use SVD. It's just a method to factor a matrix A into a product of three matrices, A = USV^T, where U and V are orthogonal matrices and S is diagonal. It is useful in solving the least squares normal equations A^TAx = A^Tb for x while avoiding the explicit matrix products A^TA and A^Tb. The steps are listed in the document. In the end, you can solve for x as follows:

x = VS^{-1}U^Tb
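As an illustration — a NumPy sketch rather than MATLAB, but the economy-size `svd(A,0)` in MATLAB plays the same role as the thin SVD here — solving an overdetermined least squares problem this way looks like:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # overdetermined: 6 equations, 3 unknowns
b = rng.standard_normal(6)

# Thin SVD: A = U @ diag(s) @ Vt, with U 6x3, s length 3, Vt 3x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V S^{-1} U^T b, without ever forming A^T A.
x = Vt.T @ ((U.T @ b) / s)

# Same answer as the library least-squares routine:
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
```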

QR is another matrix factorization that also allows least squares problems to be readily solved. In this case, A = QR, where Q is orthogonal and R is upper triangular. The solution to the least squares problem then becomes:

Rx = Q^Tb
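A matching sketch of the QR route, under the same assumptions (NumPy's reduced QR; MATLAB's economy-size `qr(A,0)` is the analogue):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)   # reduced QR: Q is 6x3 orthonormal, R is 3x3 upper triangular

# Solve R x = Q^T b. Since R is triangular this is just back substitution
# (np.linalg.solve works; a dedicated triangular solver would exploit the structure).
x = np.linalg.solve(R, Q.T @ b)

x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
```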

which is easy to solve for x because R is triangular. The following link explains the use of both SVD and QR for solving least squares problems:

http://en.wikipedia.org/wiki/Linear_least_squares

Besides SVD and QR, you can also solve the normal equations with standard Gaussian elimination, LU decomposition (based on Gaussian elimination), or Cholesky decomposition (because A^TA is symmetric, and positive definite when A has full column rank).
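For completeness, a sketch of the Cholesky route on the normal equations. One caveat: forming A^TA squares the condition number of the problem, and the factorization requires A to have full column rank so that A^TA is positive definite.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# Normal equations: (A^T A) x = A^T b. For full-column-rank A, A^T A is
# symmetric positive definite, so it admits a Cholesky factorization L L^T.
L = np.linalg.cholesky(A.T @ A)

# Two triangular solves: L y = A^T b, then L^T x = y.
y = np.linalg.solve(L, A.T @ b)
x = np.linalg.solve(L.T, y)

x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
```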

http://en.wikipedia.org/wiki/Matrix_decomposition

Hello,

in the paper referenced below, the exact analytical solution is developed for two cases:
- least squares fitting of a straight line in 3D (minimizing the orthogonal distances between each point and the line)
- least squares fitting of a plane in 3D (minimizing the orthogonal distances between each point and the plane)

The method isn't iterative (the final result is obtained directly in a single computation). A compendium of formulas is provided for practical use on page 7 (fitting a straight line) and page 18 (fitting a plane). Numerical examples are provided for tests.
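Since the paper itself isn't linked in the post, here is a sketch of the standard SVD route for the plane case, which minimizes the same orthogonal distances: the right singular vector belonging to the smallest singular value of the centered data is the plane normal, and the plane passes through the centroid. (The test plane below is an arbitrary choice for illustration.)

```python
import numpy as np

# Sample points lying on the plane z = 2x + 3y + 1 (an arbitrary test plane).
rng = np.random.default_rng(3)
xy = rng.standard_normal((20, 2))
z = 2 * xy[:, 0] + 3 * xy[:, 1] + 1
P = np.column_stack([xy, z])

centroid = P.mean(axis=0)
U, s, Vt = np.linalg.svd(P - centroid)

normal = Vt[-1]   # right singular vector of the SMALLEST singular value
# Best-fit plane: normal . (p - centroid) = 0; orthogonal residuals are minimized.
```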