3D Least Squares Fit

  • Thread starter mjdiaz89
  • #1

Hello,

I am trying to write an algorithm to calculate the least squares fit line of a 3D data set. After doing some research on Google, I came across this document, http://www.udel.edu/HNES/HESC427/Sphere%20Fitting/LeastSquares.pdf [Broken] (section 2 on page 8), that explains the algorithm.
It uses something from linear algebra I have never seen, called singular value decomposition (SVD), to find the direction cosines of the line of best fit. What is SVD? And what is a direction cosine? Is it literally the cosine of the angle between the line and each of the x, y, z axes?

For simplicity's sake, I'm starting with the points (0.5, 1, 2); (1, 2, 6); (2, 4, 7). So the A matrix, as denoted by the document, is (skipping the intermediate mean-subtraction steps)
[tex]A = \left[ \begin{array}{ccc}
-1.6667 & -1.1667 & -2.8333 \\
-2.0000 & -1.0000 & 3.0000 \\
-2.3333 & -0.3333 & 2.6667 \end{array} \right][/tex]

and the singular values of A are
[tex]\left[ \begin{array}{c}
6.1816 \\
0.7884 \\
0.0000 \end{array} \right][/tex]
but the document says: "This matrix A is solved by singular value decomposition. The smallest singular value of A is selected from the matrix and the corresponding singular vector is chosen, which [gives] the direction cosines (a, b, c)." What does that mean?

Any help will greatly be appreciated. Note: I am working in MATLAB R2009a
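To make the question concrete, here is the computation as I currently understand it, sketched in Python/NumPy (I can port it to MATLAB once I understand the math). I am assuming the "direction cosines" (a, b, c) are just the components of the unit vector along the fitted line, i.e. the cosines of the angles the line makes with the x, y, z axes. Note also that in this standard mean-centered "principal axis" setup the line direction belongs to the largest singular value; the paper apparently sets its matrix up differently, since it picks the smallest.

```python
# Sketch of my understanding (Python/NumPy; MATLAB's svd behaves the same way).
import numpy as np

points = np.array([[0.5, 1.0, 2.0],
                   [1.0, 2.0, 6.0],
                   [2.0, 4.0, 7.0]])

centroid = points.mean(axis=0)   # the fitted line passes through this point
A = points - centroid            # mean-centered data matrix

# Full SVD: A = U @ diag(s) @ Vt, singular values s sorted in descending order
U, s, Vt = np.linalg.svd(A)

# In this mean-centered setup the best-fit LINE points along the right
# singular vector of the LARGEST singular value (the principal axis of the
# point cloud); the smallest one gives the normal of a best-fit PLANE instead.
direction = Vt[0]                # unit vector (a, b, c); the sign is arbitrary
```

With only three points the smallest singular value comes out exactly zero, since three centered points always lie in a plane through the origin.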

Thank you in advance!

Answers and Replies

  • #2
hotvette
Homework Helper
The document in the link explains how to use SVD. It's just a method to factor a matrix A into a product of three matrices A = USV^T, where U and V are orthogonal matrices and S is diagonal. It is useful in solving the least squares normal equation A^TAx = A^Tb for x by avoiding the explicit matrix multiplications A^TA and A^Tb. The steps are listed in the document. In the end, you can solve for x as follows:

x = VS^-1U^Tb
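As a quick numerical sanity check, here is that formula applied to a small overdetermined system (a NumPy sketch with made-up data, chosen just for illustration; MATLAB's svd works the same way), compared against the built-in least-squares solver:

```python
import numpy as np

# Overdetermined system: fit y = c0 + c1*t to four points (made-up data)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.1, 2.9, 4.2])

# Thin SVD: A = U @ diag(s) @ Vt, with U of shape 4x2 and s of length 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V S^-1 U^T b, without ever forming A^T A explicitly
x = Vt.T @ ((U.T @ b) / s)

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)   # reference solution
```

Dividing by s assumes A has full column rank; in near-rank-deficient problems the tiny singular values should be truncated instead.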

QR is another matrix factorization that also allows least squares problems to be readily solved. In this case, A = QR, where Q is orthogonal and R is upper triangular. The solution to the least squares problem then becomes:

Rx = Q^Tb

which is easy to solve for x because R is triangular. The following link explains the use of both SVD and QR for solving least squares problems:

http://en.wikipedia.org/wiki/Linear_least_squares
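For comparison, the same small least squares problem solved via QR, following Rx = Q^Tb (again just a NumPy sketch with made-up data):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.1, 2.9, 4.2])

Q, R = np.linalg.qr(A)            # reduced QR: Q is 4x2 orthonormal, R is 2x2 upper triangular
x = np.linalg.solve(R, Q.T @ b)   # R x = Q^T b; R is triangular, so this solve is cheap
```

It gives the same x as the SVD route, since both solve the identical least squares problem.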

Besides SVD and QR, you can also use standard Gauss elimination, LU decomposition (based on Gauss elimination), or Cholesky decomposition (because A^TA is symmetric positive definite when A has full column rank).

http://en.wikipedia.org/wiki/Matrix_decomposition
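And the Cholesky route on the normal equations, for the same made-up data (a sketch; note that forming A^TA squares the condition number, which is one reason QR and SVD are usually preferred numerically):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.1, 2.9, 4.2])

# Normal equations: (A^T A) x = A^T b.  A^T A is symmetric positive definite
# when A has full column rank, so the Cholesky factorization A^T A = L L^T applies.
L = np.linalg.cholesky(A.T @ A)
y = np.linalg.solve(L, A.T @ b)   # forward solve: L y = A^T b
x = np.linalg.solve(L.T, y)       # back solve:    L^T x = y
```

(np.linalg.solve is used here for brevity; a dedicated triangular solver would exploit the structure of L.)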
 
  • #3
Hello,

in the paper referenced below, the exact analytical solution is developed for two cases:
- least squares fitting of a straight line in 3D (orthogonal distances between each point and the line)
- least squares fitting of a plane in 3D (orthogonal distances between each point and the plane)
The method is not iterative (the definitive result is obtained directly in a single computation).
A compendium of formulas is provided for practical use on page 7 (fitting a straight line) and page 18 (fitting a plane).
Numerical examples are provided for testing.
Link to the document :
http://www.scribd.com/people/documents/10794575-jjacquelin
Then select "Regression & trajectoires 3d."
 
