Calculating 3D Least Squares Fit with SVD in MATLAB


Discussion Overview

The discussion revolves around calculating the Least Squares Fit Line of a 3D data set using Singular Value Decomposition (SVD) in MATLAB. Participants explore the theoretical underpinnings of SVD, its application in least squares fitting, and alternative methods for solving such problems.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Exploratory

Main Points Raised

  • One participant seeks clarification on SVD and direction cosines in the context of least squares fitting for a 3D dataset.
  • Another participant explains SVD as a method to factor a matrix into orthogonal matrices and a diagonal matrix, emphasizing its utility in solving least squares problems without matrix multiplications.
  • Alternative methods such as QR factorization, Gauss Elimination, LU decomposition, and Cholesky decomposition are mentioned as viable approaches for least squares fitting.
  • A third participant references a paper that provides analytical solutions for least squares fitting to both a straight line and a plane in 3D, noting that the method yields definitive results in a single computation.
  • A later reply provides an updated link to the referenced document for further exploration of the topic.

Areas of Agreement / Disagreement

Participants express varying levels of understanding regarding SVD and its application, with some providing explanations and resources while others seek clarification. There is no consensus on the best method for least squares fitting, as multiple approaches are discussed.

Contextual Notes

Some participants reference specific documents and links that contain detailed methodologies and examples, but the discussion does not resolve the complexities of the mathematical steps involved in SVD or the alternative methods mentioned.

mjdiaz89
Hello,

I am trying to write an algorithm to calculate the least squares fit line of a 3D data set. After doing some research on Google, I came across this document, http://www.udel.edu/HNES/HESC427/Sphere%20Fitting/LeastSquares.pdf (section 2, page 8), which explains an algorithm for the problem.
It uses something from linear algebra I have never seen called Singular Value Decomposition (SVD) to find the direction cosines of the line of best fit. What is SVD? What is a direction cosine? Is it literally the angle between each of the x, y, z axes and the line?

For simplicity's sake, I'm starting with the points (0.5, 1, 2); (1, 2, 6); (2, 4, 7). So the A matrix, as denoted by the document, is (skipping the mean-subtraction steps shown there)
A = \left[ \begin{array}{ccc} -1.6667 &amp; -1.1667 &amp; -2.8333 \\ -2.0000 &amp; -1.0000 &amp; 3.0000 \\ -2.3333 &amp; -0.3333 &amp; 2.6667 \end{array} \right]

and the singular values of A (the single-output form of MATLAB's svd) are
\mathrm{svd}(A) = \left[ \begin{array}{c} 6.1816 \\ 0.7884 \\ 0.0000 \end{array} \right]
but the document says: "This matrix A is solved by singular value decomposition. The smallest singular value of A is selected from the matrix and the corresponding singular vector is chosen, which [gives] the direction cosines (a, b, c)." What does that mean?

Any help will greatly be appreciated. Note: I am working in MATLAB R2009a

Thank you in advance!
 
The document in the link explains how to use SVD. It's just a method to factor a matrix A into a product of three matrices, A = USV^T, where U and V are orthogonal matrices and S is diagonal. It is useful for solving the least squares normal equations A^T A x = A^T b for x while avoiding the explicit matrix products A^T A and A^T b. The steps are listed in the document. In the end, you can solve for x as follows:

x = V S^{-1} U^T b
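As a minimal MATLAB sketch of that SVD route (the matrix and right-hand side here are illustrative, not taken from the original post):

```matlab
% Solve the overdetermined system A*x ~ b in the least squares sense via SVD.
A = [1 0.5; 1 1.0; 1 2.0];    % illustrative 3x2 design matrix
b = [2; 6; 7];                % illustrative right-hand side
[U, S, V] = svd(A, 'econ');   % economy-size SVD: A = U*S*V'
x = V * (S \ (U' * b));       % x = V * inv(S) * U' * b
% x minimizes norm(A*x - b); compare with MATLAB's backslash: A \ b
```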

QR is another matrix factorization that also allows least squares problems to be readily solved. In this case, A = QR, where Q is orthogonal and R is upper triangular. The solution to the least squares problem then becomes:

R x = Q^T b
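A corresponding MATLAB sketch (the same illustrative A and b as before, restated so the snippet stands alone):

```matlab
% Solve the same least squares problem via economy-size QR factorization.
A = [1 0.5; 1 1.0; 1 2.0];    % illustrative design matrix
b = [2; 6; 7];                % illustrative right-hand side
[Q, R] = qr(A, 0);            % economy-size QR: A = Q*R, R upper triangular
x = R \ (Q' * b);             % triangular back-substitution for x
```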

which is easy to solve for x because R is triangular. The following link explains the use of both SVD and QR for solving least squares problems:

http://en.wikipedia.org/wiki/Linear_least_squares

Besides SVD and QR, you can also use standard Gauss elimination, LU decomposition (based on Gauss elimination), or Cholesky decomposition (because A^T A is symmetric, and positive definite when A has full column rank).
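A sketch of the normal-equations route with Cholesky in MATLAB (fine for well-conditioned problems, though it squares the condition number, which is why SVD and QR are usually preferred):

```matlab
% Solve the normal equations A'*A*x = A'*b via Cholesky factorization.
A = [1 0.5; 1 1.0; 1 2.0];    % illustrative design matrix
b = [2; 6; 7];                % illustrative right-hand side
R = chol(A' * A);             % A'*A = R'*R, with R upper triangular
x = R \ (R' \ (A' * b));      % two triangular solves
```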

http://en.wikipedia.org/wiki/Matrix_decomposition
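Tying this back to the original question, here is a sketch (under the usual orthogonal-distance formulation) of fitting a 3D line in MATLAB: center the points, take the SVD of the centered data, and read the direction cosines off a right singular vector. Note that with the plain centered data matrix used here, the line direction corresponds to the largest singular value; the linked document selects the smallest under its own formulation, so check which convention its matrix A follows.

```matlab
% Orthogonal least squares line through 3D points (one point per row).
P = [0.5 1 2; 1 2 6; 2 4 7];         % the points from the original post
c = mean(P, 1);                      % centroid: a point on the fitted line
A = P - repmat(c, size(P, 1), 1);    % centered data
[U, S, V] = svd(A, 0);               % economy-size SVD of the centered data
d = V(:, 1);                         % unit direction vector of the line;
                                     % its components are the direction cosines
% (for a best-fit plane instead, the normal would be V(:, end))
```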
 
Hello,

in the paper referenced below, the exact analytical solution is developed for two cases:
- least squares fitting of a straight line in 3D (minimizing the orthogonal distances between each point and the line);
- least squares fitting of a plane in 3D (minimizing the orthogonal distances between each point and the plane).
The method is not iterative: the final result is obtained directly in a single run of computation.
A compendium of formulas is provided for practical use on page 7 (fitting a straight line) and page 18 (fitting a plane).
Numerical examples are provided for tests.
Link to the document :
http://www.scribd.com/people/documents/10794575-jjacquelin
Then select "Regression & trajectoires 3d."
 
