Plane Fitting with Linear Least Squares

In summary: the cloud of points defines a plane that passes through the origin in [tex]\mathbf{R}^3[/tex]. One way to find it is to solve a homogeneous linear system, which turns out to be an eigenvalue/eigenvector problem. Another is to regress a function of the form [tex]f_i = a_1 x_i + a_2 y_i + a_3 z_i[/tex] (omitting the constant term forces the plane through the origin) and solve the resulting normal equations, for example with an LU decomposition, which does not require eigenvalues/eigenvectors.
  • #1
mnb96
Hello,
I am trying to figure out how to fit a plane passing through the origin in [tex]\mathbf{R}^3[/tex], given a cloud of N points.
The points are vectors of the form [tex](x_1^{(k)}, x_2^{(k)}, x_3^{(k)})[/tex], where k stands for the k-th point. I want to minimize the sum of squared point-to-plane distances.

What I came out with was the following:
- solve a 3x3 homogeneous linear system AX=0 in which the (i,j) element of A is:

[tex]\sum_{k=1}^N x_i^{(k)} x_j^{(k)}[/tex]

Now I have two questions:

1) I have read somewhere that this turns out to be an eigenvalues problem. Basically I need to find the eigen-values/vectors in order to solve the system. Why?

2) I found another method which instead builds a rectangular Nx4 matrix in which, in the first column there are all the x's of the points, in the second column all the y's, in the third all the z's, and in the fourth all 1's. Then they compute the SVD and extract (I think) the last column of the rightmost output matrix. Why does that work? What's the difference?
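
For concreteness, here is a minimal numpy sketch of the two approaches described in the question, applied to synthetic data. The point cloud, names, and noise level below are illustrative assumptions, not anything from this thread; for the through-origin case the data matrix has no column of ones.

[code]
import numpy as np

# Illustrative sketch (assumed setup, not from the thread): fit a plane through
# the origin to a synthetic cloud of N points using both methods above.

rng = np.random.default_rng(0)
N = 200
true_normal = np.array([1.0, 2.0, -3.0])
true_normal /= np.linalg.norm(true_normal)
# Two vectors spanning the plane orthogonal to true_normal.
basis = np.linalg.svd(true_normal.reshape(1, 3))[2][1:]
# Points lying in the plane, plus a little noise.
P = rng.normal(size=(N, 2)) @ basis + 0.01 * rng.normal(size=(N, 3))

# Method 1: build the 3x3 matrix A with A_ij = sum_k x_i^(k) x_j^(k).
# The plane normal is the eigenvector of A belonging to the smallest eigenvalue.
A = P.T @ P
eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
normal_eig = eigvecs[:, 0]

# Method 2: SVD of the N x 3 data matrix (no column of ones, since the plane
# passes through the origin); the last right-singular vector is the normal.
_, _, Vt = np.linalg.svd(P, full_matrices=False)
normal_svd = Vt[-1]

print(normal_eig, normal_svd)          # the same plane normal, up to sign
[/code]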
 
  • #2
It sounds like you are describing a multiple linear regression problem.

[tex]f_i = a_1 x_i + a_2 y_i + a_3 z_i[/tex]

where x_i, y_i, and z_i are the coordinates of the points and f_i are the observed function values.

To solve this, you can create a matrix Z with the following values:

[tex]Z = \begin{pmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ \vdots & \vdots & \vdots \\ x_N & y_N & z_N \end{pmatrix}[/tex]

You can also create a vector Y with the following values:

[tex]Y = \begin{pmatrix} f_1 \\ f_2 \\ \vdots \\ f_N \end{pmatrix}[/tex]

After creating the Z matrix and Y column vector, the regressed A vector can be found by solving:

[tex]Z^T Z A = Z^T Y[/tex]

I have a longer post on this, with a derivation of that equation, and source code that I released to the public domain at http://www.trentfguidry.net/post/2009/07/19/Linear-and-Multiple-Linear-Regression-in-C.aspx
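
A minimal numpy sketch of that normal-equation solve is shown below. It is illustrative only (it is not the code from the linked post), and the example data and coefficients are made up for demonstration.

[code]
import numpy as np

# Illustrative sketch: solve Z^T Z A = Z^T Y for the coefficient vector A.

def regress(Z, Y):
    """Return the coefficient vector A that solves the normal equations."""
    return np.linalg.solve(Z.T @ Z, Z.T @ Y)

# Assumed example: f_i = 2*x_i - y_i + 0.5*z_i plus a little noise.
rng = np.random.default_rng(1)
Z = rng.normal(size=(100, 3))                     # columns: x_i, y_i, z_i
Y = Z @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.normal(size=100)

print(regress(Z, Y))                              # approximately [ 2, -1, 0.5 ]
[/code]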
 
  • #3
As for the eigenvalues/eigenvectors: I believe you need those for the SVD, which is one way of solving the above matrix equation.

Another way of solving the above equation is an LU decomposition. The potential problem with an LU decomposition is that the matrix could turn out to be singular. If that happens, the problem cannot be solved with an LU decomposition and something else, like the SVD, has to be used.

Fortunately, there are many problems where singular matrices are a rare occurrence, and the simpler and faster LU decomposition works. An LU decomposition does not require eigenvalues/eigenvectors.

For part 2, you describe a Z matrix with a column of ones in it. That column introduces a constant term, which allows the fitted plane to be offset from the origin. In that case you are regressing the function shown below.

[tex]f_i = a_0 + a_1 x_i + a_2 y_i + a_3 z_i[/tex]

If you want to force the plane through the origin, get rid of the column of ones.
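
As a rough illustration of that last point, here is a short numpy sketch (the data and names are made up for this example) comparing the regression with and without the column of ones.

[code]
import numpy as np

# Illustrative sketch (made-up data): the same regression with and without the
# column of ones; dropping it forces the fitted plane through the origin.

rng = np.random.default_rng(2)
x, y, z = rng.normal(size=(3, 50))
f = 1.5 * x - 0.7 * y + 0.2 * z + 0.02 * rng.normal(size=50)   # no constant term

# With the ones column: f_i = a0 + a1*x_i + a2*y_i + a3*z_i
Z_offset = np.column_stack([np.ones_like(x), x, y, z])
a_offset, *_ = np.linalg.lstsq(Z_offset, f, rcond=None)

# Without the ones column: f_i = a1*x_i + a2*y_i + a3*z_i  (through the origin)
Z_origin = np.column_stack([x, y, z])
a_origin, *_ = np.linalg.lstsq(Z_origin, f, rcond=None)

print(a_offset)    # a0 comes out near zero for this data
print(a_origin)    # close to [1.5, -0.7, 0.2]
[/code]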
 

1. What is "Plane Fitting with Linear Least Squares"?

"Plane Fitting with Linear Least Squares" is a mathematical method used to find the best-fit plane that describes a set of data points in three-dimensional space. It is a type of linear regression that minimizes the sum of squared errors between the data points and the plane, making it a popular tool for data analysis and modeling.

2. How does Linear Least Squares differ from other regression methods?

Linear Least Squares is used when the model is linear in its unknown coefficients. Ordinary least squares, as in post #2, minimizes the vertical distances (the residuals of the dependent variable) between the data points and the fitted plane. By contrast, the eigenvector/SVD approach from the original question is a total least squares fit, which minimizes the perpendicular point-to-plane distances.

3. What are the applications of Plane Fitting with Linear Least Squares?

Plane Fitting with Linear Least Squares has a wide range of applications in fields such as physics, engineering, and data analysis. It can be used to model and analyze data in three-dimensional space, such as predicting future values or identifying trends in the data.

4. How does the algorithm for Linear Least Squares work?

The algorithm for Linear Least Squares finds the coefficients of the plane equation that minimize the sum of squared errors between the data points and the plane. This is done by solving a system of linear equations (the normal equations) using matrix operations, which can be done efficiently with standard numerical routines.
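
For example, a short Python sketch might look like the following. The parameterization z = a*x + b*y + c (treating z as the dependent variable) and the sample points are assumptions made for this illustration only.

[code]
import numpy as np

# Illustrative sketch: least-squares plane z = a*x + b*y + c through a point cloud.
# The parameterization and sample points are assumptions for this example only.

def fit_plane(points):
    """Return (a, b, c) minimizing the sum of squared vertical errors in z."""
    X = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(X, points[:, 2], rcond=None)
    return coeffs

pts = np.array([[0.0, 0.0, 1.0],
                [1.0, 0.0, 3.0],
                [0.0, 1.0, 0.0],
                [1.0, 1.0, 2.1]])
print(fit_plane(pts))   # roughly a = 2, b = -1, c = 1
[/code]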

5. What are the limitations of Plane Fitting with Linear Least Squares?

Plane Fitting with Linear Least Squares assumes that the errors are normally distributed and that the relationship between the variables is linear. This means that it may not be suitable for data sets with non-linear relationships or with outliers. Additionally, ordinary least squares does not account for errors in the independent variables, which can affect the accuracy of the results.
