Fitting points z = f(x,y) to a quadratic surface

  • #1
Hi! I am aware that standard numerical fitting methods like Levenberg-Marquardt and Gauss-Newton, among others, can fit a dataset z = f(x,y) to a quadratic surface of the form z = Ax^2 + Bxy + Cy^2 + Dx + Ey + F, where A to F are the coefficients.

Is there a simpler method? I'm trying to find something similar to fitting the same z = f(x,y) dataset to a plane z = A + Bx + Cy, where it only involves solving matrices.
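For the plane, what I mean is a single matrix solve along these lines (a minimal sketch in Python, assuming NumPy; the data arrays are just placeholders):

```python
import numpy as np

# Placeholder data; in practice these are the measured (x_i, y_i, z_i).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 1.0, 0.5, 2.0, 1.5])
z = 2.0 + 0.5 * x - 1.0 * y

# One column per unknown in z = A + Bx + Cy.
M = np.column_stack([np.ones_like(x), x, y])
A, B, C = np.linalg.lstsq(M, z, rcond=None)[0]
```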
 
  • #3
fresh_42 said:
Reads as a case of linear regression: https://en.wikipedia.org/wiki/Linear_regression
Hi, thanks for replying.

Actually, I am looking for a method to fit the second-degree surface polynomial z = f(x,y) to datasets, similar to the Vandermonde matrix method for fitting a simple y = f(x) polynomial.

The reference mentions the Gauss-Newton algorithm, which I am trying to avoid. Is there a method that can fit quadratic surfaces without the need for iterative algorithms?
 
  • #4
You want to fit something curved to your data, so you first have to find a measure of what counts as a good fit for you and what does not. The sum of squared distances comes to mind, but whichever measure you choose, you will have to perform an optimization, and algorithms for that are of course iterative; an algorithm is iterative by nature. What you are looking for is a one-step formula. Let us assume for a moment that we have such a formula. How can we determine whether our approximation is a good one or not? And if not, how can we proceed to a better one? Et voilà: the recursive algorithm is born.

If your data are concentrated around a certain point, then you could perhaps choose the tangent plane of your surface. But this depends on what counts as "close to that point".
 
  • #5
Polynomial regression for z = Ax^2 + Bxy + Cy^2 + Dx + Ey + F, in order to evaluate the coefficients A to F, is a LINEAR regression, because z is linear with respect to the unknowns A to F. Of course the function ##z(x,y)## is non-linear, but we are not solving for z or for x and y, since they are given data.
This is a LINEAR REGRESSION for a non-linear function, which is very common and doesn't require iterative calculation.
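For example, a minimal sketch of that single linear solve in Python (assuming NumPy; the sample data are only illustrative):

```python
import numpy as np

# Illustrative data; replace with the actual (x_i, y_i, z_i) measurements.
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 50)
y = rng.uniform(-2.0, 2.0, 50)
z = 1.5*x**2 - 0.5*x*y + 2.0*y**2 + 0.3*x - 1.0*y + 4.0 + rng.normal(0.0, 0.1, 50)

# Design matrix: one column per unknown coefficient A..F.
X = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])

# Ordinary least squares in z; no iteration, just one linear solve.
A, B, C, D, E, F = np.linalg.lstsq(X, z, rcond=None)[0]
```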

About fitting a quadratic surface to data without iterative methods: there is a lot of literature on the subject. For example, see
https://fr.scribd.com/doc/14819165/Regressions-coniques-quadriques-circulaire-spherique
 
  • #6
maistral said:
Hi! I am aware that standard numerical fitting methods like Levenberg-Marquardt and Gauss-Newton, among others, can fit a dataset z = f(x,y) to a quadratic surface of the form z = Ax^2 + Bxy + Cy^2 + Dx + Ey + F, where A to F are the coefficients.

To speak of "fitting" a function to a data set does not describe a specific mathematical process. Statistically, a person who speaks of "fitting" z = f(x,y) = Ax^2 + Bxy + Cy^2 + Dx + Ey + F to a dataset is most often seeking the function ##f## that minimizes
##g(A,B,C,...) = \sum_{i=1}^n (z_i - (Ax_i^2 + Bx_iy_i + Cy_i^2 + Dx_i + Ey_i + F)\ )^2##

Is that the type of fitting you wish to do?

As @JJacquelin said, finding the extrema of ##g## can be done by solving a system of linear equations. For example, ##\frac{\partial g}{\partial B} = 0## is a linear equation in the variables ##A,B,C,D,E,F##. The linear equations can be solved by matrix methods.
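Written out compactly (a sketch; here ##X## is the ##n\times 6## matrix whose ##i##-th row is ##(x_i^2,\ x_iy_i,\ y_i^2,\ x_i,\ y_i,\ 1)##, ##\mathbf c = (A,B,C,D,E,F)^T## and ##\mathbf z = (z_1,\dots,z_n)^T##), setting all six partial derivatives of ##g## to zero gives the normal equations
$$X^T X\,\mathbf c = X^T \mathbf z,$$
which is a ##6\times 6## linear system you solve by matrix methods.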

If you have experimental data ## (x_i,y_i,z_i) ## that has measurement errors in ##z_i## as well as in ##x_i,y_i##, you might be speaking of "Total least squares" fitting. I don't know much about that topic. (The current Wikipedia article https://en.wikipedia.org/wiki/Total_least_squares looks specific and comprehensive, but I haven't tried to figure it out.)
 
  • #7
maistral said:
Actually I am looking for the method where I can fit the second-degree surface polynomial z= f(x,y) to datasets similar to the Vandermonde matrix method for fitting a simple y = f(x) polynomial.
As stated previously, you are looking for a linear regression, specifically the kind known as polynomial regression. See here: https://en.wikipedia.org/wiki/Polynomial_regression (I wrote part of that article). It gives a matrix solution for the least-squares fit of any data set of n values to a polynomial of any order less than n. The lowest-order polynomial (fitting to a straight line) is the simplest case of this solution.
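As a sketch of how the Vandermonde idea carries over to two variables (Python, assuming NumPy; the helper name is just for illustration):

```python
import numpy as np

def surface_design_matrix(x, y, degree):
    """Columns x**i * y**j for all i + j <= degree: a 2-D analogue of the
    Vandermonde matrix used in 1-D polynomial regression."""
    cols = [x**i * y**j for i in range(degree + 1)
                        for j in range(degree + 1 - i)]
    return np.column_stack(cols)

# For the quadratic surface, degree=2 gives 6 columns, and the coefficients
# again come from a single least-squares solve:
# coeffs = np.linalg.lstsq(surface_design_matrix(x, y, 2), z, rcond=None)[0]
```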
 
