Fitting points z = f(x,y) to a quadratic surface


Discussion Overview

The discussion revolves around methods for fitting a dataset of the form z = f(x,y) to a quadratic surface represented by the equation z = Ax² + Bxy + Cy² + Dx + Ey + F. Participants explore the possibility of simpler, non-iterative methods for this fitting process, contrasting it with standard numerical methods like Levenberg-Marquardt and Gauss-Newton.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants suggest that fitting a quadratic surface can be approached similarly to linear regression, where the coefficients A to F can be evaluated without iterative methods.
  • Others argue that fitting inherently involves optimization, which typically requires iterative algorithms, and question how one would assess the quality of a fit without such methods.
  • A participant mentions that polynomial regression can be treated as linear regression in terms of the coefficients, despite the non-linear nature of the function itself.
  • There is a reference to literature on the subject of fitting quadratic surfaces, indicating that various methods exist, although specifics are not detailed.
  • Some participants clarify that the fitting process involves minimizing a specific function related to the residuals of the data points, which can be approached through solving linear equations.
  • One participant highlights the concept of "Total least squares" fitting, which may be relevant when considering measurement errors in the dataset.

Areas of Agreement / Disagreement

Participants express differing views on the feasibility of non-iterative methods for fitting quadratic surfaces, with some asserting that it is possible while others maintain that optimization typically requires iterative approaches. The discussion remains unresolved regarding the existence of a one-step fitting formula.

Contextual Notes

Participants note that the fitting process may depend on the specific definitions and assumptions made about the data and the fitting criteria, which are not fully explored in the discussion.

maistral
Hi! I am aware that standard fitting numerical methods like Levenberg-Marquardt, Gauss-Newton, among others, are able to fit a dataset z = f(x,y) to a quadratic surface of the form z = Ax² + Bxy + Cy² + Dx + Ey + F, where A to F are the coefficients.

Is there a simpler method that exists? I'm trying to find something similar to fitting the same z = f(x,y) dataset to a linear plane z = A + Bx + Cy where it only involves solving matrices.
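For reference, the plane fit described in the question really does reduce to one matrix solve. A minimal NumPy sketch (the data here are synthetic, invented for illustration):

```python
import numpy as np

# Synthetic data scattered around the plane z = 1 + 2x + 3y
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
y = rng.uniform(-1, 1, 50)
z = 1 + 2 * x + 3 * y + rng.normal(0, 0.01, 50)

# Design matrix with columns [1, x, y]; one linear solve, no iteration
M = np.column_stack([np.ones_like(x), x, y])
A, B, C = np.linalg.lstsq(M, z, rcond=None)[0]
print(A, B, C)  # close to 1, 2, 3
```

The replies below show that the quadratic surface admits exactly the same treatment, with more columns in the design matrix.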
 
fresh_42 said:
Reads as a case of linear regression: https://en.wikipedia.org/wiki/Linear_regression
Hi, thanks for replying.

Actually I am looking for a method where I can fit the second-degree surface polynomial z = f(x,y) to datasets, similar to the Vandermonde matrix method for fitting a simple y = f(x) polynomial.

The reference stated the Gauss-Newton algorithm, which I am trying to avoid. Is there such a method that can fit quadratic surfaces without the need for iterative algorithms?
 
You want to fit something curved to a set of data points, so you first have to find a measure for what is a good fit for you and what is not. The sum of squared distances comes to mind, but whichever measure you choose, you will have to perform an optimization, and algorithms for this are of course iterative; that is their nature. What you are looking for is a one-step formula. Let us assume for a moment that we have such a formula. How can we determine whether our approximation is a good one or not? And if not, how can we proceed to a better one? Et voilà: the recursive algorithm is born.

If your data are concentrated near a certain point, then you could perhaps choose the tangent plane of your surface. But this depends on what "close to a point" means.
 
Polynomial regression for z = Ax² + Bxy + Cy² + Dx + Ey + F, in order to evaluate the coefficients A to F, is a LINEAR regression, because z is linear with respect to the unknowns A to F. Of course the function ##z(x,y)## is non-linear, but we are not solving for z or for x, y, since they are given data.
This is a LINEAR REGRESSION for a non-linear function, which is very common and doesn't require iterative computation.

About fitting a quadratic surface to data without iterative methods: there is a lot of literature on the subject. For example, see
https://fr.scribd.com/doc/14819165/Regressions-coniques-quadriques-circulaire-spherique
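The point above can be made concrete: since z is linear in A to F, a single design matrix and one least-squares solve recover all six coefficients. A sketch in NumPy, using made-up data with known coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 200)
y = rng.uniform(-2, 2, 200)
# True surface: z = 1*x^2 + 0.5*xy - 2*y^2 + 3*x - 1*y + 4, plus small noise
z = x**2 + 0.5 * x * y - 2 * y**2 + 3 * x - y + 4 + rng.normal(0, 0.01, 200)

# Columns correspond to A..F in z = Ax^2 + Bxy + Cy^2 + Dx + Ey + F
M = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
coeffs, *_ = np.linalg.lstsq(M, z, rcond=None)
print(coeffs)  # approximately [1, 0.5, -2, 3, -1, 4]
```

No iteration is involved: the non-linearity lives entirely in the columns of the design matrix, not in the unknowns.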
 
maistral said:
Hi! I am aware that standard fitting numerical methods like Levenberg-Marquardt, Gauss-Newton, among others, are able to fit a dataset z = f(x,y) to a quadratic surface of the form z = Ax² + Bxy + Cy² + Dx + Ey + F, where A to F are the coefficients.

To speak of "fitting" a function to a data set does not describe a specific mathematical process. Statistically, a person who speaks of "fitting" z = f(x,y) = Ax^2 + Bxy + Cy^2 + Dx + Ey + F to a dataset is most often seeking the function ##f## that minimizes
##g(A,B,C,...) = \sum_{i=1}^n (z_i - (Ax_i^2 + Bx_iy_i + Cy_i^2 + Dx_i + Ey_i + F)\ )^2##

Is that the type of fitting you wish to do?

As @JJacquelin said, finding the extrema of ##g## can be done by solving a system of linear equations. For example, ##\frac{\partial g}{\partial B} = 0## is a linear equation in the variables ##A,B,C,D,E,F##. The linear equations can be solved by matrix methods.
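Setting all six partial derivatives of ##g## to zero yields the normal equations ##(M^T M)c = M^T z##, where ##M## is the design matrix. A hedged sketch of that matrix solution (synthetic data, coefficients chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 100)
y = rng.uniform(-1, 1, 100)
z = 2 * x**2 - x * y + y**2 + 0.5 * x + y - 3 + rng.normal(0, 0.01, 100)

# Design matrix: columns for [x^2, xy, y^2, x, y, 1]
M = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])

# Normal equations: each dg/dcoefficient = 0 is one row of (M^T M) c = M^T z
c = np.linalg.solve(M.T @ M, M.T @ z)

# A least-squares routine gives the same answer
c_lstsq = np.linalg.lstsq(M, z, rcond=None)[0]
```

In practice a QR- or SVD-based routine such as `lstsq` is preferred over explicitly forming ##M^T M##, which squares the condition number, but the normal equations make the "system of linear equations" claim explicit.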

If you have experimental data ## (x_i,y_i,z_i) ## that has measurement errors in ##z_i## as well as in ##x_i,y_i##, you might be speaking of "Total least squares" fitting. I don't know much about that topic. ( The current Wikipedia article https://en.wikipedia.org/wiki/Total_least_squares looks specific and comprehensive, but I haven't tried to figure it out.)
 
maistral said:
Actually I am looking for the method where I can fit the second-degree surface polynomial z= f(x,y) to datasets similar to the Vandermonde matrix method for fitting a simple y = f(x) polynomial.
As stated previously, you are looking for a linear regression, specifically the kind known as a polynomial regression. See here: https://en.wikipedia.org/wiki/Polynomial_regression (I wrote part of that article) -- it's a matrix solution for least-squares fit of any data set of n values to a polynomial of any order less than n. The lowest order polynomial (fitting to a straight line) is the simplest case of this solution.
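For comparison, the one-variable Vandermonde fit mentioned in the question looks like this (a sketch assuming NumPy; the data are an exact quadratic, so the recovery is essentially exact):

```python
import numpy as np

x = np.linspace(0, 1, 20)
y = 3 * x**2 - 2 * x + 1  # exact quadratic, no noise

V = np.vander(x, 3)  # Vandermonde matrix with columns [x^2, x, 1]
a, b, c = np.linalg.lstsq(V, y, rcond=None)[0]
print(a, b, c)  # 3, -2, 1 up to rounding
```

The two-variable surface fit is the same construction with the six columns [x², xy, y², x, y, 1] in place of the Vandermonde columns.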
 
