Least squares solution to simultaneous equations

mgb_phys
I am trying to fit a transformation from one set of coordinates to another:
x' = R + Px + Qy
y' = S - Qx + Py

Where P, Q, R, S are constants, with P = scale*cos(rotation) and Q = scale*sin(rotation).

There is a well-known 'by hand' formula for fitting P, Q, R, S to a set of corresponding points.
But I need to have an error estimate on the fit - so I need a least squares solution.

I'm having trouble working out how to do this from first principles for data sets with both x and y in them.
Can anyone point to an example/tutorial/code sample of how to do this?
 
Why don't you try to minimize the sum of the distances from each (x', y') to its nearest neighbour (x, y)? (There are algorithms for finding those.)
 
I was thinking along similar lines. If y' = f(x,y,S,P,Q) and x' = g(x,y,R,P,Q) then one could minimize:

1/2 sum [f - y']^2 + 1/2 sum [g - x']^2
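For concreteness, that objective is easy to write down in Python/NumPy. Everything below (the points and parameter values) is made up purely for illustration:

```python
import numpy as np

def chi2(params, x, y, xp, yp):
    """Half the summed squared misfit in x' and y' for the transform
    x' = R + Px + Qy,  y' = S - Qx + Py."""
    P, Q, R, S = params
    return 0.5 * np.sum((R + P * x + Q * y - xp) ** 2) \
         + 0.5 * np.sum((S - Q * x + P * y - yp) ** 2)

# Demo points generated with known parameters (scale 1, rotation ~36.9 deg)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 0.0, 2.0, 1.0])
P, Q, R, S = 0.8, 0.6, 2.0, -1.0
xp = R + P * x + Q * y
yp = S - Q * x + P * y
# chi2 is zero at the true parameters and grows as they are perturbed
```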
 
hotvette said:
I was thinking along similar lines. If y' = f(x,y,S,P,Q) and x' = g(x,y,R,P,Q) then one could minimize:

1/2 sum [f - y']^2 + 1/2 sum [g - x']^2

Seems like the right approach to me. This could be done in Excel, using the Solver add-in to minimize the sum.

The 1/2 factors aren't necessary.
 
I found http://mathworld.wolfram.com/LeastSquaresFitting.htm
Interestingly, the 100-year-old 'by hand' instructions I had from an old Army surveying manual are almost exactly the same algorithm.
 
Redbelly98 said:
This could be done in Excel, using the Solver add-in to minimize the sum

Maybe Excel Solver could do it, but it would be much more fun to write out the equations and solve them. :smile:
 
hotvette said:
Maybe Excel Solver could do it, but it would be much more fun to write out the equations and solve them. :smile:

If one really wants to do it that way, start with hotvette's equation:

hotvette said:
I was thinking along similar lines. If y' = f(x,y,S,P,Q) and x' = g(x,y,R,P,Q) then one could minimize:

1/2 sum [f - y']^2 + 1/2 sum [g - x']^2

To minimize the sum, take partial derivatives w.r.t. P, Q, R, S and set each expression equal to zero. That gives 4 linear equations in 4 unknowns to be solved.

Writing out the sum in full:

χ² = (1/2)∑[(Px + Qy + R - x')² + (-Qx + Py + S - y')²]


Next, set ∂χ² / ∂P = 0:

∂χ²/∂P = ∑[(Px + Qy + R - x')x + (-Qx + Py + S - y')y]
= ∑[(Px² + Qxy + Rx - x'x) + (-Qxy + Py² + Sy - y'y)]

= P[∑x² + ∑y²] + R∑x + S∑y - ∑x'x - ∑y'y = 0


And similarly for ∂χ²/∂Q, ∂χ²/∂R, and ∂χ²/∂S.
 
Not sure how to get r², or errors in the parameters, though.
 
Continuing the partial differentiation, I get the following:

\begin{bmatrix}
\sum(x^2+y^2) & 0 & \sum x & \sum y \\
0 & \sum(x^2+y^2) & \sum y & -\sum x \\
\sum x & \sum y & m & 0 \\
\sum y & -\sum x & 0 & m
\end{bmatrix}
\begin{bmatrix} P \\ Q \\ R \\ S \end{bmatrix}
=
\begin{bmatrix} \sum (y'y + x'x) \\ \sum (x'y - y'x) \\ \sum x' \\ \sum y' \end{bmatrix}
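That 4x4 system is straightforward to build and solve numerically. A minimal NumPy sketch, with made-up exact correspondences so the known parameters come back out:

```python
import numpy as np

# Demo correspondences generated from known parameters P0, Q0, R0, S0
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 0.0, 2.0, 1.0])
P0, Q0, R0, S0 = 0.8, 0.6, 2.0, -1.0
xp = R0 + P0 * x + Q0 * y
yp = S0 - Q0 * x + P0 * y

m = len(x)
sx, sy, s2 = x.sum(), y.sum(), (x ** 2 + y ** 2).sum()

# Normal-equation matrix and right-hand side from the post above
M = np.array([[s2,  0.0, sx,  sy ],
              [0.0, s2,  sy, -sx ],
              [sx,  sy,  m,   0.0],
              [sy, -sx,  0.0, m  ]])
b = np.array([(xp * x + yp * y).sum(),
              (xp * y - yp * x).sum(),
              xp.sum(),
              yp.sum()])

P, Q, R, S = np.linalg.solve(M, b)
```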

Re the residual, I think it is just r^2 = \sum (S - Qx + Py - y')^2 + \sum (R + Px + Qy - x')^2, which is equivalent to:

r^2 = (Az - y')^T (Az - y') + (Bz - x')^T (Bz - x')

where:

A = \begin{bmatrix}
y_1 & -x_1 & 0 & 1 \\
y_2 & -x_2 & 0 & 1 \\
\vdots & \vdots & \vdots & \vdots \\
y_m & -x_m & 0 & 1
\end{bmatrix}
\quad
B = \begin{bmatrix}
x_1 & y_1 & 1 & 0 \\
x_2 & y_2 & 1 & 0 \\
\vdots & \vdots & \vdots & \vdots \\
x_m & y_m & 1 & 0
\end{bmatrix}
\quad
z = \begin{bmatrix} P \\ Q \\ R \\ S \end{bmatrix}
\quad
y' = \begin{bmatrix} y_1' \\ y_2' \\ \vdots \\ y_m' \end{bmatrix}
\quad
x' = \begin{bmatrix} x_1' \\ x_2' \\ \vdots \\ x_m' \end{bmatrix}
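The same solution also falls out of stacking A and B into one design matrix and handing it to a library least-squares solver. A small NumPy sketch, again with made-up exact correspondences (so r² comes out essentially zero):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 0.0, 2.0, 1.0])
P0, Q0, R0, S0 = 0.8, 0.6, 2.0, -1.0
xp = R0 + P0 * x + Q0 * y
yp = S0 - Q0 * x + P0 * y

# B rows give x' = Px + Qy + R; A rows give y' = Py - Qx + S, as above
B = np.column_stack([x, y, np.ones_like(x), np.zeros_like(x)])
A = np.column_stack([y, -x, np.zeros_like(x), np.ones_like(x)])

# Stack both blocks and solve for z = (P, Q, R, S) in one shot
C = np.vstack([B, A])
d = np.concatenate([xp, yp])
z, *_ = np.linalg.lstsq(C, d, rcond=None)

# Residual sum of squares r^2 = (Az - y')^T(Az - y') + (Bz - x')^T(Bz - x')
r2 = float((A @ z - yp) @ (A @ z - yp) + (B @ z - xp) @ (B @ z - xp))
```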
 
hotvette said:
Re the residual, I think it is just r^2 = \sum (S - Qx + Py - y')^2 + \sum (R + Px + Qy - x')^2, which is equivalent to:

r^2 = (Az - y')^T (Az - y') + (Bz - x')^T (Bz - x')

I was thinking of the correlation coefficient r, not the residual sum of squares (often denoted by RSS).
 
I've been looking at it, and the problem with getting errors directly from the equations is that the coefficients aren't on comparable scales: a small error in the sin/cos terms is much more significant than one in the origin.

What I did was find the fit and then work out the mismatch for each of the known set of points and then use the statistics of that.
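In NumPy that post-fit check might look like the following; the "fitted" parameters and the noise level are invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 8)
y = rng.uniform(0, 10, 8)
P, Q, R, S = 0.8, 0.6, 2.0, -1.0                 # pretend these came out of the fit
xp = R + P * x + Q * y + rng.normal(0, 0.05, 8)  # measured targets with noise
yp = S - Q * x + P * y + rng.normal(0, 0.05, 8)

# Per-point mismatch between the transformed and measured coordinates
dx = R + P * x + Q * y - xp
dy = S - Q * x + P * y - yp
mis = np.hypot(dx, dy)

rms = np.sqrt(np.mean(mis ** 2))   # one simple summary statistic of the mismatch
```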
 
mgb_phys said:
What I did was find the fit and then work out the mismatch for each of the known set of points and then use the statistics of that.

Makes sense. Does the mismatch look reasonably Gaussian?
 
Redbelly98 said:
Makes sense. Does the mismatch look reasonably Gaussian?
Too few points to tell.
In reality the error is likely to be due to an outlier where one match is just completely wrong.
Best approach is some iterative removal of outliers - but the specs call for a statistical measure of accuracy.
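A rough sketch of that iterative outlier rejection, reusing the normal equations from earlier in the thread; the threshold, noise level, and data below are all invented for illustration:

```python
import numpy as np

def fit_pqrs(x, y, xp, yp):
    """Solve the 4x4 normal equations derived earlier in the thread."""
    m = len(x)
    sx, sy, s2 = x.sum(), y.sum(), (x ** 2 + y ** 2).sum()
    M = np.array([[s2,  0.0, sx,  sy ],
                  [0.0, s2,  sy, -sx ],
                  [sx,  sy,  m,   0.0],
                  [sy, -sx,  0.0, m  ]])
    b = np.array([(xp * x + yp * y).sum(),
                  (xp * y - yp * x).sum(),
                  xp.sum(), yp.sum()])
    return np.linalg.solve(M, b)

def robust_fit(x, y, xp, yp, nsigma=2.0, iters=5):
    """Iteratively refit, discarding points whose mismatch exceeds nsigma*RMS."""
    keep = np.ones(len(x), dtype=bool)
    for _ in range(iters):
        P, Q, R, S = fit_pqrs(x[keep], y[keep], xp[keep], yp[keep])
        # Mismatch for all points, so a wrongly dropped point can re-enter
        mis = np.hypot(R + P * x + Q * y - xp, S - Q * x + P * y - yp)
        rms = np.sqrt(np.mean(mis[keep] ** 2))
        new_keep = mis <= nsigma * rms
        if new_keep.sum() < 2 or np.array_equal(new_keep, keep):
            break
        keep = new_keep
    return (P, Q, R, S), keep

# Demo: ten good matches plus one grossly wrong one
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 10)
y = rng.uniform(0, 10, 10)
xp = 2.0 + 0.8 * x + 0.6 * y + rng.normal(0, 0.01, 10)
yp = -1.0 - 0.6 * x + 0.8 * y + rng.normal(0, 0.01, 10)
xp[0] += 5.0                      # one completely wrong match

(P, Q, R, S), keep = robust_fit(x, y, xp, yp)
```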
 
