Intriguing problem; I've been pondering it for a couple of days. Sorry to say I don't necessarily have answers, but here are some thoughts that might stimulate discussion.
- Not sure why orthogonal distance regression is needed. Ordinary least squares should work on a problem like this. Nothing wrong with using ODR, but the motivation for it isn't obvious.
- I tried ODRPACK once but couldn’t figure it out either. Didn’t try very hard though. Solving problems using first principles is much more fun and educational (if you have the interest and time) than using canned programs.
- This could be viewed as a linear least squares problem with a single nonlinear constraint (i.e. B^2 = 4AC).
- Developing the governing equations for least squares with equality constraints is actually quite straightforward (even for parametric equations).
- One possible approach (since A is fixed) is to write Bxy + Cy^2 + Dx + Ey + F = -Ax^2 and just treat the set of linear equations as a black-box least squares problem (i.e. you have a known matrix A with more rows than columns, a known vector b, and an unknown vector x) and solve for the coefficients, though it isn't clear to me what that solution actually represents (i.e. what is really being minimized). Seems to me the constraint could also be included. (There's a rough sketch of this idea right after the list.)
- To me the most fundamental approach is to represent the problem parametrically (i.e. x = f(t) and y = g(t)), but it isn't clear (to me) how to represent a general conic parametrically and how the coefficients of the parametric equations relate to A, B, C, D, E, F.
- Another approach that might bear fruit (you alluded to it) is to take a normal parabola and apply a rotation and translation. You'd want to be careful to understand what the resulting least squares problem really represents.
- Might be helpful to know what kind of class this is for and how much effort is expected. Is this supposed to be a 1-hour homework problem or a two-week project?
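For what it's worth, here is a minimal sketch (Python/NumPy, with made-up test data of my own) of the black-box linear least squares idea mentioned above. Note it does not enforce B^2 - 4AC = 0, so in general it returns some conic, not necessarily a parabola.

```python
# Sketch of the "black box" idea: with A fixed at 1, rewrite
#   A*x^2 + B*x*y + C*y^2 + D*x + E*y + F = 0
# as  [x*y  y^2  x  y  1] [B C D E F]^T = -x^2
# and solve the overdetermined system by ordinary least squares.
# The parabola constraint B^2 - 4AC = 0 is NOT enforced here.
import numpy as np

rng = np.random.default_rng(0)

# Made-up test data: points near the parabola y = x^2, lightly perturbed
x = np.linspace(-2.0, 2.0, 200)
y = x**2 + rng.normal(scale=0.05, size=x.size)

# Design matrix M and right-hand side b for M @ [B, C, D, E, F] ~ b
M = np.column_stack([x * y, y**2, x, y, np.ones_like(x)])
b = -x**2

coeffs, *_ = np.linalg.lstsq(M, b, rcond=None)
B, C, D, E, F = coeffs
print("B, C, D, E, F =", coeffs)
print("B^2 - 4AC =", B**2 - 4 * 1.0 * C)  # A = 1; nonzero in general
```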
-------------- EDIT ---------------------------------------------------------------
Did a little research.
http://research.microsoft.com/en-us/um/people/awf/ellipse/ellipse-pami.pdf
Turns out that given a general conic f = Ax^2 + Bxy + Cy^2 + Dx + Ey + F = 0, minimizing \sum f^2 over the data points is considered minimizing the sum of squared 'algebraic' distances from the points to the curve. By including the equality constraint B^2 - 4AC = 0 via a Lagrange multiplier, the problem can be solved in a straightforward fashion as a nonlinear least squares problem subject to a single equality constraint.
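Spelling that formulation out (my notation, with A held fixed and the sum taken over the data points):

$$\min_{B,C,D,E,F}\;\sum_i \left(A x_i^2 + B x_i y_i + C y_i^2 + D x_i + E y_i + F\right)^2 \quad\text{subject to}\quad B^2 - 4AC = 0,$$

with Lagrangian

$$L(B,C,D,E,F,\lambda) = \sum_i f_i^2 + \lambda\left(B^2 - 4AC\right).$$

Setting the partial derivatives of $L$ with respect to $B, C, D, E, F$, and $\lambda$ to zero gives six equations in six unknowns; the $\lambda B$ and $B^2$ terms make the system nonlinear, hence the need for an iterative (Newton-type) solution.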
I made up a problem by rotating a parabola (with 1000 pts), added random noise to the points, set A=1, and solved for B, C, D, E, and F. Convergence was achieved in maybe a dozen iterations. All done with Excel using matrix functions (not very sexy, but it gets the job done most of the time).
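In case it helps, here is a rough Python sketch of the same kind of experiment (not the Excel workbook described above). The rotation angle, noise level, and the use of SciPy's SLSQP solver to handle the equality constraint, rather than a hand-coded Lagrange/Newton iteration, are all my own choices.

```python
# Rotate a parabola, add noise, fix A = 1, and minimize the summed squared
# algebraic distance subject to B^2 - 4AC = 0.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Synthetic data: y = 0.5 x^2 rotated by 30 degrees, with added noise
t = np.linspace(-3.0, 3.0, 1000)
px, py = t, 0.5 * t**2
theta = np.deg2rad(30.0)
x = np.cos(theta) * px - np.sin(theta) * py + rng.normal(scale=0.02, size=t.size)
y = np.sin(theta) * px + np.cos(theta) * py + rng.normal(scale=0.02, size=t.size)

A = 1.0  # fixed, as in the write-up

def algebraic_residuals(p):
    B, C, D, E, F = p
    return A * x**2 + B * x * y + C * y**2 + D * x + E * y + F

def objective(p):
    # sum of squared algebraic distances
    return np.sum(algebraic_residuals(p) ** 2)

def parabola_constraint(p):
    B, C, _, _, _ = p
    return B**2 - 4.0 * A * C  # must be zero for a parabola

res = minimize(objective, x0=np.zeros(5), method="SLSQP",
               constraints=[{"type": "eq", "fun": parabola_constraint}],
               options={"maxiter": 500, "ftol": 1e-10})
B, C, D, E, F = res.x
print("B, C, D, E, F =", res.x)
print("B^2 - 4AC =", B**2 - 4 * A * C)  # ~0 up to solver tolerance
```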