Hi All,

I'm struggling to find a solution to an adjustment I'm working on, and thought someone else might have some ideas.

I have a kinematic time series of X,Y positions for two points (X1,Y1 and X2,Y2). I know that the two points were a fixed distance D (e.g., 100 m) apart from each other (the constraint).

I'm attempting to find a best fit to X1,Y1 and to X2,Y2 subject to the constraint that the distance between them is D. The path formed by the points is not linear -- I'm starting with a quadratic model.

Application: Picture two GPS receivers rigidly mounted on top of a car. They both have positions and both have noise in their position. I'd like to best fit to both of their positions, but with the constraint on that known distance between them.

In coming up with observation equations, my initial thought was something like:

X1(t) = a1*t^2 + b1*t + c1

Y1(t) = a2*t^2 + b2*t + c2

X2(t) = a3*t^2 + b3*t + c3

Y2(t) = a4*t^2 + b4*t + c4

(using subscripted coefficients rather than A through L, so the coefficient names don't collide with the distance D)

Then:

sqrt((X1(t)-X2(t))^2 + (Y1(t)-Y2(t))^2) = D for all t (D = 100 m in my case)

So this leaves me with 12 parameters, and I have ample observations to solve for them (the observations are X1, Y1, X2, Y2 at each epoch).
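One side observation, using only the quantities already defined: since X1-X2 and Y1-Y2 are quadratics in t, the constraint can't be imposed at just one epoch; requiring it to hold at every epoch makes it a polynomial identity:

```latex
\bigl(X_1(t)-X_2(t)\bigr)^2 + \bigl(Y_1(t)-Y_2(t)\bigr)^2 \equiv D^2 \quad \text{for all } t
```

The left-hand side is a quartic in t, so matching coefficients gives five conditions (the t^4, t^3, t^2, and t^1 coefficients must vanish and the constant term must equal D^2), leaving 12 - 5 = 7 effective free parameters.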

Does it appear that I'm on the right track?

I'm not sure of the best way to proceed from here (i.e., how to set up the least-squares problem). Any suggestions?
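For concreteness, here is one way the setup could be sketched numerically (an illustration under assumptions, not a definitive answer: the data are simulated, the solver is scipy's SLSQP, and the distance constraint is imposed as five coefficient conditions on the quartic (X1-X2)^2 + (Y1-Y2)^2 - D^2, which must vanish identically in t):

```python
# Sketch: constrained least squares for two quadratic tracks held a fixed
# distance D apart. Data, names, and solver choice are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
D = 100.0                        # known separation (m)
t = np.linspace(0.0, 10.0, 50)   # observation epochs

# Simulated truth: point 2 offset from point 1 by D along x (assumption).
x1 = 0.5 * t**2 + 2.0 * t
y1 = -0.3 * t**2 + 5.0 * t
obs = np.vstack([x1, y1, x1 + D, y1]) + rng.normal(0.0, 0.5, (4, t.size))

def model(p, t):
    """Evaluate the four quadratics; p = 12 coefficients, rows X1,Y1,X2,Y2."""
    return p.reshape(4, 3) @ np.vstack([t**2, t, np.ones_like(t)])

def objective(p):
    """Sum of squared residuals over all four coordinate series."""
    return np.sum((model(p, t) - obs) ** 2)

def constraint(p):
    """Coefficients of (dX^2 + dY^2) - D^2 as a quartic in t; all must be 0."""
    c = p.reshape(4, 3)
    dx, dy = c[0] - c[2], c[1] - c[3]
    q = np.polymul(dx, dx) + np.polymul(dy, dy)   # 5 quartic coefficients
    return q - np.array([0.0, 0.0, 0.0, 0.0, D**2])

p0 = np.polyfit(t, obs.T, 2).T.ravel()            # unconstrained fits as start
res = minimize(objective, p0, method="SLSQP",
               constraints={"type": "eq", "fun": constraint})

fit = model(res.x, t)
dist = np.hypot(fit[0] - fit[2], fit[1] - fit[3])
print(res.success, float(np.max(np.abs(dist - D))))
```

Starting from the four independent unconstrained quadratic fits (np.polyfit on each series) keeps the initial point nearly feasible, which helps the constrained solver. A more rigorous alternative in the surveying tradition would be a Gauss-Helmert adjustment with Lagrange multipliers, but the sketch above conveys the structure of the problem.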

Thanks in advance for any help!

GGE

**Physics Forums | Science Articles, Homework Help, Discussion**


# Least squares adjustment/regression - two points known distance apart
