Satellites A and B are tracked at a particular moment by a space base located at the origin, with respect to which they move at constant velocities:

A0 = (a1, 0, a3); B0 = (0, b2, b3)

VA = (Va, 0, 0); VB = (Vb, Vb, 0)

Is the minimum distance between the satellites equal to the distance between the two skew lines A(t) = (a1, 0, a3) + t(Va, 0, 0) and B(t) = (0, b2, b3) + t(Vb, Vb, 0)? Are these lines indeed skew?

If so, that distance is |a3 - b3|, which does not seem reasonable to me: it looks oversimplified and involves only a few of the problem's parameters.
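For what it's worth, here is a quick numerical check I tried, with arbitrary made-up values for a1, a3, b2, b3, Va, Vb (none are given in the problem). It compares the line-to-line distance with the minimum separation when both satellites share the same time parameter t:

```python
import numpy as np

# Arbitrary example values (NOT from the problem statement)
a1, a3 = 10.0, 5.0
b2, b3 = 4.0, 2.0
Va, Vb = 1.0, 2.0

A0 = np.array([a1, 0.0, a3]); VA = np.array([Va, 0.0, 0.0])
B0 = np.array([0.0, b2, b3]); VB = np.array([Vb, Vb, 0.0])

# Distance between the two lines: project B0 - A0 onto the
# common normal VA x VB (each line gets its own parameter here).
n = np.cross(VA, VB)
d_lines = abs(np.dot(B0 - A0, n)) / np.linalg.norm(n)  # = |a3 - b3| = 3.0

# Minimum distance between the SATELLITES: both positions use the
# same t, so minimize |(B0 - A0) + (VB - VA) t| over t.
dP, dV = B0 - A0, VB - VA
t_star = -np.dot(dP, dV) / np.dot(dV, dV)
d_sats = np.linalg.norm(dP + dV * t_star)

print(d_lines, d_sats)  # the two distances differ for these values
```

With these numbers the satellites' closest approach comes out strictly larger than the line-to-line distance, which is what makes me doubt the |a3 - b3| answer.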

Any advice, please?