A space base has tracked satellites A and B; at a particular moment their positions are:

A0 = (a1,0,a3) B0 = (0,b2,b3)

whereas the base itself is located at the origin (0,0,0) and the satellites move at constant velocities with respect to the base:

Va = (Va,0,0) Vb = (Vb,Vb,0)

The task is to compute the minimum distance between satellites A and B.

My proposed solution:

OA = A0 + Va * t = (a1,0,a3) + (Va,0,0)t

OB = B0 + Vb * t = (0,b2,b3) + (Vb,Vb,0)t
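
To fill in the step between these trajectories and the final answer, the standard closest-approach argument is: the squared separation D^2(t) is a quadratic in t, minimized where its derivative vanishes. A sketch of that step (assuming the relative velocity is nonzero):

```latex
\begin{aligned}
\vec{R}(t) &= \vec{OA} - \vec{OB}
  = \bigl(a_1 + (V_a - V_b)\,t,\; -b_2 - V_b\,t,\; a_3 - b_3\bigr),\\
D^2(t) &= \lvert \vec{R}(t) \rvert^2,\qquad
\frac{d}{dt}D^2(t) = 0
\;\Longrightarrow\;
t^* = -\,\frac{\vec{R}(0)\cdot\vec{V}_{\mathrm{rel}}}{\lvert \vec{V}_{\mathrm{rel}} \rvert^2},
\qquad
\vec{V}_{\mathrm{rel}} = (V_a - V_b,\; -V_b,\; 0).
\end{aligned}
```

The minimum distance is then D(t*), i.e. the initial separation with its component along the relative velocity projected out.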

I found the minimum distance to be:

SQRT[ (1/2)(a1+b2)^2 + (b3-a3)^2 ]

Am I correct? Could someone please confirm?

**Physics Forums | Science Articles, Homework Help, Discussion**


# Homework Help: Minimum distance between two satellites
