- #1
vaishakh
Can anyone here give me a hint for solving this question? I cannot make much progress on it.
Point A moves with constant speed v such that its velocity vector v is continually pointed towards point B, which in turn moves rectilinearly with uniform velocity u < v. At the initial moment of time the vector v is perpendicular to the vector u, and the points are separated by a distance l. How soon will the points converge?

What I find difficult about this problem is that it never states when A starts turning, in what direction it turns, or what the distance between A and B is at that moment. In fact, I know that A turns continuously.
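To see the continuous turning concretely, here is a minimal numerical sketch (the function name `pursuit_time` and the starting geometry are my own choices, not from the problem statement): at every small time step, A's velocity is re-aimed along the current line from A to B, so the turning is not a discrete event but happens at every instant. The simulated convergence time can be compared against the standard closed-form answer for this classic pursuit problem, t = vl/(v² − u²).

```python
import math

def pursuit_time(v, u, l, dt=1e-5, eps=1e-3):
    """Simulate the pursuit and return the time until A is within eps of B.

    Geometry (my assumption): B starts at the origin moving along +x with
    speed u; A starts at (0, -l), so A's initial velocity (towards B) is
    along +y, perpendicular to u, as the problem requires.
    """
    ax, ay = 0.0, -l   # pursuer A
    bx, by = 0.0, 0.0  # target B
    t = 0.0
    while True:
        d = math.hypot(bx - ax, by - ay)
        if d <= eps:
            return t
        # A's velocity is re-pointed towards B at every step:
        # this is the "continuous turning" in the problem.
        ax += v * (bx - ax) / d * dt
        ay += v * (by - ay) / d * dt
        bx += u * dt   # B moves in a straight line at speed u
        t += dt

# For v = 2, u = 1, l = 1 the closed-form answer is v*l/(v**2 - u**2) = 2/3,
# and the simulation should land close to that.
```

Running `pursuit_time(2.0, 1.0, 1.0)` should give a time near 2/3, matching t = vl/(v² − u²); the small discrepancy comes from the finite step `dt` and the stopping tolerance `eps`.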