Geometric proof: minimum angle to a point on a line segment

Hi,

I have a problem I need to solve for a piece of software I'm writing, and I think I've got it, but it would be great if somebody could take a quick look at this proof and see if I've overlooked anything. Thanks in advance.

Here is the problem: we're working in R^3. Given a line segment L defined by endpoints r and s (treated as position vectors from the origin), and a nonzero vector v, what is the minimum angle theta between v and the position vector of any point on the segment? We can assume the segment does not pass through the origin.

And here's what I came up with: if the line defined by r and s passes through the origin (outside the segment, by assumption), then the position vectors of all points on the segment are parallel, so every angle is the same and clearly theta = the angle between v and r (or s). Otherwise r, s and the origin define a plane P, and we can reduce the problem to a two dimensional analysis as follows. Let w be the projection of v into P. If w is zero, then the vectors to all points of L will be orthogonal to v, and so theta = pi/2.
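In case it helps, here's a rough numpy sketch of the projection step (function names are mine; it assumes r and s are not collinear, so that n = r × s is a genuine normal to P):

```python
import numpy as np

def project_into_plane(v, r, s):
    # n = r x s is normal to the plane P spanned by r and s
    # (assumes r and s are not collinear).
    n = np.cross(r, s)
    n_hat = n / np.linalg.norm(n)
    # w: component of v lying in P.
    return v - np.dot(v, n_hat) * n_hat
```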

For nonzero w, check whether w lies between r and s, that is, whether angle wr and angle ws are both <= angle rs. If so, then some point of L has a position vector that is a positive scalar multiple of w, so theta is simply the minimum angle between v and P, i.e. angle vw (which will be less than pi/2). If instead -w lies between r and s, the angle at that point is pi - angle vw, which is a maximum rather than a minimum, so in that case the minimum again occurs at an endpoint: theta = min(angle vr, angle vs).
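The "between" test is just those two angle comparisons. A rough numpy sketch (names are mine, and `angle` is the usual acos formula, clipped to guard against rounding pushing the cosine slightly outside [-1, 1]):

```python
import numpy as np

def angle(x, y):
    # acos(x . y / (|x| |y|)), clipped for numerical safety.
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))

def in_sector(u, r, s):
    # True iff u lies within the angular sector spanned by r and s.
    rs = angle(r, s)
    return angle(u, r) <= rs and angle(u, s) <= rs
```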

If both w and -w lie outside of the angle between r and s, then the angle with v will be monotonic while traversing the line segment (this is correct, right?). Therefore the minimum angle will occur at one of the endpoints, and so theta is the minimum of angle vr and angle vs (which could be greater than pi/2).
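Putting the cases together, here's a self-contained numpy sketch of the whole computation (names are mine). Note the -w-in-sector case falls through to the endpoint minimum, since the angle at the -w direction is pi - angle vw, a maximum:

```python
import numpy as np

def angle(x, y):
    # acos(x . y / (|x| |y|)), clipped for numerical safety.
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))

def min_angle_to_segment(v, r, s):
    n = np.cross(r, s)
    if np.allclose(n, 0.0):
        # r, s, origin collinear (origin outside the segment):
        # all points share r's direction.
        return angle(v, r)
    n_hat = n / np.linalg.norm(n)
    w = v - np.dot(v, n_hat) * n_hat      # projection of v into plane P
    if np.allclose(w, 0.0):
        return np.pi / 2                  # v orthogonal to P
    rs = angle(r, s)
    if angle(w, r) <= rs and angle(w, s) <= rs:
        return angle(v, w)                # w inside the sector
    return min(angle(v, r), angle(v, s))  # minimum occurs at an endpoint
```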

Wherever I've written angle xy, I'm computing it with the usual formula acos(x · y / (|x| |y|)). I'm interested in the direction of v but not its length.

Okay, if you're still with me and have any thoughts for me, thank you!!

-Kyle
