Let C be the circle with center (a/2, 0) and radius a/2, and let L be the line with equation x = a. Find the polar equation of the curve produced in the following manner: for every angle [tex]\theta[/tex] with [tex]-\pi/2 < \theta < \pi/2[/tex], consider the ray from the origin O that makes the angle [tex]\theta[/tex] with the positive x-axis. This ray intersects C at a point A and L at a point B. The point P on the ray is on the curve if the segments OP and AB have equal length.
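For reference, assuming the usual conversions [tex]x = r\cos\theta[/tex], [tex]y = r\sin\theta[/tex], the two loci should have the polar forms

[tex]C:\; r = a\cos\theta, \qquad L:\; r\cos\theta = a \;\Longleftrightarrow\; r = a\sec\theta,[/tex]

where the circle's equation follows from expanding [tex](x - a/2)^2 + y^2 = (a/2)^2[/tex] to get [tex]r^2 = ar\cos\theta[/tex].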
If I trace it out, which I did in my example, it begins to look like a cissoid. However, I don't know how to verify this, and I'm not sure exactly how to approach the problem systematically.
My idea is to break the problem into two lengths along the ray: first the distance from the origin to where the ray meets the circle (the segment OA), then the distance from the origin to where it meets the vertical line (the segment OB); the resulting curve would then be their difference, since AB = OB - OA. I believe the length of the segment from O to A is simply [tex]a\cos\theta[/tex]. I'm not sure how I'd express the rest of it, though.
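If that's right, then putting the pieces together (assuming [tex]OA = a\cos\theta[/tex] and [tex]OB = a\sec\theta[/tex] from the polar forms above) would give

[tex]r = OB - OA = a\sec\theta - a\cos\theta = a\,\frac{1 - \cos^2\theta}{\cos\theta} = a\,\frac{\sin^2\theta}{\cos\theta} = a\sin\theta\tan\theta,[/tex]

which, unless I've slipped up, is the standard polar equation of the cissoid of Diocles, consistent with what the trace suggested.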