mariano_donat
Hi everyone.
I'm detecting collisions between two ellipses. I've got my unit vector, my ellipse center, and my radii (horizontal and vertical). I want to calculate the point that lies on the ellipse in the direction of the unit vector. See the attached image: suppose the red arrow is my unit vector and I want the coordinates of the green point. I'm just multiplying my unit vector by the radii and adding the center of the ellipse. The formula looks like this:
Code:
// Assume the unit vector has already been calculated at this stage; ellipseCenter and ellipseRadius are given
Vector pointInEllipse = VectorMake(unitVector.x * ellipseRadius.x + ellipseCenter.x,
                                   unitVector.y * ellipseRadius.y + ellipseCenter.y);
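For what it's worth, here is a minimal, self-contained C sketch of the same calculation (the Vector type, the VectorMake helper, and the sample angle/radii are my own assumptions, not part of the original code). It plugs a unit vector at a known angle into the formula above and prints the ellipse-equation value for the computed point together with the angle from the center to that point, so the shift can be seen numerically. Compile with something like cc sketch.c -lm.
Code:
/* Minimal, self-contained sketch of the calculation above, in plain C.
   The Vector type, the VectorMake helper, and the sample numbers are
   assumptions standing in for whatever the original project uses. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y; } Vector;

static Vector VectorMake(double x, double y) {
    Vector v = { x, y };
    return v;
}

/* Scale the unit vector by the per-axis radii and offset by the center,
   exactly as in the snippet above. */
static Vector pointInEllipseFor(Vector unitVector, Vector ellipseCenter, Vector ellipseRadius) {
    return VectorMake(unitVector.x * ellipseRadius.x + ellipseCenter.x,
                      unitVector.y * ellipseRadius.y + ellipseCenter.y);
}

int main(void) {
    const double PI = 3.14159265358979323846;

    /* Example: unit vector at 30 degrees, ellipse centered at the origin,
       horizontal radius 4, vertical radius 2. */
    double angle  = 30.0 * PI / 180.0;
    Vector unit   = VectorMake(cos(angle), sin(angle));
    Vector center = VectorMake(0.0, 0.0);
    Vector radius = VectorMake(4.0, 2.0);

    Vector p = pointInEllipseFor(unit, center, radius);

    /* Evaluate the ellipse equation (x/a)^2 + (y/b)^2 for the computed
       point (it comes out ~1.0), and the angle from the center to it. */
    double eq = (p.x / radius.x) * (p.x / radius.x)
              + (p.y / radius.y) * (p.y / radius.y);
    double pointAngle = atan2(p.y - center.y, p.x - center.x);

    printf("ellipse equation value: %f\n", eq);
    printf("input angle: %f rad, angle to computed point: %f rad\n", angle, pointAngle);
    return 0;
}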
The point I get using the above formula lies on the ellipse, but it's shifted a little bit on both axes, enough to detect collisions when none have occurred.
What am I missing here?
Thank you very much in advance.