I'm trying to determine the point, in 3D space, where an arbitrary line/ray intersects an infinite plane. Using an article on Wikipedia, I tried to reproduce the presented formulas in code. This seems to work fine as long as the ray is emitted from the origin of the coordinate system, (0, 0, 0). As soon as I move the origin, the intersection point quickly starts to diverge from the plane in a curved path. My understanding of math is not good enough to figure out whether this is to be expected (i.e. a limitation of the formula) or an error in my code.

I know this is not a programming forum, but maybe you can spot the mathematical error in my code, if there is one. The input variables p0, p1, l0, l1 have been verified to carry the correct values.

Code (Java):

```java
/* plane normal */
vector p0; // origin position of normal (or point on plane, if you will)
vector p1; // tip position of normal

/* line/ray */
vector l0;
vector l1;

/* compute ray-plane intersection */
vector n = p1 - p0;                       // isolated plane normal
float t = dot(p0 - l0, n) / dot(l1, n);
vector hit = l0 + (l1 * t);               // projects ray (e.g. rotations) correctly so long as l0 is (0, 0, 0)
```

Thanks.
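For reference, here is the Wikipedia line-plane intersection formula written out as a self-contained sketch. Note that the article treats the ray as a point plus a *direction* vector, so if `l1` stores a second point on the ray rather than a direction, the direction is `l1 - l0`. The `Vec3` helper, the `intersect` method, and the sample values below are placeholders of mine (the vector type in the question isn't shown), not part of the original code:

```java
// Hypothetical minimal vector helper; the asker's `vector` type is not shown.
final class Vec3 {
    final double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    Vec3 sub(Vec3 o)    { return new Vec3(x - o.x, y - o.y, z - o.z); }
    Vec3 add(Vec3 o)    { return new Vec3(x + o.x, y + o.y, z + o.z); }
    Vec3 scale(double s){ return new Vec3(x * s, y * s, z * s); }
    double dot(Vec3 o)  { return x * o.x + y * o.y + z * o.z; }
}

public class RayPlane {
    // Intersection of the ray l0 + t*d with the plane through p0 with normal n,
    // per the Wikipedia formula t = ((p0 - l0) . n) / (d . n).
    // Returns null when the ray is (numerically) parallel to the plane.
    static Vec3 intersect(Vec3 p0, Vec3 n, Vec3 l0, Vec3 d) {
        double denom = d.dot(n);
        if (Math.abs(denom) < 1e-9) return null;
        double t = p0.sub(l0).dot(n) / denom;
        return l0.add(d.scale(t));
    }

    public static void main(String[] args) {
        Vec3 p0 = new Vec3(0, 5, 0);   // point on the plane y = 5
        Vec3 n  = new Vec3(0, 1, 0);   // plane normal
        Vec3 l0 = new Vec3(2, 1, 3);   // ray origin away from (0, 0, 0)
        Vec3 l1 = new Vec3(2, 2, 3);   // a second point on the ray
        Vec3 d  = l1.sub(l0);          // direction = l1 - l0
        Vec3 hit = intersect(p0, n, l0, d);
        System.out.println(hit.x + " " + hit.y + " " + hit.z); // prints "2.0 5.0 3.0"
    }
}
```

With these sample values, t = 4 and the hit lands on the plane at (2, 5, 3) regardless of where l0 sits, which gives a way to check the computation against a non-zero ray origin.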