# Point At Given Distance in 3d Space

1. Sep 29, 2011

### Sothh

I am working on a simple ray tracer for rendering point clouds in real time.

I am not so good with maths, and I am stuck with a fairly simple problem:

Given a start point (0,0,0) and a direction (0,90) and a distance 1-100, how do I get the 3d point the line will hit?

As this will go directly into code, it will be easier for me to use if no special math characters are used. (And easier for me to understand :)

Thanks!

2. Sep 29, 2011

### HallsofIvy

You have the origin in three dimensions, but I am not sure what "direction (0, 90)" means. Is that in "spherical coordinates" with $\theta$ (the "longitude") equal to 0 and $\phi$ (the "co-latitude") equal to 90, so that it is directed toward the positive x-axis? But then what "third point" are you talking about? You have only mentioned one point, the origin. And what do you mean by "a distance 1-100"? A distance is a single number.

If you mean "the point at distance 1 from the origin in the direction of the positive x-axis", that is, of course, (1, 0, 0). If you mean the point at distance 100 from the origin in that direction, that is (100, 0, 0). If you mean some point at distance x, where x is between 1 and 100, from the origin in that direction, that is (x, 0, 0).
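The three examples above are all instances of the same parametric formula: a point at distance $d$ from a start point $P_0$ along a unit direction vector $\hat{u}$ is

$$P = P_0 + d\,\hat{u}$$

For the positive x-axis direction, $\hat{u} = (1, 0, 0)$, which gives $(d, 0, 0)$ as above. The only extra work in the general case is building $\hat{u}$ from the two given angles.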

3. Sep 29, 2011

### Sothh

Sorry, the direction is the longitude and latitude, or pitch and yaw.

1-100 is a single number that may range from 1 to 100 (or more).

I need the algorithm to find the 3d point that the line ends at when given a starting point, a direction, and a distance from the starting point.
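One way to sketch that algorithm: convert the two angles into a unit direction vector, then scale it by the distance and add it to the start point. The function name and the axis convention below (x forward, z up; yaw = pitch = 0 points along +x, angles in degrees) are assumptions for illustration, not anything fixed by the thread:

```python
import math

def point_at_distance(start, yaw_deg, pitch_deg, distance):
    """Return the 3D point reached by travelling `distance` from `start`
    along the direction given by yaw (rotation about the vertical axis)
    and pitch (elevation above the horizontal plane), both in degrees.

    Assumed axis convention: x forward, y left, z up;
    yaw = 0, pitch = 0 points along the positive x-axis.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)

    # Unit direction vector built from the two angles.
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)

    # Scale by the distance and offset from the start point.
    x0, y0, z0 = start
    return (x0 + distance * dx,
            y0 + distance * dy,
            z0 + distance * dz)
```

For example, `point_at_distance((0, 0, 0), 0, 0, 5)` gives `(5, 0, 0)`, and a pitch of 90 degrees sends the point straight up the z-axis. If your engine uses a different convention (e.g. y up, or yaw measured clockwise), the sine/cosine assignments need to be permuted to match.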

4. Sep 29, 2011

### paulfr

This document might help if you can work with the Cartesian, vector, or parametric forms of 3D equations for lines and planes.