How Do You Calculate the Coordinates of a Point Based on Distance and Gradient?

AI Thread Summary
To calculate the coordinates of point B from point A, the distance AB, and the gradient, work relative to A. Treating A as the origin (0, 0), the line through A has equation y = mx, where m is the slope, so points on it have the form (x, mx). The distance from A to (x, mx) is √(x² + (mx)²) = x√(1 + m²), so setting L = x√(1 + m²) and solving gives x = L/√(1 + m²). The coordinates of B then follow by adding the offsets (x, mx) to A's coordinates.
Inquisitus
Not actually homework, but just a general query...

I'm trying to write a program that will predict the trajectory of a moving point by analysing several positions through which the point has passed.

Part of this involves finding the coordinates of a point B, given the coordinates of another point A, the length of AB and its gradient. Obviously there are two possible points... one to the right of A and one to the left, but I'm only interested in the one to the right.

I basically need two simple formulae in terms of the x and y coordinates of B, which I can apply in the program itself, although I wouldn't mind knowing how the formulae were reached as well :smile:

I've tried getting my head around it but I'm just getting confused. Any help would be greatly appreciated, thanks!
 
I'm not sure what you mean by the gradient. Is that the slope of the line connecting A to B? Are you asking how to find a point B=(x2,y2) given a point A=(x1,y1), and the length L and slope m of AB?

If so, this can be found by pretending A is at (0,0) (you can add the coordinates of A back in at the end). Then the line containing AB has the equation y = mx, so points along the line have coordinates of the form (x, mx). The distance from A to a point (x, mx) is ##\sqrt{x^2 + m^2x^2} = x\sqrt{1+m^2}## (for x ≥ 0). Then you solve ##L = x\sqrt{1+m^2}## for x, which gives ##x = L/\sqrt{1+m^2}##; taking the positive root picks out the point to the right of A. This x is the difference between the x values of A and B, and multiplying it by m gives the difference between the y values, so ##B = \left(x_1 + \frac{L}{\sqrt{1+m^2}},\; y_1 + \frac{mL}{\sqrt{1+m^2}}\right)##.
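Since the original poster wants formulae for a program, here is a minimal sketch of that calculation in Python (the function name and the example values are illustrative, not from the thread):

```python
import math

def point_to_right(ax, ay, L, m):
    """Return B, the point at distance L from A = (ax, ay) along a line
    of slope m, choosing the solution to the right of A (larger x)."""
    dx = L / math.sqrt(1 + m**2)  # positive root of L = x * sqrt(1 + m^2)
    dy = m * dx                   # corresponding rise for that run
    return ax + dx, ay + dy

# Example: A = (1, 2), distance 5, slope 3/4 gives B = (5.0, 5.0)
print(point_to_right(1, 2, 5, 0.75))
```

One caveat: a slope m only exists for non-vertical lines, so a trajectory program may be better served by working with a direction vector (dx, dy) instead of a gradient.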
 