Homework Statement
I'm making a program that tracks an object moving along the surface of a sphere (earth). I'm using a set of Cartesian coordinates whose origin is the center of the Earth, with Z directed through the north pole, X through the Greenwich meridian at the equator, and Y perpendicular to both, completing the right-handed set.
Given a Lat/Long value I can convert into a position vector by doing:
phi = (90 - lat)*pi/180
theta = long*pi/180
vector.x = sin(phi)*cos(theta)
vector.y = sin(phi)*sin(theta)
vector.z = cos(phi)
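To make that conversion concrete, here's a minimal runnable sketch in Python, assuming a unit sphere (radius 1) and angles supplied in degrees; the function name `latlong_to_vector` is just illustrative, not from the original program:

```python
import math

def latlong_to_vector(lat_deg, long_deg):
    """Convert latitude/longitude in degrees to a unit position vector.

    Convention from the post: Z through the north pole, X through the
    Greenwich meridian at the equator; sphere of radius 1 assumed.
    """
    phi = math.radians(90.0 - lat_deg)   # polar angle, measured from the north pole
    theta = math.radians(long_deg)       # azimuth, measured from the X axis
    x = math.sin(phi) * math.cos(theta)
    y = math.sin(phi) * math.sin(theta)
    z = math.cos(phi)
    return (x, y, z)
```

For example, the north pole (lat 90, long 0) maps to (0, 0, 1), and the point where the equator meets the Greenwich meridian (lat 0, long 0) maps to (1, 0, 0).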
After moving my object around I need to convert it back into lat/long. At the moment I'm just trying to take a position in lat/long, convert it into my position vector, draw it on a sphere, then convert it back.
I know my conversion TO the position vector is working properly since it points to the right spot on a sphere. What I can't get right is the conversion FROM the vector back to lat long.
Homework Equations
See above.
The Attempt at a Solution
I thought I could find phi from the Z component, then solve X and Y simultaneously for theta. So since Z = cos(phi), I figured it was easy: phi = 1/cos(Z).
Only that didn't work. I sat down, thought for a bit, played around with cos and sec, and realized cos and sec aren't inverse operations (which I thought for sure they were) — sec is the reciprocal of cos, so sec(cos(a)) != a.
There's a simple little rule for going from A = cos(angle) to an expression for the angle when you know A. Can someone please tell me what it is? I've spent hours on something I thought I learned back in high school!
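For reference, the standard inverse of cosine is the arccosine (acos in most languages), not 1/cos, which is the secant. A minimal sketch of the reverse mapping, assuming a unit sphere and the same axis convention as above (the function name `vector_to_latlong` is just illustrative):

```python
import math

def vector_to_latlong(x, y, z):
    """Recover latitude/longitude in degrees from a unit position vector."""
    phi = math.acos(z)        # arccosine: the true inverse of cos, so acos(cos(a)) == a for a in [0, pi]
    theta = math.atan2(y, x)  # atan2 uses the signs of y and x to pick the correct quadrant
    lat = 90.0 - math.degrees(phi)
    long_ = math.degrees(theta)
    return (lat, long_)
```

Note that atan2(y, x) is preferable to solving X and Y separately, since a plain arctangent of y/x can't distinguish, say, longitude 45 from longitude -135.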