iamben
Hi everyone,
I'm no mathematician and I've found myself a bit stuck with this problem.
I have a unit sphere centred at the origin and two straight lines that run through the centre. For each line I know, in spherical coordinates, its angles phi (horizontal rotation) and theta (vertical rotation). Together the two lines define a plane through the centre, which cuts the sphere in a great circle, and I want to calculate the angle to each line from some zero reference direction within that plane. The obvious references seem to be theta = pi/2 (I'll never have a case where theta = 0 or pi) or phi = 0, but I'm stumped on how to calculate my angles from either of those references.
Anyone have any ideas?
Hopefully that makes sense.
Thanks.
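Edit: to make the setup concrete, here's a rough sketch in Python/NumPy of what I think I'm trying to do (assuming the physics convention where theta is measured down from the +z axis and phi around z from the +x axis; the angles in the example are made up). The idea: convert each line's (theta, phi) to a Cartesian unit vector, take the cross product to get the plane's normal, project the theta = pi/2, phi = 0 direction (+x) into the plane as the zero reference, and read off signed angles with atan2.

```python
import numpy as np

def sph_to_cart(theta, phi):
    """Unit vector for spherical angles, assuming theta is measured
    down from the +z axis and phi around z from the +x axis."""
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def in_plane_angle(ref, v, normal):
    """Signed angle from ref to v, measured within the plane whose
    normal is given (orientation follows the normal's direction)."""
    n = normal / np.linalg.norm(normal)
    return np.arctan2(np.dot(np.cross(ref, v), n), np.dot(ref, v))

# The two lines as direction vectors (example angles, made up).
v1 = sph_to_cart(np.pi / 3, 0.2)
v2 = sph_to_cart(np.pi / 2, 1.1)

# Normal to the plane the two lines span.
n = np.cross(v1, v2)

# Zero reference: project the theta = pi/2, phi = 0 direction (+x)
# into the plane, then normalise it.
x = np.array([1.0, 0.0, 0.0])
ref = x - (np.dot(x, n) / np.dot(n, n)) * n
ref /= np.linalg.norm(ref)

angle1 = in_plane_angle(ref, v1, n)
angle2 = in_plane_angle(ref, v2, n)
```

atan2 gives a signed angle in (-pi, pi], so each line gets a direction-aware angle from the reference rather than just a magnitude; the difference angle2 - angle1 should match the ordinary angle between the two lines. No idea if this is the standard way to do it, which is why I'm asking.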