Higgy
Homework Statement
I'm doing a problem that involves expressing, for two arbitrary vectors \vec{x} and \vec{x'},
|\vec{x}-\vec{x'}|
in spherical coordinates (\rho,\theta,\phi).
Homework Equations
Law of Cosines:
c^{2}=a^{2}+b^{2}-2ab\cos\gamma
where \gamma is the angle between the sides a and b.
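As a quick numerical sanity check (a small Python sketch with arbitrary example vectors): for two vectors u and v, the \gamma in the law of cosines is fixed by the dot product, \cos\gamma = (u \cdot v)/(|u||v|), and then c = |u - v| satisfies the identity.

```python
import math

# Law of cosines check: for the triangle with sides a = |u|, b = |v|,
# and c = |u - v|, the angle gamma between u and v satisfies
# cos(gamma) = (u . v) / (|u| |v|).
u = (1.0, 2.0, 2.0)
v = (3.0, 0.0, 4.0)

a = math.sqrt(sum(ui * ui for ui in u))          # |u| = 3
b = math.sqrt(sum(vi * vi for vi in v))          # |v| = 5
c = math.dist(u, v)                              # |u - v|
cos_gamma = sum(ui * vi for ui, vi in zip(u, v)) / (a * b)

assert math.isclose(c**2, a**2 + b**2 - 2 * a * b * cos_gamma)
```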
The Attempt at a Solution
Using the law of cosines, we can write
|\vec{x}-\vec{x'}|=(\rho^{2}+\rho'^{2}-2\rho\rho'\cos\gamma)^{\frac{1}{2}}
but I can't figure out what the angle \gamma between the vectors would be. I imagine the plane formed by the two vectors, with \gamma measured between them in that plane; how to describe this mathematically, though, is where I'm stuck. Can anyone push me in the right direction?
EDIT:
Alright, I took a new approach. I decided to express the difference in Cartesian coordinates, and then convert to spherical coordinates. In doing so, I get:
|\vec{x}-\vec{x'}|=(\rho^{2}+\rho'^{2}-2\rho\rho'[\sin\theta\sin\theta'(\cos\phi\cos\phi'+\sin\phi\sin\phi')+\cos\theta\cos\theta'])^{\frac{1}{2}}
which gives me the \gamma I was looking for. It would be nice if there were an easy way to simplify that, though...
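For what it's worth, the bracketed azimuthal part collapses via the identity \cos\phi\cos\phi'+\sin\phi\sin\phi'=\cos(\phi-\phi'), so \cos\gamma = \sin\theta\sin\theta'\cos(\phi-\phi')+\cos\theta\cos\theta'. A quick numerical check of that form against the Cartesian distance (physics convention: \theta polar, \phi azimuthal; the sample values are arbitrary):

```python
import math

def spherical_to_cartesian(rho, theta, phi):
    # physics convention: theta measured from the z-axis, phi azimuthal
    return (rho * math.sin(theta) * math.cos(phi),
            rho * math.sin(theta) * math.sin(phi),
            rho * math.cos(theta))

# two arbitrary sample points (rho, theta, phi) and (rho', theta', phi')
rho1, th1, ph1 = 2.0, 0.7, 1.1
rho2, th2, ph2 = 3.5, 2.1, -0.4

x1 = spherical_to_cartesian(rho1, th1, ph1)
x2 = spherical_to_cartesian(rho2, th2, ph2)
direct = math.dist(x1, x2)  # |x - x'| from Cartesian components

# law-of-cosines form with the simplified cos(gamma)
cos_gamma = (math.sin(th1) * math.sin(th2) * math.cos(ph1 - ph2)
             + math.cos(th1) * math.cos(th2))
formula = math.sqrt(rho1**2 + rho2**2 - 2 * rho1 * rho2 * cos_gamma)

assert math.isclose(direct, formula)
```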