bigevil
Homework Statement
This is just a general question. My fundamentals aren't very solid because I'm studying on my own at the moment.
[tex]\int_V (\nabla \cdot \bold{v}) \, dV = \oint_S \bold{v} \cdot d\bold{a} [/tex]
I am trying to work out the sign of the area element on a surface defined in spherical polar coordinates. What I learned from my current text is that in Cartesian coordinates the sign is easy to find by tracing the outward direction of the surface: for instance, for the base of a cube lying in the x-y plane, da points in the negative z-direction.
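To convince myself of that Cartesian sign rule, here is a small numerical sketch (my own check, not from the text; the vector field is chosen just for illustration). It verifies the divergence theorem on the unit cube for v = (xy, yz, zx), whose divergence is x + y + z, with the base at z = 0 entering through da = -dx dy in the z-direction:

```python
import numpy as np

# Numerical sanity check (my own sketch, not from the text): verify the
# divergence theorem on the unit cube [0,1]^3 for v = (x*y, y*z, z*x),
# which has div v = x + y + z.
n = 100
h = 1.0 / n
m = (np.arange(n) + 0.5) * h            # midpoints of n cells on [0, 1]

# Volume integral of div v (midpoint rule); the exact value is 3/2.
X, Y, Z = np.meshgrid(m, m, m, indexing="ij")
volume = np.sum(X + Y + Z) * h**3

# Surface integral with OUTWARD normals. On the face x = 1 the normal is
# +x-hat, so v . da = x*y dy dz = y dy dz; on x = 0 the normal is -x-hat,
# so v . da = -x*y dy dz = 0 there. The y and z pairs of faces work the
# same way; in particular the base z = 0 has da = -dx dy z-hat and
# contributes 0 for this field.
face_at_1 = np.sum(m) * h               # integral of one coordinate over a face = 1/2
surface = 3 * face_at_1 + 3 * 0.0

print(volume, surface)                  # both ~ 1.5
```

Both integrals agree only because the three faces at coordinate zero were taken with the inward-pointing coordinate directions negated, which is the sign rule in question.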
But what about spherical polar coordinates? Take, for instance, the region bounded by radius r, polar angle (theta) from 0 to π/2, and azimuthal angle (phi) from 0 to π/2 — one octant of a sphere. This gives a curved surface and three plane faces: one in the x-y plane, one in the y-z plane, and one in the x-z plane. Which of these, then, are positive and which negative?
2. After some thought
I was thinking: why not just define the polarity as I would in Cartesian coordinates? Then the x-z and x-y faces would be negative and the rest positive. But I'm not sure I'm on the right track here. Surely my book would have something to say about this if it were really the case.
Another way I am thinking about it is to define the direction along [tex]\hat{\bold{\phi}}[/tex] and [tex]\hat{\bold{\theta}}[/tex]. The examples I worked through in the book say something like [tex]d\bold{a} = - dx \, dy \, \hat{\bold{z}}[/tex], which ties the area element to the directions of the coordinate axes. But surely it is too much trouble to deduce the direction of each vector individually.
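One way to test which sign convention is consistent (my own numerical sketch, with the field and radius chosen purely for illustration): take the constant field v = z-hat over the octant. Its divergence is zero, so the closed-surface integral must vanish, and that only works out if the flat face in the x-y plane enters with da pointing along -z-hat:

```python
import numpy as np

# My own sketch (not from the text): check the sign convention on the
# octant x, y, z >= 0 of a sphere of radius R, using the constant field
# v = z-hat. Since div v = 0, the closed-surface integral must be zero,
# which fixes the outward normals of the flat faces.
R = 1.0
n = 400
dang = (np.pi / 2) / n
ang = (np.arange(n) + 0.5) * dang          # midpoints on [0, pi/2]

# Curved face: da = R^2 sin(theta) dtheta dphi r-hat, and v . r-hat = cos(theta).
T, P = np.meshgrid(ang, ang, indexing="ij")
curved = np.sum(np.cos(T) * R**2 * np.sin(T)) * dang**2   # ~ +pi R^2 / 4

# Flat face in the x-y plane (z = 0): outward normal is -z-hat, so
# v . da = -dx dy, i.e. minus the area of the quarter disk.
flat_xy = -(np.pi * R**2) / 4

# Flat faces in the x-z and y-z planes: outward normals -y-hat and -x-hat,
# both perpendicular to v = z-hat, so they contribute nothing.
total = curved + flat_xy
print(total)   # ~ 0, as div v = 0 demands
```

If the x-y face were instead taken with +z-hat, the total would be about +π/2 rather than zero, so the "outward from the enclosed volume" rule, not the coordinate axes themselves, is what sets the signs.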