Divergence Theorem Question (Gauss' Law?)

  • #1
If ##\vec F(x,y,z)## is continuous and
## \left| \vec F \right| \leq \frac{1}{\sqrt{(x^2+y^2+z^2)^3}+1} ##
for all ##(x,y,z)##, show that
## \iiint_{\mathbb{R}^3} \nabla \cdot \vec F \, dV = 0. ##

I have been working on this problem all day, and I'm honestly not sure how to proceed. The hint given on this problem is, "Take ##B_r## to be a ball of radius r centered at the origin, apply divergence theorem, and let the radius tend to infinity." I tried letting F equal ##1/((x^2+y^2+z^2)^{3/2}+1)## and taking the divergence of that, but it didn't really seem to get me anywhere. If anyone has any suggestions for at least how to set up this proof, I would really appreciate it.
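Restating the hint in symbols (with ##B_r## the ball of radius r centered at the origin), I think the suggested setup is something like
## \iiint_{\mathbb{R}^3} \nabla \cdot \vec F \, dV = \lim_{r \to \infty} \iiint_{B_r} \nabla \cdot \vec F \, dV = \lim_{r \to \infty} \oint_{\partial B_r} \vec F \cdot \hat n \, dA, ##
where the second equality would come from applying the divergence theorem on ##B_r##.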
 

Answers and Replies

  • #2
Well, from the title it seems that you know you should use the divergence theorem. So why don't you?
 
  • #3
Since your region of integration is a ball and the integrand involves [itex]x^2+ y^2+ z^2[/itex], I would recommend changing to spherical coordinates to do that integration over the surface of the ball.
 
  • #4
Spherical was my thought too. I guess what's been confusing me is that the vector field isn't explicitly given; there's just the inequality, which indicates it exists at (0,0,0). My thought was to use the expression for the divergence in spherical coordinates:
## \nabla \cdot \vec F = \frac{1}{\rho^2}\frac{\partial}{\partial \rho}\left(\rho^2 F_\rho\right) + \frac{1}{\rho \sin\phi}\frac{\partial}{\partial \phi}\left(\sin\phi \, F_\phi\right) + \frac{1}{\rho \sin\phi}\frac{\partial F_\theta}{\partial \theta}. ##
Since you have ##1/\rho## terms, div F would go to zero as the radius ##\rho## goes to infinity. Therefore, the integral of 0 is 0.
 
  • #5
Spherical was my thought too. I guess what's been confusing me is that the vector field isn't explicitly given; there's just the inequality, which indicates it exists at (0,0,0). My thought was to use the expression for the divergence in spherical coordinates:
## \nabla \cdot \vec F = \frac{1}{\rho^2}\frac{\partial}{\partial \rho}\left(\rho^2 F_\rho\right) + \frac{1}{\rho \sin\phi}\frac{\partial}{\partial \phi}\left(\sin\phi \, F_\phi\right) + \frac{1}{\rho \sin\phi}\frac{\partial F_\theta}{\partial \theta}. ##
Since you have ##1/\rho## terms, div F would go to zero as the radius ##\rho## goes to infinity. Therefore, the integral of 0 is 0.
I don't see any application of divergence theorem in your explanation. This is how it should be:
## \int_V \nabla\cdot \vec F dV=\int_{\partial V} \vec F \cdot \hat n dA ##
where ## \partial V ## is the boundary of V. In the surface integral, ##\vec F## is evaluated only on the boundary. Take V to be the ball ##B_r## and let ##r \to \infty##: the boundary goes off to infinity, and we know ##|\vec F|## is bounded by a function that goes to zero there fast enough, so the surface integral (and hence the volume integral) should be zero.
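Concretely, using the given bound: on the sphere of radius r we have ##|\vec F| \leq \frac{1}{r^3+1}##, while the surface area is ##4\pi r^2##, so roughly
## \left| \oint_{\partial B_r} \vec F \cdot \hat n \, dA \right| \leq \frac{4\pi r^2}{r^3+1} \to 0 \quad \text{as } r \to \infty. ##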
 
  • #6
Oh okay, that makes sense. I didn't even think of it that way. Thank you so much for the suggestion!
 
