Divergence Theorem Question (Gauss' Law?)

SUMMARY

The discussion centers on applying the Divergence Theorem to show that the integral of the divergence of a continuous vector field F over R³ is zero. The problem suggests taking a ball of radius r centered at the origin and letting r tend to infinity. Participants recommend converting to spherical coordinates and applying Gauss' Theorem for Divergence. The conclusion is that since the flux of F through the boundary sphere of radius r vanishes as r tends to infinity, the integral evaluates to zero.

PREREQUISITES
  • Understanding of the Divergence Theorem
  • Knowledge of spherical coordinates
  • Familiarity with vector calculus
  • Concept of limits in calculus
NEXT STEPS
  • Study the Divergence Theorem in detail
  • Learn about spherical coordinate transformations
  • Explore examples of vector fields and their divergences
  • Investigate the implications of limits in multivariable calculus
USEFUL FOR

Students and professionals in mathematics, physics, and engineering who are working with vector fields and integral theorems, particularly those interested in advanced calculus and mathematical proofs.

vector013
If ## \vec F(x,y,z) ## is continuous and
## \left| \vec F(x,y,z) \right| \leq \frac{1}{\sqrt{(x^2+y^2+z^2)^3}+1} ##
for all ## (x,y,z) ##, show that
## \iiint_{\mathbb{R}^3} \nabla \cdot \vec F \, dV = 0. ##

I have been working on this problem all day, and I'm honestly not sure how to proceed. The hint given on this problem is, "Take ## B_r ## to be a ball of radius ## r ## centered at the origin, apply the divergence theorem, and let the radius tend to infinity." I tried letting ## F = \frac{1}{(x^2+y^2+z^2)^{3/2}+1} ## and taking the divergence of that, but it didn't really seem to get me anywhere. If anyone has any suggestions for at least how to set up this proof, I would really appreciate it.
 
Well, from the title it seems that you know you should use the divergence theorem. So why don't you?
 
Since your region of integration is a ball and the integrand involves ## x^2 + y^2 + z^2 ##, I would recommend changing to spherical coordinates to do that integration over the surface of the ball.
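A quick sanity check on the spherical setup (a sympy sketch of my own, not part of the thread): integrating the standard spherical area element ## dA = r^2 \sin\phi \, d\phi \, d\theta ## over the full angular range recovers the surface area of the sphere, which is the factor that will matter when bounding the flux later.

```python
import sympy as sp

r, phi, theta = sp.symbols('r phi theta', positive=True)

# Surface area of a sphere of radius r from the spherical area element
# dA = r^2 sin(phi) dphi dtheta, with phi in [0, pi] and theta in [0, 2*pi]
area = sp.integrate(
    sp.integrate(r**2 * sp.sin(phi), (phi, 0, sp.pi)),
    (theta, 0, 2 * sp.pi),
)
print(area)  # 4*pi*r**2
```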
 
Spherical was my thought too. I guess what's been confusing me is that the vector field isn't explicitly given; there's just the inequality, which only bounds it at each point, including ## (0,0,0) ##. My thought was to use the divergence in spherical coordinates:
## \nabla \cdot \vec F = \frac{1}{\rho^2}\frac{\partial}{\partial \rho}\left(\rho^2 F_\rho\right) + \frac{1}{\rho \sin\phi}\frac{\partial}{\partial \phi}\left(\sin\phi \, F_\phi\right) + \frac{1}{\rho \sin\phi}\frac{\partial F_\theta}{\partial \theta}. ##
Since you have ## 1/\rho ## terms, ## \nabla \cdot \vec F ## would go to zero as the radius ## \rho ## goes to infinity. Therefore, the integral of 0 is 0.
 
vector013 said:
Spherical was my thought too. I guess what's been confusing me is that the vector field isn't explicitly given; there's just the inequality, which only bounds it at each point, including ## (0,0,0) ##. My thought was to use the divergence in spherical coordinates:
## \nabla \cdot \vec F = \frac{1}{\rho^2}\frac{\partial}{\partial \rho}\left(\rho^2 F_\rho\right) + \frac{1}{\rho \sin\phi}\frac{\partial}{\partial \phi}\left(\sin\phi \, F_\phi\right) + \frac{1}{\rho \sin\phi}\frac{\partial F_\theta}{\partial \theta}. ##
Since you have ## 1/\rho ## terms, ## \nabla \cdot \vec F ## would go to zero as the radius ## \rho ## goes to infinity. Therefore, the integral of 0 is 0.
I don't see any application of the divergence theorem in your explanation. This is how it should be:
## \int_V \nabla \cdot \vec F \, dV = \int_{\partial V} \vec F \cdot \hat n \, dA ##
where ## \partial V ## is the boundary of ## V ##. In the surface integral, ## \vec F ## is evaluated only on the boundary. Take ## V = B_r ##: on the sphere of radius ## r ##, the hypothesis gives ## |\vec F| \leq \frac{1}{r^3+1} ##, so the flux is bounded in magnitude by ## \frac{4\pi r^2}{r^3+1} ##, which goes to zero as ## r \to \infty ##. Hence the integral is zero.
 
Oh okay, that makes sense. I didn't even think of it that way. Thank you so much for the suggestion!
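To make the thread's conclusion concrete, here is a sympy check with a hypothetical radial field of my own choosing, ## \vec F = \frac{\hat\rho}{\rho^3+1} ##, which satisfies the problem's bound with equality. The volume integral of its divergence over the ball of radius ## R ## equals the flux ## \frac{4\pi R^2}{R^3+1} ##, which vanishes as ## R \to \infty ##.

```python
import sympy as sp

rho, R = sp.symbols('rho R', positive=True)

# Hypothetical radial field F = f(rho) * rhat with f = 1/(rho^3 + 1),
# which satisfies |F| <= 1/(sqrt((x^2+y^2+z^2)^3) + 1)
f = 1 / (rho**3 + 1)

# Divergence of a purely radial field in spherical coordinates:
# div F = (1/rho^2) d/drho (rho^2 * f(rho))
div_F = sp.diff(rho**2 * f, rho) / rho**2

# Volume integral over the ball of radius R; for a radial integrand,
# dV reduces to 4*pi*rho^2 drho
vol_integral = sp.integrate(div_F * 4 * sp.pi * rho**2, (rho, 0, R))

print(sp.simplify(vol_integral))          # expect 4*pi*R**2/(R**3 + 1)
print(sp.limit(vol_integral, R, sp.oo))   # 0
```

This matches the flux through the boundary sphere computed by the divergence theorem, so the total integral over all of ## \mathbb{R}^3 ## is zero for this field.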
 
