I'm not sure why this question comes to mind now, since I haven't had an E&M class in a few months, but here it is. Place a point charge at the origin, surround it with a spherical Gaussian surface, and compute the surface integral of the electric field over that surface. By Gauss's law you obviously get a nonzero result, proportional to the enclosed charge. But this seems to violate the divergence theorem, which equates the volume integral of the divergence of a field to the surface integral of the field over the boundary of that volume. The apparent violation: if you compute the divergence of the point charge's field, E = kq r̂ / r², you get zero. So if the divergence of the field is zero everywhere inside the spherical volume, how can the surface integral over the boundary be nonzero? I know I'm missing something big here, so if someone can point out where my reasoning goes wrong, that would be great. Thanks.
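To make the puzzle concrete, here is a quick numerical sketch of what I mean (my own code, with units chosen so that kq = q/(4πε₀) = 1; the function names and grid resolution are arbitrary choices). A central-difference estimate of ∇·E comes out zero at any point away from the origin, yet a midpoint-rule estimate of the outward flux through a sphere gives 4π regardless of the radius:

```python
import math

def E(x, y, z):
    """Field of a unit point charge at the origin, with q/(4*pi*eps0) = 1."""
    r = math.sqrt(x*x + y*y + z*z)
    return (x / r**3, y / r**3, z / r**3)

def divergence(x, y, z, h=1e-5):
    """Central-difference estimate of div E at (x, y, z)."""
    dEx = (E(x + h, y, z)[0] - E(x - h, y, z)[0]) / (2 * h)
    dEy = (E(x, y + h, z)[1] - E(x, y - h, z)[1]) / (2 * h)
    dEz = (E(x, y, z + h)[2] - E(x, y, z - h)[2]) / (2 * h)
    return dEx + dEy + dEz

def flux_through_sphere(R, n=200):
    """Midpoint-rule estimate of the outward flux of E through a sphere of radius R."""
    dtheta = math.pi / n
    dphi = 2 * math.pi / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * dtheta
        for j in range(n):
            phi = (j + 0.5) * dphi
            x = R * math.sin(theta) * math.cos(phi)
            y = R * math.sin(theta) * math.sin(phi)
            z = R * math.cos(theta)
            Ex, Ey, Ez = E(x, y, z)
            # Outward unit normal is (x, y, z)/R; area element is R^2 sin(theta) dtheta dphi.
            En = (Ex * x + Ey * y + Ez * z) / R
            total += En * R**2 * math.sin(theta) * dtheta * dphi
    return total

print(divergence(1.0, 2.0, 3.0))   # essentially 0 away from the origin
print(flux_through_sphere(1.0))    # about 4*pi
print(flux_through_sphere(5.0))    # about 4*pi again, independent of R
```

So numerically the divergence really does vanish everywhere I can sample it, while the flux stubbornly stays at 4π, which is exactly the contradiction I'm asking about.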