## Cartesian Dirac Delta from divergence of gradient...?

Hi, I've just found in an electrodynamics book a derivation of Gauss's law that uses a characterization of the Dirac delta I didn't know. Essentially, it states that:

$$-\nabla^{2}\left(\frac{1}{\left|\mathbf{x}-\mathbf{x}'\right|}\right)=4\pi\delta(\mathbf{x}-\mathbf{x}')$$

(x and x' are vectors, of course, and the delta is the three-dimensional one.)
I can see why it makes some sense: a direct computation shows that $\nabla^{2}(1/\left|\mathbf{x}-\mathbf{x}'\right|)=0$ everywhere except at the singularity $\mathbf{x}=\mathbf{x}'$, so whatever the Laplacian "is" must be concentrated there, just like a delta. But I can't find a rigorous proof. Can someone help me? Thanks.
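In case it helps frame the question, here is a small numerical sanity check I put together myself (it is not from the book): it verifies the two facts that make the identity plausible, namely that $\nabla^{2}(1/r)$ vanishes away from the origin, and that by the divergence theorem the flux of $\nabla(1/r)=-\hat{r}/r^{2}$ through any sphere is $-4\pi$, so the volume integral of $\nabla^{2}(1/r)$ over any ball containing the origin is $-4\pi$, as it should be for $-4\pi\delta^{3}(\mathbf{x})$.

```python
import numpy as np

def f(x, y, z):
    """f = 1/r in Cartesian coordinates."""
    return 1.0 / np.sqrt(x * x + y * y + z * z)

# 1) Away from the origin, the Laplacian of 1/r should vanish.
#    Central finite-difference Laplacian at an arbitrary point off the origin.
h = 1e-3
x0, y0, z0 = 0.7, -0.3, 0.5
lap = (f(x0 + h, y0, z0) + f(x0 - h, y0, z0)
       + f(x0, y0 + h, z0) + f(x0, y0 - h, z0)
       + f(x0, y0, z0 + h) + f(x0, y0, z0 - h)
       - 6.0 * f(x0, y0, z0)) / h**2
print("Laplacian away from origin:", lap)  # ~ 0 up to discretization error

# 2) Divergence theorem: the flux of grad(1/r) = -r_hat/r^2 through the
#    sphere r = R is independent of R. On the sphere the normal component
#    is -1/R^2 and dA = R^2 sin(theta) dtheta dphi, so the phi integral
#    gives 2*pi and the theta integral gives the total flux -4*pi.
theta = np.linspace(0.0, np.pi, 2001)
dtheta = theta[1] - theta[0]
flux = 2.0 * np.pi * np.sum(-np.sin(theta)) * dtheta
print("Flux through sphere:", flux, "vs -4*pi =", -4.0 * np.pi)
```

So numerically everything is consistent with the stated identity; what I'm after is the actual distributional proof.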
