# Cartesian Dirac Delta from divergence of gradient...?

by Gan_HOPE326
Tags: cartesian, delta, dirac, divergence, gradient
Hi, I've just found in an electrodynamics book a demonstration of Gauss' law involving a definition of the Dirac delta I didn't know. Essentially, it states that: $$-\nabla^{2}\left(\frac{1}{\left|\mathbf{x}-\mathbf{x}'\right|}\right)=4\pi\delta(\mathbf{x}-\mathbf{x}')$$ (x and x' are vectors, of course). I can see that it somewhat makes sense, since the Laplacian of $1/\left|\mathbf{x}-\mathbf{x}'\right|$ vanishes everywhere except at the singularity $\mathbf{x}=\mathbf{x}'$, where it blows up, but I can't find a real proof. Can someone help me? Thanks.
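One way to see why the identity is plausible (short of a full distribution-theory proof) is the divergence theorem: since $\nabla(1/r) = -\hat{r}/r^{2}$, the flux of this gradient through any sphere centered on the singularity is exactly $-4\pi$, independent of the radius, even though $\nabla^{2}(1/r) = 0$ everywhere away from the origin. That is precisely the defining behavior of $-4\pi\delta$. Below is a rough numerical sketch of this check (my own illustration, not from the book), integrating $\nabla(1/r)\cdot\hat{n}$ over a sphere with a simple midpoint quadrature in spherical angles:

```python
import numpy as np

def grad_inv_r(p):
    """Gradient of 1/r at point p (away from the origin): -r_hat / r^2."""
    r = np.linalg.norm(p)
    return -p / r**3

def flux_through_sphere(R, n_theta=200, n_phi=400):
    """Flux of grad(1/r) through a sphere of radius R centered at the origin,
    via midpoint-rule quadrature on a (theta, phi) grid."""
    thetas = (np.arange(n_theta) + 0.5) * np.pi / n_theta
    phis = (np.arange(n_phi) + 0.5) * 2.0 * np.pi / n_phi
    dtheta = np.pi / n_theta
    dphi = 2.0 * np.pi / n_phi
    total = 0.0
    for t in thetas:
        for p in phis:
            # Outward unit normal on the sphere
            n_hat = np.array([np.sin(t) * np.cos(p),
                              np.sin(t) * np.sin(p),
                              np.cos(t)])
            point = R * n_hat
            # Surface area element R^2 sin(theta) dtheta dphi
            dA = R**2 * np.sin(t) * dtheta * dphi
            total += grad_inv_r(point) @ n_hat * dA
    return total

print(flux_through_sphere(1.0))  # close to -4*pi
print(flux_through_sphere(5.0))  # same value: independent of radius
```

The flux comes out to about $-12.566 \approx -4\pi$ for every radius, so by the divergence theorem the volume integral of $\nabla^{2}(1/r)$ over any region containing the origin is $-4\pi$, while it is $0$ for regions excluding the origin: exactly the behavior of $-4\pi\delta(\mathbf{x}-\mathbf{x}')$.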
