kairosx
Homework Statement
Show that $$\operatorname{div} \left( \frac{x}{|x|^2} \right) = 2 \pi \, \delta_{(0,0)}$$ for ## x \in \mathbb{R}^2 \setminus \{ 0 \} ##, where ## \delta_{(0,0)} ## is the Dirac delta distribution centered at ## (0,0) ##.
Homework Equations
For a vector field ## f = (f_1, f_2) ##: ## \operatorname{div} f = \nabla \cdot f = \frac{\partial f_1}{\partial x_1} + \frac{\partial f_2}{\partial x_2} ##
The distributional derivative: ## \langle f'(x), \phi(x) \rangle = - \langle f(x), \phi'(x) \rangle ##, with ## \langle u, v \rangle ## defined as ## \int_{\mathbb{R}^2} u v \, dx ## and ## \phi ## a test function, i.e. ## \phi \in C^{\infty}(\mathbb{R}^2) ## with ## \operatorname{supp}(\phi) ## compact
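For reference, the same integration-by-parts identity written out for the divergence of a vector field (this is just the definition above applied componentwise, not part of the original problem statement):

```latex
\langle \operatorname{div} f, \phi \rangle
  = -\langle f, \nabla \phi \rangle
  = -\int_{\mathbb{R}^2} f \cdot \nabla \phi \, dx
```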
The Attempt at a Solution
I started with $$ f_{x_1} = \frac{\partial}{\partial x_1} \left( \frac{x}{|x|^2} \right) = \hat{e}_{x_1} \frac{1}{|x|^2} + x \, \frac{\partial}{\partial x_1} \left( \frac{1}{|x|^2} \right) $$
For ## f_{x_1} ## (and likewise ## f_{x_2} ##) I need to compute ## \frac{\partial}{\partial x_1} \left( \frac{1}{|x|^2} \right) + \frac{\partial}{\partial x_2} \left( \frac{1}{|x|^2} \right) ##, which in polar coordinates is ## \frac{d}{dr} \left( \frac{1}{r^2} \right) = \frac{-2}{r^3} ##, because the problem is rotationally symmetric and therefore all angular derivatives vanish. Altogether that's $$ f_{x_1} + f_{x_2} = (\hat{e}_{x_1} + \hat{e}_{x_2}) \frac{1}{|x|^2} + x \left( \frac{-2}{|x|^3} \right) = (\hat{e}_{x_1} + \hat{e}_{x_2}) \frac{1}{|x|^2} - 2 (\hat{e}_{x_1} + \hat{e}_{x_2}) \frac{1}{|x|^2} = - (\hat{e}_{x_1} + \hat{e}_{x_2}) \frac{1}{|x|^2}$$
But that's not the solution I'm looking for. Actually, I was hoping the result would be ## 0 ##, because the delta distribution is ## 0 ## everywhere except at ## x = 0 ##. Then I would have tried to integrate over the unit ball around zero and show that the result is ## 2 \pi ##. But now I'm confused and don't see what to do next. I guess I did something wrong with the polar coordinates, but I'm not sure.
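As an outside sanity check (a sketch using sympy, not the distributional argument itself; all variable names are my own), both halves of the expected picture can be verified symbolically: the divergence does vanish pointwise away from ## 0 ##, and the flux of ## \frac{x}{|x|^2} ## through the unit circle is ## 2\pi ##:

```python
# Symbolic checks for f(x) = x / |x|^2 in R^2, using sympy.
import sympy as sp

x1, x2, t = sp.symbols('x1 x2 t', real=True)
r2 = x1**2 + x2**2
f1, f2 = x1 / r2, x2 / r2  # components of f(x) = x / |x|^2

# 1) Pointwise divergence away from the origin: it simplifies to 0,
#    so the delta can only come from the singularity at x = 0.
div_f = sp.simplify(sp.diff(f1, x1) + sp.diff(f2, x2))
print(div_f)  # 0

# 2) Flux through the unit circle: on the circle x = (cos t, sin t),
#    the outward normal is x itself, so f . n = 1 and the flux is 2*pi,
#    which is the mass the delta must carry.
on_circle = {x1: sp.cos(t), x2: sp.sin(t)}
f_dot_n = sp.simplify(f1.subs(on_circle) * sp.cos(t)
                      + f2.subs(on_circle) * sp.sin(t))
flux = sp.integrate(f_dot_n, (t, 0, 2 * sp.pi))
print(flux)  # 2*pi
```

The second check is exactly the "integrate over the unit ball" step above, converted to a boundary integral via the divergence theorem.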