Proving div(x/|x|^2) = 2πδ(0,0) in 2D with the Distributional Derivative

  • Thread starter kairosx
  • Tags
    Divergence
In summary: the divergence has to be understood in the distributional sense. Away from the origin the classical divergence of ##x/|x|^2## vanishes; pairing the field with a test function ##\phi##, integrating by parts, and switching to polar coordinates yields ##\langle div(x/|x|^2), \phi \rangle = 2\pi \phi(0,0)##.
  • #1
kairosx

Homework Statement


Show that $$div ( \frac{x}{|x|^2} ) = 2 \pi \delta_{(0,0)}$$ with ## x \in R^2 \setminus \{ 0 \} ## and ## \delta_{(0,0)} ## being the Dirac delta distribution centered at ## (0,0) ##.

Homework Equations


## div (f(x)) = \nabla \cdot (f(x)) = f_{x_1} + f_{x_2} ##
The distributional derivative: ## <f'(x),\phi(x)> = - <f(x),\phi'(x)> ## with ##<u,v>## defined as ## \int_{R^2} u v \, dx ## and ## \phi ## being a test function, ## \phi \in C^{\infty}(R^2) ## with ## supp(\phi) ## compact

The Attempt at a Solution


I started with $$ f_{x_1} = \frac{d}{dx_1} (x \cdot |x|^{-2}) = \hat{e}_{x1} \frac{1}{|x|^2} + x \frac{d}{dx_1} (\frac{1}{|x|^2}) $$
Together with ## f_{x_1} ## I need to compute ## \frac{d}{dx_1} (\frac{1}{|x|^2}) + \frac{d}{dx_2} (\frac{1}{|x|^2}) = \nabla \cdot (\frac{1}{|x|^2}) ## which is in polar coordinates ## \frac{d}{dr} (\frac{1}{r^2}) = \frac{-2}{r^3} ## because the problem is rotationally symmetric and therefore all derivatives of the angles vanish. Altogether that's $$ f_{x_1} + f_{x_2} = (\hat{e}_{x1} + \hat{e}_{x2} ) \frac{1}{|x|^2} + x(\frac{-2}{|x|^3}) = (\hat{e}_{x1} + \hat{e}_{x2} ) \frac{1}{|x|^2} - 2 (\hat{e}_{x1} + \hat{e}_{x2} )(\frac{1}{|x|^2}) = - (\hat{e}_{x1} + \hat{e}_{x2} ) \frac{1}{|x|^2}$$.
But that's not the solution I am looking for. Actually I was hoping the result would be ## 0 ##, because the delta distribution is ## 0 ## everywhere except at ## x = 0 ##. Then I would have tried to integrate over the unit ball around zero and show that the result is ## 2 \pi ##. But now I'm confused and don't see what to do next. I guess I did something wrong with the polar coordinates, but I'm not sure.
 
  • #2
Why are you not applying the definition of the distributional derivative that you put into your relevant equations?

kairosx said:
Together with ## f_{x_1} ## I need to compute ## \frac{d}{dx_1} (\frac{1}{|x|^2}) + \frac{d}{dx_2} (\frac{1}{|x|^2}) = \nabla \cdot (\frac{1}{|x|^2}) ## which is in polar coordinates ## \frac{d}{dr} (\frac{1}{r^2}) = \frac{-2}{r^3} ## because the problem is rotationally symmetric and therefore all derivatives of the angles vanish.

It is not clear to me what you are trying to do here. Are you trying to compute the divergence (which would not make sense since ##1/r^2## is a scalar field) or the gradient (which is ##\vec e^i \partial_i \phi## for a scalar field ##\phi##)?

I suggest that you apply the definition of the distributional derivative instead.
 
  • #3
You mean like
$$\frac{d}{dx_1} (\frac{1}{|x|^2}) = < \frac{d}{dx_1} \frac{1}{|x|^2},\phi(x)> = - <\frac{1}{|x|^2}, \frac{d}{dx_1} \phi(x)> = -\int_{-\infty}^{\infty} \frac{1}{|x|^2} \frac{d}{dx_1} \phi(x) dx = - \int_{0}^{\infty} \frac{1}{x^2} \frac{d}{dx_1} \phi(x) dx - \int_{-\infty}^{0} \frac{1}{(-x)^2} \frac{d}{dx_1} \phi(x) dx .$$
I'm actually not sure about the last step, because what is ## x^2 ## when ##x## is a vector? I guess I would have to define it myself as ## x^2 = x_1^2 + x_2^2 ##? I would then go on with
$$ - \int_{0}^{\infty} \frac{1}{x_1^2 + x_2^2} \frac{d}{dx_1} \phi(x) dx - \int_{-\infty}^{0} \frac{1}{x_1^2 + x_2^2} \frac{d}{dx_1} \phi(x) dx =
\int_{0}^{\infty} \frac{d}{dx_1} \frac{1}{x_1^2 + x_2^2} \phi(x) dx + \int_{-\infty}^{0} \frac{d}{dx_1} \frac{1}{x_1^2 + x_2^2} \phi(x) dx =
\int_{0}^{\infty} \frac{-2 x_1}{(x_1^2 + x_2^2)^2} \phi(x) dx + \int_{-\infty}^{0} \frac{-2 x_1}{(x_1^2 + x_2^2)^2} \phi(x) dx
$$
but here I'm stuck again. I just don't know how to deal with those integrals.
 
  • #4
The integral is with respect to ##x_1## and ##x_2##, not only with respect to some ##x##. Also, you should use the definition of the distributional derivative for the vector field, not for ##1/x^2##.
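Spelled out, a sketch of that definition for a vector field ##F = (F_1, F_2)##: the distributional divergence acts on a test function ##\phi## by moving both partial derivatives onto ##\phi##,
$$
< \frac{\partial F_1}{\partial x_1} + \frac{\partial F_2}{\partial x_2}, \phi >
= - \int_{R^2} \left( F_1 \frac{\partial \phi}{\partial x_1} + F_2 \frac{\partial \phi}{\partial x_2} \right) d(x_1,x_2)
= - \int_{R^2} F \cdot \nabla \phi \; d(x_1,x_2),
$$
so no derivative of the singular field itself ever has to be computed.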
 
  • #5
Actually, by saying ## x \in R^2 ## I always mean ## \int dx := \int d(x_1,x_2) ##; I definitely should have stated that explicitly. Okay, I'm going to start from scratch with the distributional derivative for my vector field :-)

$$ <\frac{d}{dx_1} \frac{x}{|x|^2} + \frac{d}{dx_2} \frac{x}{|x|^2}, \phi(x_1,x_2)> =
\int \int ( \frac{d}{dx_1} \frac{x}{|x|^2} + \frac{d}{dx_2} \frac{x}{|x|^2} ) \phi(x_1,x_2) \, d(x_1,x_2) = ...$$

and with ## |x|^2 = x_{1}^{2} + x_{2}^{2}## and ## x = (x_1,x_2)^T ##

$$ \int ( \frac{d}{dx_1} \frac{x_1}{x_{1}^{2} + x_{2}^{2}} + \frac{d}{dx_2} \frac{x_1}{x_{1}^{2} + x_{2}^{2}} ) \phi(x_1,x_2) d(x_1) + \int ( \frac{d}{dx_1} \frac{x_2}{x_{1}^{2} + x_{2}^{2}} + \frac{d}{dx_2} \frac{x_2}{x_{1}^{2} + x_{2}^{2}} ) \phi(x_1,x_2) d(x_2) = ... $$

Considering the identities
## \frac{d}{dx_1} \frac{x_1}{x_{1}^{2} + x_{2}^{2}} = \frac{1 \cdot (x_{1}^{2} + x_{2}^{2}) - x_1 \cdot 2 x_1}{(x_{1}^{2} + x_{2}^{2})^2} = \frac{(x_{2}^{2} - x_{1}^{2})}{(x_{1}^{2} + x_{2}^{2})^2} ##
## \frac{d}{dx_2} \frac{x_1}{x_{1}^{2} + x_{2}^{2}} = \frac{0 \cdot (x_{1}^{2} + x_{2}^{2}) - x_1 \cdot 2 x_2}{(x_{1}^{2} + x_{2}^{2})^2} = - \frac{2 x_1 x_2}{(x_{1}^{2} + x_{2}^{2})^2} ##
## \frac{d}{dx_1} \frac{x_2}{x_{1}^{2} + x_{2}^{2}} = - \frac{2 x_1 x_2}{(x_{1}^{2} + x_{2}^{2})^2} ##
## \frac{d}{dx_2} \frac{x_2}{x_{1}^{2} + x_{2}^{2}} = \frac{(x_{1}^{2} - x_{2}^{2})}{(x_{1}^{2} + x_{2}^{2})^2} ##

$$ \int ( \frac{(x_{2}^{2} - x_{1}^{2})}{(x_{1}^{2} + x_{2}^{2})^2} - \frac{2 x_1 x_2}{(x_{1}^{2} + x_{2}^{2})^2} ) \phi(x_1,x_2) d(x_1) + \int (- \frac{2 x_1 x_2}{(x_{1}^{2} + x_{2}^{2})^2} + \frac{(x_{1}^{2} - x_{2}^{2})}{(x_{1}^{2} + x_{2}^{2})^2} ) \phi(x_1,x_2) d(x_2) = ... ?$$

But I actually don't see where this is going. Again I end up with two integrals that I can't solve.
 
  • #6
Let me stop you from the beginning. The divergence of ##x/|x|^2## is not given by
$$
\newcommand{\dd}[2]{\frac{\partial #1}{\partial #2}}
\dd{(x/|x|^2)}{x_1} + \dd{(x/|x|^2)}{x_2}
$$
it is given by
$$
\dd{(x_1/|x|^2)}{x_1} + \dd{(x_2/|x|^2)}{x_2}.
$$
Apart from that, you still have not utilised the definition of the distributional derivative. Also, once you do, there is a very useful coordinate system called polar coordinates.
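As a sketch of where this is heading (using the identities already computed in the previous post): the correct combination vanishes pointwise away from the origin,
$$
\frac{\partial}{\partial x_1} \frac{x_1}{x_1^2 + x_2^2} + \frac{\partial}{\partial x_2} \frac{x_2}{x_1^2 + x_2^2}
= \frac{x_2^2 - x_1^2}{(x_1^2 + x_2^2)^2} + \frac{x_1^2 - x_2^2}{(x_1^2 + x_2^2)^2} = 0 \qquad (x \neq 0),
$$
so the entire ##2 \pi \delta_{(0,0)}## has to come from the singularity at the origin, and that is exactly what the distributional pairing picks up. Polar coordinates help because the field is ##\hat e_r / r## while the area element is ##r \, dr \, d\theta##, so the two factors of ##r## cancel.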
 
  • #7
Relatively minor point: why are you writing ## \frac{x}{|x|^2}##? Since ##|x|^2 = x^2##, the fraction is the same as ##\frac 1 x##.
 
  • #8
Mark44 said:
Relatively minor point: why are you writing ## \frac{x}{|x|^2}##? Since ##|x|^2 = x^2##, the fraction is the same as ##\frac 1 x##.
But ## x \in R^2 ## is a vector, ## x = (x_1,x_2)^T ##; how would ## \frac{1}{\text{vector}} ## be defined?

Okay, I started once again:
##x \in R^2## is a vector with the components ##x = (x_1,x_2)^T##

$$
< \frac{ \partial (x_1/|x|^2) }{\partial x_1} + \frac{ \partial (x_2/|x|^2)}{ \partial x_2}, \phi(x) > =\\
- <\frac{x_1}{|x|^2},\phi_{x_1}(x)> - <\frac{x_2}{|x|^2},\phi_{x_2}(x)> = ...
$$
using polar coordinates: ##x_1 = r \cos \theta##, ##x_2 = r \sin \theta##, and in particular ##|x| = r##, and not forgetting the Jacobian determinant, which is ##r## in two dimensions

$$
- <\frac{r \cos \theta}{r^2} r,\phi_{x_1}(x)> - <\frac{r \sin \theta}{r^2} r,\phi_{x_2}(x)> =\\
-\int_{0}^{2 \pi} \int_{0}^{\infty} \left( \cos \theta \, \phi_{x_1}(x) + \sin \theta \, \phi_{x_2}(x) \right) dr \, d \theta = ...\\
$$

but now I don't know how to express ##\phi_{x_1}(x)## and ##\phi_{x_2}(x)## in polar coordinates. How do I express ##\frac{d}{dx_1}## in polar coordinates? ##\frac{d}{d(r \cos \theta)}## isn't really helpful unless I only have expressions that explicitly contain ##(r \cos \theta)##, is it?

Another approach I tried was:

$$
-\int_{0}^{2 \pi} \int_{0}^{\infty} (\cos \theta,\sin \theta) \cdot \nabla \phi(x) \, dr \, d \theta = \\
\int_{0}^{2 \pi} \int_{0}^{\infty} div(\cos \theta,\sin \theta) \, \phi(x) \, dr \, d \theta =... \\
$$

and the divergence in polar coordinates is ## div(F) = \nabla \cdot F = \frac{1}{r} \frac{\partial(r F_r)}{ \partial r} + \frac{1}{r} \frac{\partial F_{\theta}}{ \partial \theta}##, and I would try to build the divergence of the vector, but I already assumed the vector to have a Cartesian basis, which is why I probably shouldn't apply the divergence formula in polar coordinates. And if I apply it in Cartesian coordinates the expression gets really messy, so I don't think that's the way I am supposed to solve the problem.
 
  • #9
kairosx said:
but now I don't know how to express ##\phi_{x_1}(x)## and ##\phi_{x_2}(x)## in polar coordinates
Have you heard of the chain rule?
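That is, treating ##\phi## as a function of ##r(x_1,x_2)## and ##\theta(x_1,x_2)##, the general form (to be filled in with the concrete partial derivatives of ##r## and ##\theta##) is
$$
\frac{\partial \phi}{\partial x_i} = \frac{\partial \phi}{\partial r}\frac{\partial r}{\partial x_i} + \frac{\partial \phi}{\partial \theta}\frac{\partial \theta}{\partial x_i}, \qquad i = 1,2.
$$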
 
  • #10
Mark44 said:
Relatively minor point: why are you writing ## \frac{x}{|x|^2}##? Since ##|x|^2 = x^2##, the fraction is the same as ##\frac 1 x##.
kairosx said:
But ## x \in R^2 ## is a vector, ## x = (x_1,x_2)^T ##; how would ## \frac{1}{\text{vector}} ## be defined?
It wasn't clear to me that you were working with a vector here.
 
  • #11
Also, some LaTeX pointers:

< becomes ##<##, a relation (less than) and is typeset as such. For inner products, use \langle ##\langle## and \rangle ##\rangle##, which are delimiters.

R becomes ##R##, typically used to represent numbers such as a radius. \mathbb R becomes ##\mathbb R##, the notation for the set of real numbers.

To avoid confusion, it is often good to use vector arrows or boldface to represent vectors. For example \vec v (##\vec v##) or \boldsymbol v (##\boldsymbol v##).

It is often good to define commands using \newcommand. It works here too. I defined \newcommand{\dd}[2]{\frac{\partial #1}{\partial #2}} in an earlier post. It will work in any post here until a second page is generated. For example \dd{f}{x} becomes
$$
\dd{f}{x}
$$
 
  • #12
Okay, thanks for the tips about the notation, and sorry again for the confusion about ##x## being a vector! :-) I tried to make progress by considering
## r = \sqrt{x_1^2 + x_2^2} ##
## \theta = \arctan(\frac{x_2}{x_1}) ##
## \frac{dr}{dx_1} = \frac{ x_1}{\sqrt{x_1^2 + x_2^2}} = \frac{ x_1}{r} = \frac{ r \cos \theta}{r} = \cos \theta ##
## \frac{d \theta}{dx_1} = \frac{1}{1 + (\frac{x_2}{x_1})^2} \cdot \frac{-x_2}{x_1^2} = \frac{-x_2}{x_1^2 + x_2^2} = \frac{-x_2}{r^2} = \frac{-r \sin \theta}{r^2} = \frac{-\sin \theta}{r}##
## \frac{dr}{dx_2} = \sin \theta ##
## \frac{d \theta}{dx_2} = \frac{1}{1 + (\frac{x_2}{x_1})^2} \cdot \frac{1}{x_1} = \frac{\cos \theta}{r}##

Then I can write

\begin{equation}
\frac{d}{dx_1} \phi(r(x_1,x_2), \theta(x_1,x_2)) = \phi_r \frac{dr}{dx_1} + \phi_{\theta} \frac{d \theta}{dx_1} = \phi_r \cos \theta + \phi_{\theta} \frac{-\sin \theta}{r}
\end{equation}

respectively

\begin{equation}
\frac{d}{dx_2} \phi(r(x_1,x_2), \theta(x_1,x_2)) = \phi_r \frac{dr}{dx_2} + \phi_{\theta} \frac{d \theta}{dx_2} = \phi_r \sin \theta + \phi_{\theta} \frac{\cos \theta}{r}
\end{equation}

and by inserting these into the earlier equation:
\begin{align}
-\int_{0}^{2 \pi} \int_{0}^{\infty} \cos \theta \, \phi_{x_1}(x) + \sin \theta \, \phi_{x_2}(x) \, dr \, d \theta &=
-\int_{0}^{2 \pi} \int_{0}^{\infty} \cos \theta \left(\phi_r \cos \theta + \phi_{\theta} \frac{-\sin \theta}{r}\right) + \sin \theta \left( \phi_r \sin \theta + \phi_{\theta} \frac{\cos \theta}{r}\right) dr \, d \theta \\
&= -\int_{0}^{2 \pi} \int_{0}^{\infty} \phi_r - \phi_{\theta}\frac{\cos \theta \sin \theta}{r} + \phi_{\theta} \frac{\cos \theta \sin \theta}{r} \, dr \, d \theta \\
&= -\int_{0}^{2 \pi} \int_{0}^{\infty} \phi_r \, dr \, d \theta \\
&= -2 \pi \left[ \phi \big|_{r=\infty} - \phi \big|_{r=0} \right] = 2 \pi \phi(0,0) = 2 \pi \langle \delta_{(0,0)}, \phi \rangle
\end{align}
(where ##\phi \big|_{r=\infty} = 0## because ##\phi## has compact support).

And that's it, I guess? :-D Thank you so much for your time and help! :-)
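As an extra sanity check, here is a small numerical sketch (it assumes NumPy and SciPy are available, and it uses a Gaussian in place of a compactly supported test function, which decays fast enough for a rough check) that ##-\int_{R^2} \frac{x}{|x|^2} \cdot \nabla \phi \, d(x_1,x_2) \approx 2 \pi \phi(0,0)##:

```python
# Rough numerical check of <div(x/|x|^2), phi> = -∫ (x/|x|^2)·∇phi dx = 2*pi*phi(0,0).
import numpy as np
from scipy.integrate import dblquad

a, b = 0.3, -0.2  # centre of the (hypothetical) Gaussian test function

def phi(x1, x2):
    # Gaussian stand-in for a compactly supported test function
    return np.exp(-((x1 - a) ** 2 + (x2 - b) ** 2))

def grad_phi(x1, x2):
    p = phi(x1, x2)
    return -2.0 * (x1 - a) * p, -2.0 * (x2 - b) * p

def integrand(r, t):
    # (x/|x|^2)·∇phi in polar coordinates, already multiplied by the Jacobian r,
    # so the 1/r of the field cancels and the integrand is smooth at r = 0.
    x1, x2 = r * np.cos(t), r * np.sin(t)
    px, py = grad_phi(x1, x2)
    return np.cos(t) * px + np.sin(t) * py

# r over [0, 50] (effectively [0, infinity)), theta over [0, 2*pi]
I, _ = dblquad(integrand, 0.0, 2.0 * np.pi, 0.0, 50.0)

print(-I)                           # left-hand side of the identity
print(2.0 * np.pi * phi(0.0, 0.0))  # right-hand side; the two should agree closely
```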
 
  • #13
While you got it correctly, keep in mind that it is actually easier (although this may be subjective) to express ##x_1## and ##x_2## in terms of ##r## and ##\theta## and differentiate those expressions wrt ##x_1## and ##x_2## and then solve for the partial derivatives of ##r## and ##\theta##. Maybe it is just me who does not like the inversion formulas in terms of the square root and arctan...
 
  • #14
Orodruin said:
While you got it correctly, keep in mind that it is actually easier (although this may be subjective) to express ##x_1## and ##x_2## in terms of ##r## and ##\theta## and differentiate those expressions wrt ##x_1## and ##x_2## and then solve for the partial derivatives of ##r## and ##\theta##. Maybe it is just me who does not like the inversion formulas in terms of the square root and arctan...
I'm not sure I understand what you mean. Actually, I really don't like the square root and arctan expressions either, but how else would you do it? You mean expressing ##x_1## and ##x_2## in terms of ##r## and ##\theta##, like ## x_1 = r \cos \theta ## and ## x_2 = r \sin \theta ##? But if I then differentiate ## \frac{ dx_1(r, \theta) }{dx_1} ## (??) and ## \frac{ dx_2(r, \theta) }{dx_2} ## (??), what would I expect to get?
 
  • #15
Differentiate both sides of the relations, considering ##x_i## as independent variables and ##r## and ##\theta## as functions of those variables.
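For example, a sketch of that approach for the ##x_1## derivatives (the ##x_2## ones work the same way): differentiating ##x_1 = r \cos \theta## and ##x_2 = r \sin \theta## with respect to ##x_1## at fixed ##x_2## gives
$$
1 = \frac{\partial r}{\partial x_1} \cos \theta - r \sin \theta \, \frac{\partial \theta}{\partial x_1},
\qquad
0 = \frac{\partial r}{\partial x_1} \sin \theta + r \cos \theta \, \frac{\partial \theta}{\partial x_1},
$$
and solving this linear system gives ##\partial r / \partial x_1 = \cos \theta## and ##\partial \theta / \partial x_1 = - \sin \theta / r##, the same results as in post #12 but without square roots or arctan.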
 

1. What does "divergence" mean in ##div(x/|x|^2)##?

Here divergence is the vector-calculus operator: for a vector field ##F = (F_1, F_2)## on ##R^2##, ##div(F) = \partial F_1/\partial x_1 + \partial F_2/\partial x_2##. It does not refer to the divergence of a sequence or a limit.

2. How is the divergence calculated in this problem?

Away from the origin the partial derivatives can be computed directly, and they cancel, so the pointwise divergence of ##x/|x|^2## is zero for ##x \neq 0##. Because the field is singular at the origin, the divergence there has to be taken in the distributional sense: pair the field with a test function ##\phi##, move the derivatives onto ##\phi## via ##\langle div(F), \phi \rangle = -\langle F, \nabla \phi \rangle##, and evaluate the resulting integral, most conveniently in polar coordinates.

3. What is the significance of the absolute value in the expression ##x/|x|^2##?

##|x| = \sqrt{x_1^2 + x_2^2}## is the Euclidean norm of the vector ##x##, so ##x/|x|^2## is a radial vector field of magnitude ##1/|x|##. The norm sits in the denominator and is what makes the field singular at the origin.

4. What is the behavior of the field at ##x = 0##?

The field is not defined at the origin, and its magnitude blows up like ##1/|x|## as ##x \to 0##. All of its divergence is concentrated at that single point, which is exactly what the Dirac delta ##2 \pi \delta_{(0,0)}## on the right-hand side expresses.

5. How does this relate to other concepts, such as limits and derivatives?

The identity is proved with the distributional (weak) derivative, i.e. by integration by parts against smooth, compactly supported test functions. Since ##x/|x|^2 = \nabla \ln|x|##, the result is equivalent to ##\Delta \ln|x| = 2 \pi \delta_{(0,0)}## in two dimensions.
