# Homework Help: Poisson equation in R with a source at the origin

1. Oct 7, 2015

### S. Moger

1. The problem statement, all variables and given/known data

Solve the Poisson equation on $\mathbb{R}$ with a source at $x=0$.

3. The attempt at a solution

I haven't done this kind of thing in years, so I'm a bit rusty, but I think this is what is being asked:

$\Delta \phi = - \rho \delta(x)$ (Edit: no wait, I need an integral here).

It doesn't seem like a big deal. I just integrate both sides over $\mathbb{R}$.

$\frac{\partial\phi}{\partial x} = - \rho \int \delta(x) dx + A$

And again
$\phi = - \rho x + Ax + B$

However, the answer key wants me to realize that the answer really is this (they seem to use $q$ instead of $\rho$):
$\phi = - \frac{1}{2} q |x| + Ax + B$

But why? Am I doing some illegal operation here?

2. Oct 7, 2015

### andrewkirk

I think the problem is that your integrals are indefinite, but you are treating them like definite integrals.
When you integrate the first time, you are treating the result of integrating the delta function as a constant. But it's not; it's a step function.

Note also that if you differentiate your formula twice, you get zero, not a Dirac delta. So it's not a solution.

I suggest using explicit definite integrals by writing as follows:

$$\Delta \phi = - \rho \delta(x)$$
then choose $a<0$ and do
$$\int_a^x\Delta\phi(u)\,du= - \rho \int_a^x\delta(u)\,du$$

and see what happens. If you get a step function on the RHS, you know you're on the right track.
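To see what that step looks like, here is a rough numerical sketch (my own illustration, not part of the problem): the delta is approximated by a narrow Gaussian and integrated from $a=-1$ up to each $x$; the running integral should come out flat at 0 on the left and flat at 1 on the right.

```python
import numpy as np

# Approximate the Dirac delta by a narrow normalized Gaussian and
# integrate it from a = -1 up to each x. The running integral should
# look like a step (Heaviside) function, not a constant.
a, width = -1.0, 1e-3
x = np.linspace(a, 1.0, 200_001)
dx = x[1] - x[0]
delta_approx = np.exp(-x**2 / (2 * width**2)) / (width * np.sqrt(2 * np.pi))

running_integral = np.cumsum(delta_approx) * dx  # ~ integral_a^x delta(u) du

# Well left of the origin the integral is ~0, well right of it ~1.
print(running_integral[x < -0.5][-1])  # ≈ 0
print(running_integral[x > 0.5][0])    # ≈ 1
```

The width and grid spacing here are arbitrary; any sufficiently narrow normalized bump gives the same step-like picture.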

3. Oct 8, 2015

### S. Moger

Can I actually write it like this,

$\Delta \phi = - \rho \delta(x)$,

treating $\rho$ as a constant? Wouldn't it be more correct to write an integral containing the delta there, to avoid the infinity problem?

Anyway,

$\Delta \phi = - \rho \delta(x)$

$\int_a^x\Delta\phi(u)\,du= - \rho \int_a^x\delta(u)\,du$ and a<0

$\frac{\partial \phi}{\partial x}(x) - \frac{\partial \phi}{\partial x}(a) = - \rho \Theta(x)$ (please correct me if I'm wrong).

I believe the second term on the LHS is a constant, because $\phi$ should only depend on $x$. I'll call it $A$.

$\int_b^x\partial_x\phi(u)\,du= - \rho \int_b^x \Theta(u)\,du + A(x-b)$ and b<0

$\phi(x) = - \rho x \Theta(x) + Ax + B$

And now do I approach 0 with the upper limit from $a>0$, and combine the solutions? Does the half arise from the overlap?
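As a sanity check on the solution above (a rough sketch of my own; the values of $\rho$, $A$, $B$ are arbitrary, and NumPy's `heaviside` with second argument 0.5 stands in for $\Theta$): the slope of $\phi(x) = -\rho x \Theta(x) + Ax + B$ should jump by exactly $-\rho$ across $x=0$, which is what a $-\rho\,\delta(x)$ second derivative requires.

```python
import numpy as np

# phi(x) = -rho*x*Theta(x) + A*x + B: its slope should jump by -rho at x = 0.
rho, A, B = 2.0, 0.3, 1.0

def phi(x):
    return -rho * x * np.heaviside(x, 0.5) + A * x + B

h = 1e-6
slope_right = (phi(2 * h) - phi(h)) / h     # phi'(0+)
slope_left = (phi(-h) - phi(-2 * h)) / h    # phi'(0-)

print(slope_right - slope_left)  # ≈ -rho
```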

4. Oct 8, 2015

### andrewkirk

There's no need to do that, as $a$ has effectively disappeared from your equation, so there's no dependence on $a$. Although $A$ and $B$ may be functions of $a$, they are just arbitrary integration constants, because the function is still a solution if you add any $Cx+D$ to it.

However, what do you mean by $\Theta(x)$, and why are you treating it as a constant when you integrate it? You need to be more explicit there in order to ensure validity.

5. Oct 12, 2015

### S. Moger

Here, $\Theta(x)$ is meant to be the Heaviside step function. When I integrate the delta from $a$ to $x$, there are three cases: $x < a$, $a \leq x < 0$, and $a < 0 \leq x$. For $x$ less than 0 I should get zero, and for $x$ greater than or equal to zero I should get 1, by the definition, since 0 is inside the interval (with a possible exception exactly at $x=0$). I think that simplifies to the Heaviside step function.

Then to find its primitive function I visualize it in much the same way: $x$ below zero should return zero, and $x$ above zero should give 1 times $x$. To combine both parts I multiply the Heaviside step function by $x$ (i.e. zero below zero and $x$ above zero).

But there must be some error in my reasoning here, as that doesn't correspond to the right answer.

6. Oct 12, 2015

### Ray Vickson

If $\Delta \phi$ means $d^2 \phi(x)/dx^2$, then you just have the equation for the Green's function of the differential operator $(d/dx)^2$. If you solve $f''(x) = \delta(x)$, you can re-scale to get your function $\phi(x)$.

The equation $f''(x) = \delta(x)$ gives $f''(x) = 0$ for $x < 0$ and for $x > 0$. Assuming $f$ is continuous at $x = 0$, we can integrate both sides of the DE from $-\epsilon$ to $+\epsilon$, then take the limit as $\epsilon \to 0+$. This gives
$$\lim_{\epsilon \to 0+} \left[ f'(\epsilon) - f'(-\epsilon) \right] = \lim_{\epsilon \to 0+} \int_{-\epsilon}^{\epsilon} \delta(x) \, dx ,$$
or
$$f'(0+) - f'(0-) = 1.$$
So, if
$$f(x) = \begin{cases} a_1 x + b_1, & x < 0\\ a_2 x + b_2, & x > 0 \end{cases}$$
we get conditions on $a_i,b_i$ by requiring that $f$ be continuous at $x=0$ and have a jump discontinuity of +1 in $f'(x)$ at $x = 0$.

Of course, that gives only two conditions on the four parameters, so that leaves lots of room for other conditions, such as boundary conditions at $\pm \infty$ and the like.
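The two conditions can be checked in the weak (distributional) sense: with $b_2 = b_1$ (continuity) and $a_2 = a_1 + 1$ (unit jump in $f'$), we should have $\int f(x)\, g''(x)\, dx = g(0)$ for any rapidly decaying test function $g$, since integrating by parts twice moves the derivatives onto $f$. A rough numerical sketch (the values of $a_1, b_1$ and the Gaussian test function are arbitrary choices of mine):

```python
import numpy as np

# Piecewise-linear f with b2 = b1 (continuity at 0) and a2 = a1 + 1
# (unit jump in f') should satisfy f'' = delta distributionally:
# integral of f * g'' equals g(0) for a rapidly decaying test function g.
a1, b1 = 0.7, -0.2
a2, b2 = a1 + 1.0, b1           # the two matching conditions

x = np.linspace(-10.0, 10.0, 400_001)
dx = x[1] - x[0]
f = np.where(x < 0, a1 * x + b1, a2 * x + b2)

g = np.exp(-x**2)                      # test function, g(0) = 1
g_dd = (4 * x**2 - 2) * np.exp(-x**2)  # its second derivative g''(x)

integral = np.sum(f * g_dd) * dx       # ≈ integral of f * g''
print(integral)  # ≈ g(0) = 1
```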

Last edited: Oct 12, 2015

7. Oct 12, 2015

### andrewkirk

Actually your answer is fine. Although it is far from obvious, it defines the same function as the book answer.

To see that, just equate the two functions, taking care to label the constants differently, like so:
$$- \frac{1}{2} q |x| + A'x + B' = - \rho x \Theta(x) + Ax + B$$
Can you solve this to find what values of $A',B', q$ (expressed in terms of $\rho, A, B$) will make the functions identical?

8. Oct 15, 2015

### S. Moger

Thanks

I can show it by evaluating the cases $x>0$ and $x<0$ separately: $B' = B$, and with $\rho = q$, $A = A' + \frac{1}{2} q$.

But how did the book arrive at that particular formulation? There must have been some other kind of approach to the problem.
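For what it's worth, the matching can also be confirmed numerically (a quick sketch; the constants $q$, $A'$, $B'$ are arbitrary illustrative values, and `np.heaviside(x, 0.5)` stands in for $\Theta$):

```python
import numpy as np

# With rho = q, A = A' + q/2, B = B', the book's |x| form and the
# Heaviside form should define the same function everywhere.
q = rho = 1.5
A_p, B_p = 0.4, -2.0            # book's constants A', B'
A, B = A_p + 0.5 * q, B_p       # matched constants

x = np.linspace(-3.0, 3.0, 1001)
book = -0.5 * q * np.abs(x) + A_p * x + B_p
mine = -rho * x * np.heaviside(x, 0.5) + A * x + B

print(np.max(np.abs(book - mine)))  # ≈ 0
```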

9. Oct 15, 2015

### andrewkirk

Not necessarily. It may just be that they wanted to use familiar notation. The absolute value function is introduced in early secondary school mathematics, whereas the Heaviside function is generally not encountered until uni. Indeed, I didn't recognise it when you used it.

Expressing it using the absolute value function is just a neat way to express it very compactly without having to use more advanced notation.

Having said that, I find your Heaviside expression of the function more intuitive, but maybe that's just me.