# Potential difference/cone

bigplanet401
Hi,

A conical surface (an empty ice-cream cone) carries a uniform surface charge $$\sigma$$. The height of the cone is h, and the radius of the top is R. Find the potential difference between points a (the vertex) and b (the center of the top).

I've tried integrating over the conical surface (zenith $$\phi$$ fixed):

$$V(\mathbf{r}) = \frac{1}{4\pi\epsilon_0} \int d\mathbf{a} \frac{\sigma(\mathbf{r}^\prime)}{|\mathbf{r} - \mathbf{r}^\prime|} \quad \rightarrow \quad \frac{1}{4\pi\epsilon_0} \int r^{\prime 2} dr^\prime \, d\theta^\prime \frac{\sigma}{\sqrt{1 - r^{\prime 2} \cos^2 \phi}} \, ,$$

but I think that's wrong. Next I tried building up from a series of rings with charge density $$\lambda$$:

$$V_{\text{ring}} = \frac{\lambda}{2 \epsilon_0} \frac{R}{\sqrt{R^2 + z^2}} \, ;$$

unfortunately, I don't know how to set up the integration for this. Any help is appreciated, hopefully sooner rather than later--my written qualifier is ~3 weeks away!
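One way to set up the ring build-up (a sketch; the parametrization is mine): put the vertex at the origin with the axis along $$z$$. The ring at height $$z^\prime$$ has radius $$r^\prime(z^\prime) = (R/h)\,z^\prime$$, and a band of slant width $$d\ell = \sqrt{1+(R/h)^2}\,dz^\prime$$ carries charge $$dq = 2\pi\sigma\, r^\prime\, d\ell$$. Every point of that ring is equidistant from a field point on the axis at height $$z$$, so

$$V(z) = \frac{\sigma}{2\epsilon_0}\sqrt{1+\left(\frac{R}{h}\right)^2} \int_0^h \frac{r^\prime(z^\prime)\, dz^\prime}{\sqrt{r^\prime(z^\prime)^2 + (z-z^\prime)^2}}\,, \qquad V_a = V(0), \quad V_b = V(h)\,.$$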

There shouldn't be any angular dependence in the integrand if you use the first method, so the integral as you have it set up now is wrong. For each area element $$r\,dr\,d\phi$$ in the xy plane, think about the corresponding patch on the surface of the cone. Find the amount of charge on that patch (this will involve the slope of the cone) and its distance from the point where you want the potential. Then integrate over the disk in the xy plane that sits under the cone.
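To make that suggestion concrete, here is a quick numerical sanity check of the disk-projection setup (a sketch in Python; the function name `V_axis` and the choice of units $$\sigma/2\epsilon_0 = 1$$ are illustrative, not from the thread). For the common textbook case R = h, the difference works out analytically to $$V_a - V_b = \frac{\sigma h}{2\epsilon_0}\left[1 - \ln(1+\sqrt{2})\right]$$, which the check below reproduces.

```python
# Sanity check of the disk-projection method, in units where sigma/(2*eps0) = 1.
# The cone z = (h/R) r sits over the disk r <= R; the patch of cone above the
# area element r dr dphi has area r dr dphi * sqrt(1 + (h/R)^2).  For a field
# point on the axis there is no phi dependence, so the phi integral gives 2*pi.
import numpy as np

def V_axis(z, R, h, n=200_000):
    """Potential on the axis at height z above the vertex, by midpoint-rule
    integration over the disk (midpoints avoid the 0/0 at r = 0 when z = 0)."""
    slope_factor = np.sqrt(1.0 + (h / R) ** 2)   # cone patch area / disk area
    dr = R / n
    r = (np.arange(n) + 0.5) * dr                # midpoints in (0, R)
    dist = np.hypot(r, z - (h / R) * r)          # field point to cone patch
    return slope_factor * np.sum(r / dist) * dr  # in units of sigma/(2*eps0)

R = h = 1.0
dV = V_axis(0.0, R, h) - V_axis(h, R, h)         # V(a) - V(b)
print(dV)                                        # ~ 1 - ln(1 + sqrt(2)) = 0.1186...
```

Note this is the same integral as the ring build-up: substituting $$r = (R/h)z^\prime$$ maps one form into the other, which is a useful consistency check before the qualifier.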