gavman
Homework Statement
A hollow spherical shell carries charge density \rho=\frac{k}{r^2} in the region a \le r \le b, where a is the inner radius and b is the outer radius. Find the electric field in the region a < r < b.
I'm not allowed to use the integral form of Gauss's law; I must use the differential form.
Homework Equations
Relating charge density to the divergence of the electric field: \vec{\nabla}\cdot\vec{E}=\frac{\rho}{\epsilon_{0}}
The Attempt at a Solution
Using the integral method, I believe the electric field is \vec{E}(\vec{r})=\frac{k}{\epsilon_{0}}\frac{r-a}{r^2}\hat{r}
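For reference, here's the working that led me there: the charge enclosed within radius r is Q_{enc}=\int_a^r \frac{k}{r'^2}\,4\pi r'^2\,dr'=4\pi k(r-a), so \oint\vec{E}\cdot d\vec{a}=4\pi r^2 E_r=\frac{Q_{enc}}{\epsilon_{0}} gives the field above.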
I then decided that spherical coordinates are the way to go, and that the E-field has only an \hat{r} component (call it E_r), so the divergence reduces to
\vec{\nabla}\cdot\vec{E}=\frac{1}{r^2}\frac{\partial(r^2 E_r)}{\partial r}=\frac{k}{r^2\epsilon_{0}}
By inspection I decided that I should have \vec{E}(\vec{r})=\frac{k}{\epsilon_{0}r}\hat{r}. While this does satisfy \vec{\nabla}\cdot\vec{E}=\frac{\rho}{\epsilon_{0}}, it is not in agreement with the integral form.
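To convince myself I wasn't making an algebra mistake, I put together a quick symbolic check in Python with sympy (just my own sketch; div_radial is a helper name I made up) that tests both candidate fields against the differential equation:

```python
import sympy as sp

r, k, eps0, a = sp.symbols('r k epsilon_0 a', positive=True)

# Divergence of a purely radial field E = E_r(r) r_hat in spherical coordinates:
# div E = (1/r^2) d(r^2 E_r)/dr
def div_radial(E_r):
    return sp.diff(r**2 * E_r, r) / r**2

rho_over_eps0 = k / (eps0 * r**2)

# Candidate from the integral form of Gauss's law
E_integral = k * (r - a) / (eps0 * r**2)
# Candidate I found "by inspection" from the differential form
E_inspection = k / (eps0 * r)

print(sp.simplify(div_radial(E_integral) - rho_over_eps0))    # -> 0
print(sp.simplify(div_radial(E_inspection) - rho_over_eps0))  # -> 0
```

Both lines print 0, i.e. both fields satisfy \vec{\nabla}\cdot\vec{E}=\frac{\rho}{\epsilon_{0}}, yet they can't both be the field in the shell.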
So my question is: am I using these relationships incorrectly? I'm very confused, because when I integrate \vec{\nabla}\cdot\vec{E} from a to r I recover the same answer as the integral form of Gauss's law, but I don't see why that should be, since I understand that the volume integral of \vec{\nabla}\cdot\vec{E} corresponds to the charge enclosed by a Gaussian surface. And since I'm only interested in the E-field at a single point inside the shell, why would I need to integrate at all?
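(Explicitly, the integration I mean is \int_a^r\frac{\partial(r'^2 E_r)}{\partial r'}\,dr'=\int_a^r\frac{k}{\epsilon_{0}}\,dr'; using E_r(a)=0, since no charge is enclosed at the inner surface, this gives r^2 E_r=\frac{k(r-a)}{\epsilon_{0}}, which is the integral-form result.)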