Please don't do the problem for me, but explain to me in detail each step I should take in my calculations and why (that's why I haven't included any numbers).
There's an infinite insulating cylinder at the origin with a uniform charge distribution (volume density rho is given). Let's say its radius is a. Some radius outward, there's a charged conducting cylindrical shell of some thickness (linear charge density lambda is given) that encloses the original cylinder. Its inner surface is at a distance (radius) b from the axis, and its outer surface at c.
I basically need to find V(c) - V(a), the potential difference between the surface of the insulator and the outer surface of the shell.
V(r) = V(r) - V(inf) = INT[E.dl] from inf to r (taking V(inf) = 0).
E = lambda / (2 pi e0 r)
Area of circle: pi r^2
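As a sanity check on the Gauss's-law setup (not the symbolic answer), the field magnitude for this geometry could be sketched in Python. Every numeric value below (a, b, c, rho, lam_shell) is a placeholder I made up purely for illustration, since the problem itself is symbolic:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity (F/m)

# Placeholder values -- the actual problem is symbolic
a, b, c = 0.01, 0.03, 0.05   # radii in meters
rho = 1e-6                   # volume charge density of the insulator (C/m^3)
lam_shell = 2e-9             # linear charge density of the conducting shell (C/m)

def E(r):
    """Field magnitude from Gauss's law: E = lambda_enclosed / (2 pi e0 r)."""
    if r < a:                # inside the insulator: only charge within radius r
        lam_enc = rho * math.pi * r**2
    elif r < b:              # between insulator and shell: all of the insulator
        lam_enc = rho * math.pi * a**2
    elif r < c:              # inside the conducting shell: field is zero
        return 0.0
    else:                    # outside everything: insulator plus shell charge
        lam_enc = rho * math.pi * a**2 + lam_shell
    return lam_enc / (2 * math.pi * EPS0 * r)
```

The branch structure mirrors the regions you'd draw a Gaussian cylinder through: only the enclosed charge per unit length changes from region to region.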
The Attempt at a Solution
So the potential specifically for infinite cylinders should be something like:
V(r) = lambda / (2 pi e0) INT[1/r dr] from inf to r.
Since rho can't be used directly in that formula, I'll have to convert it into a lambda by multiplying rho by the area of the cross-sectional circle of the cylinder, pi a^2. Then we can view the insulating cylinder as a thin wire at the origin (along the z-axis) with lambda = rho*pi*a^2.
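That rho-to-lambda conversion can be checked numerically: for any r > a, Gauss's law applied to the solid cylinder (enclosed charge per length = rho*pi*a^2) gives the same field as a line charge with lambda = rho*pi*a^2. A minimal sketch, with made-up numbers of my own:

```python
import math

EPS0 = 8.854e-12             # vacuum permittivity (F/m)
a, rho = 0.01, 1e-6          # made-up radius (m) and volume charge density (C/m^3)
r = 0.04                     # any point outside the cylinder (r > a)

# Field from Gauss's law applied to the solid cylinder (enclosed charge per length):
E_cylinder = (rho * math.pi * a**2) / (2 * math.pi * EPS0 * r)

# Field of the equivalent thin wire with lambda = rho * pi * a^2:
lam = rho * math.pi * a**2
E_line = lam / (2 * math.pi * EPS0 * r)

print(E_cylinder, E_line)    # the two agree for any r > a
```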
To find V(c) - V(a), we must do multiple integrals. V(c) involves
INT[E.dl]<from inf to c>,
while V(a) involves
INT[E.dl]<from inf to c> + INT[E.dl]<from c to b> + INT[E.dl]<from b to a>.
INT[E.dl]<from c to b> should be 0 because the E-field within the conductor is 0.
So V(c) - V(a) = -INT[E.dl]<from b to a>
= -lambda / (2 pi e0) INT[1/r dr] from b to a, where lambda, as we found, is rho*pi*a^2.
= -(rho*a^2)/(2 e0) * (ln(a)-ln(b))
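One way to debug a symbolic result like this (again with placeholder numbers of my own, since the problem gives none) is to evaluate V(c) - V(a) = -INT[E dr] from a to c numerically, with E = 0 inside the conductor, and compare it against the expression above:

```python
import math

EPS0 = 8.854e-12              # vacuum permittivity (F/m)
a, b, c = 0.01, 0.03, 0.05    # placeholder radii (m)
rho = 1e-6                    # placeholder volume charge density (C/m^3)

def E(r):
    """Field between the insulator and the shell; zero inside the conductor."""
    if a <= r < b:
        return (rho * math.pi * a**2) / (2 * math.pi * EPS0 * r)
    return 0.0                # b <= r <= c: conductor interior

# Numerically evaluate V(c) - V(a) = -INT[E dr] from a to c (trapezoid rule)
n = 100_000
h = (c - a) / n
V_diff = -sum(0.5 * (E(a + i * h) + E(a + (i + 1) * h)) * h for i in range(n))

# The symbolic attempt above, with the same placeholder numbers plugged in:
attempt = -(rho * a**2) / (2 * EPS0) * (math.log(a) - math.log(b))

print(V_diff, attempt)        # a sign mismatch here points at where the attempt went wrong
```

Checking the sign this way is cheap: for a positive rho, the potential should drop as you move outward, which constrains the sign of V(c) - V(a) before you trust any logarithm algebra.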
...but apparently I'm wrong D: What did I do wrong, how should I have done it, and why?