First off, I'm sorry, but I am LaTeX-illiterate.

## Homework Statement

An infinitely long conducting cylindrical shell has an inner radius $A$, an outer radius $B$, and carries a non-uniform current density $J = 2ar\sin(\omega t)$, where $a$ and $\omega$ are constants.

What is the magnitude of the induced electric field at a distance $r > B$ from the central axis of the cylindrical shell?

## Homework Equations

Ampere's Law
$\oint {\vec{B} \cdot d\vec{s}} = \mu _0 I_C$

Faraday's Law
$\oint {\vec{E} \cdot d\vec{s}} = - \frac{d\Phi_B}{dt}$

## The Attempt at a Solution

I had a similar question on a final exam and have to admit I was confused by it. Out of respect, I don't want to bother my professor with this question during Winter break.

I think I know how to solve this mechanically. First I'd integrate the current density $J$ from radius $A$ to radius $B$, making sure to use $2\pi r\,dr$ as the area element, to find the total current. Then I'd put this result into Ampere's law to find the magnitude of the magnetic field. Next I'd multiply this field by an area, take the time derivative, and divide by the length of the closed curve on which the electric field lies. All of this relies on the symmetry of the setup.
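Here's a sketch of that first integration step as I understand it (assuming the current flows along the axis and $J$ depends only on $r$):

```latex
\begin{align*}
I(t) &= \int_A^B J \, 2\pi r \, dr
      = \int_A^B 2ar\sin(\omega t)\, 2\pi r \, dr \\
     &= 4\pi a \sin(\omega t) \int_A^B r^2 \, dr
      = \frac{4\pi a}{3}\left(B^3 - A^3\right)\sin(\omega t)
\end{align*}
```

If that's right, this $I(t)$ is what I'd plug into Ampere's law as $I_C$.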

My question is this: according to Faraday's law, field lines have to pass through the area bounded by a closed loop for there to be a flux through that area, and a changing flux induces an emf around the loop. In this example, aren't the closed loop and the magnetic field from Ampere's law in the same plane? How can there be any flux if the field lines are parallel to the plane of the loop?

In other words, to find the flux, would I multiply the magnetic field by $\pi r^2$? Is that right, is my original assumption right, or is neither correct?
