# Electric field of a sheet of charge

1. Apr 4, 2009

### bigevil

1. The problem statement, all variables and given/known data

(This is a truncated question.)

The electric field at a distance x along the axis from the centre of a circular sheet of charge of radius a and surface charge density $\sigma$ is

$$E = \frac{\sigma}{2 \epsilon_0} [1 - \frac{x}{\sqrt{x^2 + a^2}}]$$

Prove that for x > 0:

$$E = \frac{\sigma}{2\epsilon_0} \quad \text{when } x \ll a$$
$$E = \frac{Q}{4\pi \epsilon_0 x^2} \quad \text{when } x \gg a,$$

where $Q = \sigma \pi a^2$ is the total charge on the sheet.

In these limits the sheet behaves like an infinite sheet and a point charge, respectively, and I'm required to prove this mathematically.

3. The attempt at a solution

For the first case, I note that for x << a, x/a approaches 0. Factoring a out of the square root gives $x/\sqrt{x^2+a^2} = (x/a)/\sqrt{1 + x^2/a^2} \rightarrow 0$, which leaves the answer required.

However, for the second case I try the same thing: for x >> a, a/x approaches 0, but then the expression for E reduces to E = 0. I've tried several methods and obtained the same thing. Can someone help?

2. Apr 4, 2009

### rl.bhat

You can write

$$E = \frac{\sigma}{2\epsilon_0} \cdot \frac{\sqrt{x^2+a^2} - x}{\sqrt{x^2+a^2}}$$

Multiply and divide by $\sqrt{x^2+a^2} + x$ and simplify. Then neglect $a^2$ compared with $x^2$.
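Carrying that hint through (a sketch of the algebra, taking the total charge to be $Q = \sigma \pi a^2$):

$$E = \frac{\sigma}{2\epsilon_0} \cdot \frac{\left(\sqrt{x^2+a^2} - x\right)\left(\sqrt{x^2+a^2} + x\right)}{\sqrt{x^2+a^2}\left(\sqrt{x^2+a^2} + x\right)} = \frac{\sigma}{2\epsilon_0} \cdot \frac{a^2}{\sqrt{x^2+a^2}\left(\sqrt{x^2+a^2} + x\right)}$$

For $x \gg a$, each square root tends to $x$, so the denominator tends to $2x^2$:

$$E \approx \frac{\sigma}{2\epsilon_0} \cdot \frac{a^2}{2x^2} = \frac{\sigma \pi a^2}{4\pi\epsilon_0 x^2} = \frac{Q}{4\pi\epsilon_0 x^2}$$

The naive step of sending a/x to 0 inside the original bracket kills the leading term along with the correction; the conjugate trick isolates the $a^2/x^2$ piece before the limit is taken.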
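As a quick numerical sanity check (a sketch; the function names and sample values are my own, not from the thread), the exact on-axis expression approaches both limiting forms:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def E_exact(x, a, sigma):
    # On-axis field of a uniformly charged disc of radius a
    return sigma / (2 * EPS0) * (1 - x / math.sqrt(x**2 + a**2))

def E_sheet(sigma):
    # Infinite-sheet limit, valid for x << a
    return sigma / (2 * EPS0)

def E_point(x, a, sigma):
    # Point-charge limit, valid for x >> a, with Q = sigma * pi * a^2
    Q = sigma * math.pi * a**2
    return Q / (4 * math.pi * EPS0 * x**2)

sigma, a = 1e-6, 1.0
print(E_exact(1e-4, a, sigma) / E_sheet(sigma))         # ratio ~ 1 for x << a
print(E_exact(1e3, a, sigma) / E_point(1e3, a, sigma))  # ratio ~ 1 for x >> a
```

Both ratios come out within a fraction of a percent of 1, confirming the two limits.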