Has anyone read Daniel Fleisch's book 'A Student's Guide to Maxwell's Equations'? I'm having some trouble with Chapter 1, page 36, where he discusses the divergence of the electric field of a point charge. Apparently the divergence of the field is zero everywhere except at the charge itself, because the spreading out of the field lines (as they get farther from the origin) is exactly compensated by the 1/r^2 falloff in the field's magnitude. I don't really understand this. When I picture it, the field lines spread out, so surely the field must be diverging? How does the decreased amplitude help here?
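
For what it's worth, I can follow the algebra that I think he's summarizing (this is just the standard spherical-coordinates identity, assuming the field is purely radial, not a quote from the book):

$$\nabla \cdot \mathbf{E} = \frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2 E_r\right)$$

With the Coulomb field $E_r = \dfrac{q}{4\pi\varepsilon_0 r^2}$, the product $r^2 E_r = \dfrac{q}{4\pi\varepsilon_0}$ is a constant, so

$$\nabla \cdot \mathbf{E} = \frac{1}{r^2}\frac{\partial}{\partial r}\left(\frac{q}{4\pi\varepsilon_0}\right) = 0 \qquad (r \neq 0).$$

So the $r^2$ factor (the geometric spreading) cancels the $1/r^2$ amplitude exactly, and the expression only breaks down at $r = 0$, where the charge actually sits. I can see that the math works out, but I still can't square it with the mental picture of the lines fanning apart.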