I'm working my way through MIT 8.02x on EdX (an archived course, so it's a bit lonely in there right now!). The problem statement:

Two spherical conductors, A and B, are placed in vacuum. A has a radius rA = 25 cm and B a radius rB = 35 cm. The distance between the centers of the two spheres is d = 225 cm. A is at a potential of VA = 100 V and B at VB = -25 V. An electron is released from rest at B. What will its speed be when it reaches A?

I got the approved answer simply by assuming the change in kinetic energy equals the charge of the electron times the potential difference. But is it really that simple? Each sphere should have an uneven charge distribution because of induction: the field at the surface of sphere A closest to sphere B should be stronger than on the far side of A, and the same applies to sphere B. Doesn't that affect the analysis?
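For reference, here is the quick sanity check I did of the energy-conservation answer, KE = e·(VA − VB), solved for v (a non-relativistic sketch, which should be fine since the result comes out well below c):

```python
import math

e = 1.602176634e-19    # elementary charge (C)
m_e = 9.1093837015e-31 # electron mass (kg)

V_A = 100.0            # potential of sphere A (V)
V_B = -25.0            # potential of sphere B (V)

# The electron (charge -e) moves from B to A, so it gains
# kinetic energy e * (V_A - V_B) = e * 125 V.
KE = e * (V_A - V_B)
v = math.sqrt(2 * KE / m_e)

print(f"v = {v:.3e} m/s")  # about 6.6e6 m/s, roughly 2% of c
```

At about 2% of the speed of light, the non-relativistic kinetic energy formula is a safe approximation here.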