1. The problem statement, all variables and given/known data

Consider the following pair of dipoles in one dimension. One dipole has its -ve charge positioned at (-D/2, 0) and its +ve charge at (+D/2, 0), with the origin between them. The other dipole has its -ve charge at (r - d/2, 0) and its +ve charge at (r + d/2, 0), with (r, 0) as its centre. Show that the total electrostatic interaction energy between the dipoles, when r is much greater than both d and D, is given by

V = -[1/(2*pi*epsilon_0)] * (QD)(qd) / r^3.

2. Relevant equations

3. The attempt at a solution

I have attempted this question by considering a point x between the dipoles and finding the electrostatic potential caused by each end of the dipoles. This left me with four terms in a summation, and I'm struggling to see how the correct answer comes out as a product. I believe some of the terms can be neglected in the limit r >> d, D (the squared terms, perhaps), but I'm not sure how to get to this step. I would appreciate a push in the right direction. Thanks in advance.
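As a sanity check on the target formula (not part of the original post), the sketch below sums the exact Coulomb interaction energy over the four charge pairs and compares it with the quoted large-r approximation. All numerical values (Q, q, D, d, r) are illustrative choices, not given in the problem.

```python
# Compare the exact four-term Coulomb sum with the large-r approximation
#   V ~ -(1/(2*pi*eps0)) * (Q*D) * (q*d) / r^3.
# Q, q, D, d, r below are illustrative values, not from the problem statement.
from math import pi

eps0 = 8.854187817e-12      # vacuum permittivity, F/m
k = 1.0 / (4 * pi * eps0)   # Coulomb constant

Q, q = 1e-9, 2e-9   # charge magnitudes (C), arbitrary
D, d = 1e-3, 2e-3   # charge separations within each dipole (m)
r = 1.0             # centre-to-centre distance (m), r >> D and d

# Dipole 1: -Q at -D/2, +Q at +D/2.  Dipole 2: -q at r-d/2, +q at r+d/2.
dipole1 = [(-Q, -D / 2), (+Q, +D / 2)]
dipole2 = [(-q, r - d / 2), (+q, r + d / 2)]

# Exact interaction energy: sum over the four pairs (one charge from each dipole).
V_exact = sum(k * c1 * c2 / abs(x2 - x1)
              for c1, x1 in dipole1
              for c2, x2 in dipole2)

# Quoted approximate result.
V_approx = -(1.0 / (2 * pi * eps0)) * (Q * D) * (q * d) / r**3

print(V_exact, V_approx)
```

With r three orders of magnitude larger than d and D, the two values agree to roughly one part in a million, which is consistent with the constant and 1/r^2 terms cancelling and the leading surviving contribution being the 1/r^3 (squared) term.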