An electric dipole lies along the x-axis, symmetric about the y-axis: a charge -q sits a distance a in the -x direction and a charge +q sits a distance a in the +x direction, so the separation between the two charges is 2a. I'm supposed to show that the electric field at a distant point on the +x axis is E ~ (4*Ke*q*a)/(x^3). I set up E = (-Ke*q)/(x+a) + (Ke*q)/(x-a), which combines to E = (2*Ke*q*a)/(x^2 - a^2); for a distant point (x >> a) that reduces to approximately (2*Ke*q*a)/(x^2). That doesn't match the answer they're looking for. What am I doing wrong?
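For what it's worth, here's a quick numerical sanity check (a Python sketch; setting Ke = q = a = 1 is my own choice of units, not from the problem) showing that my expression and the expected one really do disagree at large x, by roughly a factor of x/2:

```python
# Compare my closed form with the book's at a distant point.
# Units: Ke = q = a = 1 for convenience (my choice, not from the book).
Ke, q, a = 1.0, 1.0, 1.0

def field_mine(x):
    # The expression I derived: 2*Ke*q*a / (x^2 - a^2)
    return 2 * Ke * q * a / (x**2 - a**2)

def field_book(x):
    # The expression the book wants: 4*Ke*q*a / x^3
    return 4 * Ke * q * a / x**3

x = 100.0  # a "distant" point, x >> a
print(field_mine(x))                  # ~2e-4
print(field_book(x))                  # 4e-6
print(field_mine(x) / field_book(x))  # ~50, i.e. ~x/2, so they can't both be right
```

So the mismatch isn't just a rounding issue in the far-field approximation; the two expressions even have different powers of x.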