1. The problem statement, all variables and given/known data

I need to show that the first equation becomes the last. First substitute in for r1 and r2, and then use the binomial theorem to expand to first order in d. Then use the assumption that d << r1 and d << r2 to show it reduces to the last equation. **In the first equation it is NOT r3, it should be r2 — sorry**

http://img43.imageshack.us/img43/9427/fffgb.gif [Broken]

2. Relevant equations

Binomial theorem. I forgot to say: d is the absolute value of r2 − r1.

3. The attempt at a solution

I know how to apply the binomial theorem. My first instinct was to plug the vectors r1 and r2 into the first equation and then expand the denominators using the binomial theorem. But the denominators are magnitudes cubed, so I am not sure that makes sense. I don't want anyone to just give me the answer, because I want to figure it out myself. But I need a hint on how to get this into a form where I can apply the binomial theorem and then ignore the higher-order terms in d, since d is very small.
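Since the linked image is broken, the exact first equation is not visible here; but as a generic sketch of how an inverse-cube magnitude can be put into binomial-theorem form (the vectors r and d below are placeholders, not necessarily the ones in the actual problem), one can write the magnitude as a dot product raised to the −3/2 power and expand:

```latex
% Hypothetical form: suppose one denominator is |\mathbf{r}-\mathbf{d}|^3
% with d \ll r. Write the magnitude cubed as (dot product)^{3/2}:
\[
  \frac{1}{|\mathbf{r}-\mathbf{d}|^{3}}
  = \bigl[(\mathbf{r}-\mathbf{d})\cdot(\mathbf{r}-\mathbf{d})\bigr]^{-3/2}
  = \bigl(r^{2} - 2\,\mathbf{r}\cdot\mathbf{d} + d^{2}\bigr)^{-3/2}
\]
% Factor out r^2 and apply the binomial expansion
% (1+x)^{-3/2} \approx 1 - \tfrac{3}{2}x for small x:
\[
  = r^{-3}\Bigl(1 - \frac{2\,\mathbf{r}\cdot\mathbf{d}}{r^{2}}
                 + \frac{d^{2}}{r^{2}}\Bigr)^{-3/2}
  \approx r^{-3}\Bigl(1 + \frac{3\,\mathbf{r}\cdot\mathbf{d}}{r^{2}}\Bigr),
\]
% where the d^2/r^2 term is dropped because it is second order in d.
```

The key step is treating the cubed magnitude as (r · r)^{3/2}, so the binomial theorem applies with exponent −3/2 rather than to the cube directly.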