Hi. Any help on this problem would be greatly appreciated. Here is the problem:

A) Determine the magnetic field between two long straight wires at a point 0.500 cm from the left wire when the two wires are 3.00 cm apart, in terms of the current I1 (left wire), when the other (right wire) carries I2 = 10.0 A. Assume the currents are in the same direction and draw a picture. Use a + to indicate into the plane of the page and a - to indicate out of the plane of the page.

B) Repeat, but with the currents in opposite directions, and draw a picture.

C) If I1 = 15.0 A, what value is obtained for the magnetic field in part (A)? What value is obtained for the magnetic field in part (B)?

D) At what position will the magnetic field from (C) be equal to zero?

I was using the equation B_T = (μ0 I1)/(2π r1) + (μ0 I2)/(2π r2).

For part B, would it be B_T = (μ0 I1)/(2π r1) - (μ0 I2)/(2π r2), since the currents are in opposite directions? Or am I mixing up which part should be added and which should be subtracted?

I also always get confused when figuring out whether the field points into the plane of the page or out of the plane of the page. Is there an easy way to remember that?

And I don't have a clue how to figure out part D. Do I use the same equation? If so, how do I manipulate it, since there are two wires (two distances)?

Thank you so much to anyone who is willing to help. I really appreciate it.
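In case it helps with checking the arithmetic, here is a quick numerical sketch of the superposition formula from the post, B = μ0 I / (2π r) for each wire, plugged in with the part (C) numbers (point 0.500 cm from the left wire, wires 3.00 cm apart, I1 = 15.0 A, I2 = 10.0 A). This only computes magnitudes; whether the two contributions add or cancel at the point still depends on the current directions via the right-hand rule, so both signed combinations are shown rather than asserting which one is correct for each part.

```python
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, in T*m/A

def wire_field(current, distance):
    """Field magnitude a distance r from a long straight wire: mu0*I/(2*pi*r)."""
    return MU0 * current / (2 * math.pi * distance)

# Part (C) numbers, converted to SI units (metres, amperes)
r1 = 0.00500          # distance from the point to the left wire
r2 = 0.0300 - r1      # distance to the right wire
I1, I2 = 15.0, 10.0

B1 = wire_field(I1, r1)   # contribution of the left wire
B2 = wire_field(I2, r2)   # contribution of the right wire

# Superposition: the total field is the signed sum of the two contributions.
# Which sign applies between the wires depends on the current directions
# (right-hand rule), so both combinations are printed.
print(B1 + B2)  # contributions pointing the same way
print(B1 - B2)  # contributions pointing opposite ways
```

Running this gives B1 = 6.0e-4 T and B2 = 8.0e-5 T, so the two candidate totals are 6.8e-4 T and 5.2e-4 T, depending on the directions.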