1. The problem statement, all variables and given/known data

Eric looks at his compass to check the direction to north. At a point 2.0 meters below Eric, a cable runs from south to north, carrying a current of 50 amperes. This current deflects the compass needle. How large is the angular deviation if the horizontal component of Earth's magnetic field is 15*10^-6 tesla?

2. Relevant equations

We know the distance between the compass and the cable; since I don't know which formula applies yet, I will just call it a = 2.0 meters. Current: I = 50 A. Horizontal component of Earth's magnetic field: 15*10^-6 T. An equation that could help: B = k * I/a, where k is a constant, 2.0*10^-7 Tm/A.

3. The attempt at a solution

He is standing directly above the cable, which points north. We know the horizontal component of Earth's magnetic field, but wouldn't that mean the compass needle points to the right or left (since it is a horizontal component), and therefore that he isn't facing north? Let's set that aside and still attempt some sort of solution.

Using the formula we have, B = k * I/a = 2.0*10^-7 * (50/2.0) = 2.0*10^-7 * 25 = 5*10^-6 T. So the vertical component is that. Now, to calculate the angle between the vertical component and the resultant vector (the "hypotenuse"), we use tan^-1((15*10^-6)/(5*10^-6)) and get 71.57 degrees. So the total deviation caused by the cable is 71.57 degrees. I strongly doubt this, since a lot had to be assumed. Where did I go wrong?
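To make the arithmetic in my attempt easy to check, here is a short Python sketch that reproduces exactly the numbers above. It only verifies the computation B = k*I/a and the tan^-1 step as I set them up; the physical setup itself (which field is vertical vs. horizontal, and which ratio goes into the arctangent) may still be wrong, which is exactly what I'm asking about.

```python
import math

# Given values from the problem statement
k = 2.0e-7         # T*m/A, the constant in B = k * I / a
I = 50.0           # A, current in the cable
a = 2.0            # m, distance from compass to cable
B_earth_h = 15e-6  # T, horizontal component of Earth's field

# Field from the cable at the compass, same arithmetic as in my attempt
B_cable = k * I / a  # = 5e-6 T

# Angle as computed in my attempt: tan^-1(B_earth_h / B_cable)
angle_deg = math.degrees(math.atan(B_earth_h / B_cable))

print(B_cable)    # 5e-06
print(angle_deg)  # ~71.57
```

If instead the deflection should be tan^-1(B_cable / B_earth_h), the same numbers would give tan^-1(1/3), roughly 18.4 degrees, so the choice of ratio changes the answer a lot.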