1. The problem statement, all variables and given/known data

A wire with radius $R = 0.01\ \mathrm{m}$ carries a current of uniform density. At a distance $r_1 = 1.25\ \mathrm{m}$ from the wire, the magnetic field has a magnitude of $2.55\ \mu\mathrm{T}$. Calculate the magnitude of the magnetic field at $r_2 = 0.0065\ \mathrm{m}$ from the wire.

2. Relevant equations

$$B = \frac{\mu_0 I}{2\pi r}$$

3. The attempt at a solution

First I wrote the field at $r_1$:

$$B_1 = \frac{\mu_0 I}{2\pi r_1}$$

Solving for $I$:

$$I = \frac{2\pi r_1 B_1}{\mu_0}$$

Plugging this into the equation for $B_2$:

$$B_2 = \frac{\mu_0}{2\pi r_2} \cdot \frac{2\pi r_1 B_1}{\mu_0} = \frac{B_1 r_1}{r_2} = 4.9 \times 10^{-4}\ \mathrm{T}$$

I'm not sure I did this right, since other people in my class got a different answer. Can someone corroborate this? Thanks.
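In case it helps anyone checking my numbers, here's a quick Python sanity check of the arithmetic above. It just plugs the given values into the same formula, so it assumes $B = \mu_0 I / (2\pi r)$ applies at both radii, exactly as in my attempt:

```python
import math

# Given values from the problem statement
mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
B1  = 2.55e-6              # T, field magnitude at r1
r1  = 1.25                 # m
r2  = 0.0065               # m

# Solve B1 = mu0*I/(2*pi*r1) for the current I
I = 2 * math.pi * r1 * B1 / mu0
print(f"I  = {I:.2f} A")       # -> I  = 15.94 A

# Then B2 = mu0*I/(2*pi*r2), which reduces to B1*r1/r2
B2 = B1 * r1 / r2
print(f"B2 = {B2:.2e} T")      # -> B2 = 4.90e-04 T
```

This reproduces my $4.9 \times 10^{-4}\ \mathrm{T}$, so at least the algebra and arithmetic are consistent with the formula I used.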