downtownjapan
Hi everyone,
I am trying to calculate the decrease in magnetic field strength over distance, using a formula I found on this thread https://www.physicsforums.com/showthread.php?t=522223
The formula given was 1/r^3, where r is the distance from the source.
The same thread says the formula "gives you the magnetic field in Tesla, if you plug in the current in Ampere, the length in meter"
I want to calculate how the field of an electromagnet decreases at a point 0.5 mm directly above (at a 90 degree angle to) the center of one end of the iron core of the electromagnet (Point A shown in the image below).
BUT I must be doing something very, very wrong following this formula because the numbers I get don't make sense to me. I have clearly made a mistake applying the formula and I am wondering if someone could tell me where I have gone wrong.
I am trying to calculate how a field of 0.005 Tesla will decrease at a distance of 0.0005 meters away.
I have done the following calculation:
0.005 T / (0.0005 m)^3, which gives me
0.005 / 0.0005^3 = 40,000,000
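In case it helps anyone check my work, here is the exact calculation I did, written out as a short Python snippet (the 0.005 T and 0.0005 m values are just the numbers from above; I'm not claiming this is the right way to apply the formula, it's literally what I punched in):

```python
# My attempt, exactly as I did it by hand:
B = 0.005    # field strength I'm starting from, in tesla
r = 0.0005   # distance from the source, in meters (0.5 mm)

result = B / r**3
print(result)  # comes out around 40,000,000 -- clearly not a sensible field
```

So the code reproduces my pen-and-paper number, which tells me the arithmetic itself is fine and the mistake must be in how I'm using the formula.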
I have assumed the unit of the answer (when calculated correctly!) will be in Tesla.
Setting aside the obvious implausibility of a 40 million Tesla magnetic field, the result is always a higher number than I started with (0.005T), but I am trying to calculate how the field strength decreases.
Obviously I have gone horribly wrong, and I have a feeling I am making a very basic mistake somewhere, either in the calculation or in my understanding of the formula.
If any kind-hearted charitable soul out there wants to tell me how this poor fool has gone wrong, I would be very grateful!