shimniok
Hope this is in the right forum. I apologize in advance for my ignorance and imprecise terminology; I'm at a major disadvantage, lacking the rich mathematical background enjoyed by most here.
Background: I'm curious about calibrating out soft-iron distortion in a magnetometer. If I take a number of readings from a 2-axis magnetometer, the locus of readings forms a rotated ellipse. My understanding from a Freescale publication (pdf) is that calibration involves scaling along the major axis of this ellipse. I believe the correct term is directional scaling?
I've done some reading on using a matrix to scale along the x and y axes, and some extensive searching for any hint as to how one might scale along an arbitrary direction. Or are there better approaches?
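For concreteness, here is my understanding of how axis-aligned scaling generalizes to an arbitrary direction: rotate that direction onto the x axis, scale, and rotate back, i.e. M = R · diag(s, 1) · Rᵀ. A minimal numpy sketch (the function name and the example angle/scale are just mine for illustration):

```python
import numpy as np

def directional_scale(theta, s):
    """Scale by factor s along the direction at angle theta (radians),
    leaving the perpendicular direction untouched: M = R @ diag(s, 1) @ R.T"""
    c, n = np.cos(theta), np.sin(theta)
    R = np.array([[c, -n],
                  [n,  c]])          # rotation by theta
    return R @ np.diag([s, 1.0]) @ R.T

# squash a point lying on the 45-degree line by half along that line
M = directional_scale(np.radians(45), 0.5)
p = M @ np.array([1.0, 1.0])        # -> [0.5, 0.5]
q = M @ np.array([1.0, -1.0])       # perpendicular direction is unchanged: [1, -1]
```

The resulting matrix is symmetric, which (if I understand the Freescale note correctly) is exactly the form a pure soft-iron correction should take.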
Ultimately my thought is this: a theoretical, perfectly calibrated magnetometer would have a locus of measurements that is a perfect circle. So I wonder if I can use some kind of non-linear programming approach to find the scaling that comes closest to a circle, by summing the squared differences at each point. The idea itself may be flawed, but it seems at least vaguely similar to a technique described in a paper (pdf) I found recently.
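To sketch the shape of what I'm imagining (this is my own hypothetical numpy example, not the method from the paper): instead of iterating, it seems one can fit the ellipse a·x² + b·xy + c·y² = 1 to the readings by ordinary linear least squares, and the matrix square root of the conic matrix then maps the locus back to a unit circle.

```python
import numpy as np

def soft_iron_correction(pts):
    """Fit a*x^2 + b*x*y + c*y^2 = 1 to the readings by linear least
    squares, then build the matrix that maps the ellipse to a unit circle."""
    x, y = pts[:, 0], pts[:, 1]
    D = np.column_stack([x * x, x * y, y * y])
    a, b, c = np.linalg.lstsq(D, np.ones(len(pts)), rcond=None)[0]
    A = np.array([[a, b / 2],
                  [b / 2, c]])           # conic matrix: p^T A p = 1
    w, V = np.linalg.eigh(A)             # eigen-decomposition of symmetric A
    return V @ np.diag(np.sqrt(w)) @ V.T # A^(1/2): a directional scaling

# synthetic "soft-iron" data: unit circle stretched x2 along a 30-degree axis
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
phi = np.radians(30)
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])
pts = circle @ (R @ np.diag([2.0, 1.0]) @ R.T).T

M = soft_iron_correction(pts)
corrected = pts @ M.T                    # radii of corrected points are all ~1
```

Whether this linear fit or a non-linear sum-of-squares minimization is the better-behaved approach on noisy data is exactly the sort of thing I'm hoping someone can point me at.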
I don't necessarily need the answer spoon-fed, but I would greatly appreciate some pointers in the right direction so my searches will be more fruitful. Many thanks in advance.