I'm doing some line fitting on experimental data. Basically I have an array of pixels, a value measured at each pixel, and I'm fitting it with several constrained Gaussians. I'm using a Levenberg-Marquardt nonlinear least squares algorithm called mpfit to fit the parameters, but the results aren't so good because of background signals in the data.
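For context, my current setup looks roughly like the sketch below. This isn't mpfit's actual interface; I'm using scipy.optimize.least_squares with the Levenberg-Marquardt method as a stand-in, and the data, parameter layout, and initial guess are just illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def multi_gaussian(params, x):
    """Sum of Gaussians; params = [amp1, cen1, sig1, amp2, cen2, sig2, ...]."""
    model = np.zeros_like(x, dtype=float)
    for amp, cen, sig in params.reshape(-1, 3):
        model += amp * np.exp(-0.5 * ((x - cen) / sig) ** 2)
    return model

def residuals(params, x, y, err):
    # Weighted residuals: the standard chi-squared fit minimizes the sum of their squares.
    return (y - multi_gaussian(params, x)) / err

# Illustrative data: 30 pixels, one Gaussian plus noise (and, in my real data, background).
x = np.arange(30, dtype=float)
rng = np.random.default_rng(0)
y = 5.0 * np.exp(-0.5 * ((x - 10.0) / 2.0) ** 2) + rng.normal(0.0, 0.3, x.size)
err = np.full_like(x, 0.3)

p0 = np.array([4.0, 9.0, 1.5])  # initial guess: one Gaussian (amp, center, sigma)
fit = least_squares(residuals, p0, args=(x, y, err), method="lm")
print(fit.x)
```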
I'm thinking I could get a better fit by using a different "metric" for goodness of fit than the sum of chi-squared errors. I want the difference between the model function and the data to be small over many pixels, but not necessarily all of them. That is, I'd prefer a fit that matches 10 out of 30 pixels very well (but the other 20 very poorly) over one that fits all 30 pixels in a mediocre way. Has anything like this been done before? I don't want to reinvent the wheel.
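For concreteness, here's a rough sketch of the kind of cost function I have in mind: a "trimmed" chi-squared that only counts the best-fitting pixels and ignores the rest. The keep_fraction parameter is something I made up for illustration, and since this isn't a plain sum of squares anymore I assume it couldn't be fed to mpfit directly as weighted residuals; it would need a general-purpose minimizer.

```python
import numpy as np

def trimmed_chi_squared(weighted_residuals, keep_fraction=1.0 / 3.0):
    """Sum of squared residuals over only the best-fitting pixels.
    keep_fraction=1/3 keeps roughly 10 of 30 pixels, as in the example above."""
    r2 = np.asarray(weighted_residuals) ** 2
    k = max(1, int(round(keep_fraction * r2.size)))
    return np.sort(r2)[:k].sum()

# Example: 10 pixels that fit very well and 20 that fit badly still score well.
good = np.random.default_rng(1).normal(0.0, 0.1, 10)  # small residuals
bad = np.random.default_rng(2).normal(0.0, 5.0, 20)   # large residuals
print(trimmed_chi_squared(np.concatenate([good, bad])))
```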