How can I improve regression results by adjusting the goodness of fit metric?

In summary: the poster is fitting experimental line data with several constrained Gaussians using the mpfit algorithm, but is unsatisfied with the results because of background signals in the data. They are considering a different metric for computing goodness of fit, preferring a fit that matches a smaller number of pixels very well over a mediocre fit across all pixels, and ask whether this approach has been used before. They later found a similar method called least trimmed squares.
Khashishi
Science Advisor
I'm doing some line fitting on experimental data. Basically I have an array of pixels, a value measured at each pixel, and I am fitting it with several constrained Gaussians. I'm using a Levenberg-Marquardt nonlinear least squares algorithm called mpfit to fit the parameters, but the results aren't so good due to background signals in the data.

I'm thinking I could do a better fit using a different "metric" for computing goodness of fit than the sum of squared (chi-squared) errors. I want the difference between the model function and the data to be small over many pixels, but not necessarily all of the pixels. That is, I prefer a fit that matches 10 out of 30 pixels very well (but 20 pixels very poorly) over a fit that matches all 30 pixels in a mediocre way. Has anything like this been done before? I don't want to reinvent the wheel.
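The least trimmed squares (LTS) idea mentioned in the summary does exactly this: it minimizes the sum of the smallest h squared residuals and ignores the rest. A minimal sketch, assuming synthetic data and `scipy` rather than mpfit (the Gaussian model, data, and trimming fraction below are all illustrative, not taken from the thread), using the classic "concentration step" iteration of fit, keep the h best-fit pixels, refit:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data: one Gaussian line plus a localized background bump
# contaminating the first ~8 pixels (illustrative, not from the thread).
rng = np.random.default_rng(0)
x = np.linspace(0, 30, 31)
y = 5.0 * np.exp(-0.5 * ((x - 15.0) / 2.0) ** 2) + 0.05 * rng.normal(size=x.size)
y[:8] += 3.0  # background signal on a subset of pixels

def gauss(p, x):
    amp, mu, sig = p
    return amp * np.exp(-0.5 * ((x - mu) / sig) ** 2)

def lts_fit(x, y, p0, h, n_iter=20):
    """Least trimmed squares via concentration steps:
    fit, keep the h pixels with the smallest squared residuals, refit."""
    keep = np.arange(x.size)
    p = np.asarray(p0, float)
    for _ in range(n_iter):
        p = least_squares(lambda q: gauss(q, x[keep]) - y[keep], p).x
        r2 = (gauss(p, x) - y) ** 2
        new_keep = np.sort(np.argsort(r2)[:h])
        if np.array_equal(new_keep, keep):  # subset stable -> converged
            break
        keep = new_keep
    return p

# Fit only the 20 best-matching pixels out of 31.
p_lts = lts_fit(x, y, p0=[4.0, 14.0, 3.0], h=20)
```

The trimming count h sets the breakdown point: the fit tolerates up to (n - h) arbitrarily bad pixels, at the cost of discarding information when the data are actually clean.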
 

FAQ: How can I improve regression results by adjusting the goodness of fit metric?

1. What is non-least squares regression?

Non-least squares regression is regression that minimizes an objective other than the sum of squared residuals — for example the sum of absolute deviations, a robust loss such as Huber's, or a trimmed sum of squares. (This is distinct from nonlinear least squares, which still uses a squared-error objective but with a nonlinear model, as in the Gaussian fitting above.)

2. How is non-least squares regression different from least squares regression?

In least squares regression, the parameters are chosen to minimize the sum of squared errors between the observed data and the model's predictions. Non-least squares methods replace this objective with a different one, such as the sum of absolute errors or a trimmed sum of squares, which changes how strongly individual data points — especially outliers — influence the fit.
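The difference is easiest to see side by side. A toy sketch (synthetic data and a one-parameter linear model, purely illustrative) fitting the same model under two objectives:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: y = 3x plus Gaussian noise.
rng = np.random.default_rng(2)
x = np.linspace(0, 5, 40)
y = 3.0 * x + rng.normal(size=x.size)

def sse(p):  # sum of squared errors -> least squares
    return np.sum((p[0] * x - y) ** 2)

def sae(p):  # sum of absolute errors -> least absolute deviations
    return np.sum(np.abs(p[0] * x - y))

p_ls = minimize(sse, [1.0]).x                          # smooth objective
p_lad = minimize(sae, [1.0], method="Nelder-Mead").x   # non-smooth objective
```

Note the solver choice: the absolute-error objective is not differentiable at zero residual, so a derivative-free method like Nelder-Mead is the safer default there.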

3. When should non-least squares regression be used?

Non-least squares objectives are worth considering when the errors are not well described by independent Gaussian noise — for example when the data contain outliers or background contamination, or when heavy-tailed error distributions are expected. In such cases a squared-error objective lets a few bad points dominate the fit.
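For the outlier case specifically, `scipy.optimize.least_squares` ships robust losses that can be swapped in without changing the model. A small sketch (synthetic straight-line data with gross outliers; the model, data, and `f_scale` value are illustrative) comparing the plain and robust fits:

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative data: y = 2x + 1 with small noise, plus 5 gross outliers.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=x.size)
y[::10] += 15.0  # every 10th point badly contaminated

def resid(p):
    return p[0] * x + p[1] - y

fit_ls = least_squares(resid, [1.0, 0.0])  # plain squared-error objective
fit_rob = least_squares(resid, [1.0, 0.0],
                        loss="soft_l1",    # robust loss: quadratic for small
                        f_scale=0.5)       # residuals, linear for large ones
```

Here `f_scale` sets the residual size at which the loss transitions from quadratic to linear; residuals well below it are treated as in ordinary least squares, while the outliers' influence is sharply reduced.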

4. What are some advantages of non-least squares regression?

Non-least squares regression allows more flexibility in how misfit is penalized. In particular, robust and trimmed objectives are far less sensitive to outliers and non-Gaussian noise than ordinary least squares, so a small fraction of contaminated points need not corrupt the whole fit.

5. Are there any limitations to non-least squares regression?

One limitation is computational cost: robust and trimmed objectives are often non-smooth or non-convex, so they can be harder and slower to optimize than least squares, and may converge to local minima. The estimates are also less statistically efficient than least squares when the noise really is Gaussian, and the results can be harder to interpret, since choices such as the trimming fraction or loss scale must themselves be justified.
