Modified least squares fitting?

In summary, the conversation revolves around a curve fitting/regression problem: how to make the fitted estimate of one variable have a smaller variance than the others. The poster is using a least squares algorithm in MATLAB and asks how to modify the design matrix to incorporate weighting factors for the variable of interest.
  • #1
nmf77
Hi, I wonder if someone could give me some guidance on this problem please. I'm not a mathematician and I'm not even sure what the title of this problem should be: curve fitting, regression, function minimization?

It started with something fairly simple: least squares fitting with 3 variables and 6 or more equations. I know how to cast that problem in matrix form and get the answer, but now I need to do something slightly more difficult. If I call the variables X, Y and Z, it turns out that X is much more important than the other two, so I want a solution where the variance in X is smaller than the other variances. How do I do that? In my head I'm picturing the problem geometrically. In the standard least squares approach I see the cost/loss function as spherical, i.e. the same in all directions. What I need is a cost function that is more like an ellipse, but I don't know how to do that. I suppose the least squares algorithm doesn't really implement a cost function as such; effectively there is one, but it is an innate feature of the algorithm and so can't be modified. Perhaps I need a whole new approach? I hope that makes some sense to someone!

Any advice appreciated, many thanks
Nick
 
  • #2
How are you implementing your least squares algorithm? Did you write it from scratch or are you using prewritten code in MATLAB or Excel or something?

Typically you just scale up the error in the variable that you care about. So if your prediction f(-) has errors of x, y and z in the X, Y and Z variables, you count those as Nx, y and z for some big number N and then run the rest of the algorithm using those errors.
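A minimal sketch of that scaling idea in MATLAB (the factor N and the residual values below are assumptions purely for illustration, not numbers from this thread):

% Scale the residual of the important variable before squaring it.
N = 100;                                  % large weighting factor (assumed)
rx = 0.30; ry = -0.10; rz = 0.20;         % hypothetical residuals in X, Y and Z
spherical_cost  = rx^2 + ry^2 + rz^2;         % standard least squares cost (same in all directions)
elliptical_cost = (N*rx)^2 + ry^2 + rz^2;     % weighted cost: errors in X dominate

Minimising the weighted cost forces the fit to keep the X residual small, at the expense of the Y and Z residuals.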
 
  • #3
Hi, thanks for the reply. I'm using a MATLAB script. I wrote the script, but it's a straight lift from textbook least squares in matrix form.

Basically it's this:

The observations are in vector b, A is the 'design matrix', and x is the vector of fitting variables.

So b = Ax + v (v contains the errors/residuals), and x is computed as

x = inv(A'*A) * A' * b    (prime = transpose)

I think what you're suggesting is that I manipulate the design matrix with some weighting factors. Is that correct?
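One common way to fold such weighting factors into the matrix form quoted above is weighted least squares: rather than altering A itself, a diagonal weight matrix W is inserted into the normal equations. A minimal MATLAB sketch, where A, b and the weights are placeholders invented here for illustration:

% Weighted least squares in the same matrix notation as the script above.
A = [1 0 0; 0 1 0; 0 0 1; 1 1 0; 0 1 1; 1 0 1];   % 6 equations, 3 unknowns (X, Y, Z); illustrative only
b = [1.1; 0.9; 2.1; 2.0; 3.1; 3.0];               % illustrative observations
w = [100; 1; 1; 100; 1; 100];                      % up-weight the equations that involve X (assumed)
W = diag(w);
x_wls = (A'*W*A) \ (A'*W*b);   % weighted normal equations (backslash solves the system)
x_ols = (A'*A) \ (A'*b);       % ordinary least squares, as in the script above, for comparison

Whether row weighting like this actually shrinks the uncertainty in X for a given problem is best checked empirically, for example by comparing the spread of x_wls and x_ols over repeated runs with simulated noise.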
 

What is modified least squares fitting?

Modified least squares fitting is a statistical technique for finding the best-fit line or curve for a set of data points. It is a variation of the traditional least squares method, in which the errors are minimized by adjusting the parameters of the model being fit. It is commonly used in data analysis and regression to estimate the relationship between two or more variables.

How is modified least squares fitting different from traditional least squares fitting?

In traditional least squares fitting, the sum of squared errors is minimized by adjusting the parameters of the model. In modified least squares fitting, additional factors such as the number of parameters, the complexity of the model, and the amount of available data are also taken into account. This can produce a more accurate and reliable fit for the given data.
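One concrete example of such a modification (an illustration chosen here, not something named in the thread) is ridge, or Tikhonov, regularization, which adds a penalty on large parameter values to the least squares objective. A minimal MATLAB sketch with placeholder A and b:

% Ridge / Tikhonov regularization: account for model complexity via a parameter-size penalty.
A = [1 0 0; 0 1 0; 0 0 1; 1 1 0; 0 1 1; 1 0 1];   % illustrative design matrix
b = [1.1; 0.9; 2.1; 2.0; 3.1; 3.0];               % illustrative observations
lambda = 0.1;                                      % regularization strength (assumed)
x_ridge = (A'*A + lambda*eye(size(A,2))) \ (A'*b); % shrinks the fitted parameters toward zero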

What are the advantages of using modified least squares fitting?

Modified least squares fitting has several advantages over traditional least squares fitting. Because it accounts for the complexity of the model and the amount of available data, it can produce a more accurate and reliable fit. It also allows prior knowledge or assumptions about the relationship between the variables to be included. Additionally, it can handle outliers and missing data more gracefully than traditional least squares fitting.

What are the limitations of modified least squares fitting?

Like any statistical method, modified least squares fitting has its limitations. It requires a reasonably large amount of data to produce reliable results, which can be a challenge in some cases. It also assumes that the data follows a specific model, which may not always hold. Additionally, the method can be computationally intensive, especially for complex models with a large number of parameters.

When should modified least squares fitting be used?

Modified least squares fitting is a versatile method that can be applied in various fields, including science, engineering, and finance. It is useful when the relationship between two or more variables needs to be estimated and the usual assumptions of ordinary least squares (for example, equally weighted, well-behaved errors) do not hold. It is also helpful when there are known biases or outliers in the data that need to be accounted for. Overall, it should be considered whenever a plain least squares fit is not accurate or reliable enough for the problem at hand.
