Modified least squares fitting?

SUMMARY

The discussion centers on modifying the least squares fitting approach to account for variable importance, specifically prioritizing the variable X over Y and Z. The user, Nick, seeks to adjust the cost function from a spherical shape to an elliptical one to reflect the variance in X being smaller than in the other variables. A suggested solution involves scaling the errors associated with X by a factor N, effectively manipulating the design matrix in MATLAB to achieve the desired weighting. This approach allows for a customized least squares fitting that aligns with the user's specific requirements.

PREREQUISITES
  • Understanding of least squares fitting and regression analysis
  • Familiarity with MATLAB scripting and matrix operations
  • Knowledge of design matrices and error/residual calculations
  • Concept of cost functions and their geometric interpretations
NEXT STEPS
  • Research how to implement weighted least squares in MATLAB
  • Learn about elliptical cost functions and their applications in regression
  • Explore advanced regression techniques such as Ridge or Lasso regression
  • Investigate the effects of variable scaling on regression outcomes
USEFUL FOR

Mathematicians, data analysts, and engineers involved in statistical modeling and regression analysis, particularly those looking to customize least squares fitting methods in MATLAB.

nmf77
Hi, I wonder if someone could give me some guidance on this problem please. I'm not a mathematician and I'm not even sure what the title of this problem should be - curve fitting, regression, function minimization?

It started with something fairly simple: least squares fitting with 3 variables and 6 or more equations. I know how to cast that problem in matrix form and get the answer, but now I need to do something slightly more difficult. If I call the variables X, Y and Z, it turns out that X is much more important than the other two variables, so I want a solution where the variance in X is smaller than the other variances. How do I do that? In my head I'm picturing the problem geometrically. In the standard least squares approach I see the cost/loss function being spherical, i.e. the same in all directions. What I need is a cost function that is more like an ellipse, but I don't know how to do that. I suppose the least squares algorithm doesn't really implement a cost function as such - effectively there is a cost function, but it is an innate feature of the algorithm and so can't be modified. Perhaps I need a whole new approach? I hope that makes some sense to someone!

Any advice appreciated, many thanks
Nick
 
How are you implementing your least squares algorithm? Did you write it from scratch, or are you using prewritten code in MATLAB or Excel or something?

Typically you just scale up the error in the variable that you care about - so if your prediction f(-) has errors of x, y, and z in the X, Y and Z variables, you count them as Nx, y and z for some big number N, and then run the rest of the algorithm using those scaled errors.
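A quick numeric sketch of why this works (in Python rather than MATLAB, and with made-up numbers): counting a residual as N times larger before squaring is exactly the same as multiplying that equation's row of A and entry of b by N and then running ordinary least squares, so both give the same cost surface.

```python
# Tiny made-up system: 3 equations, 2 unknowns.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, 3.0, 5.1]
N = 10.0  # scale factor on the first equation's error

def cost_scaled_residual(x):
    # Residuals of Ax - b, with the first one counted N times as large.
    r = [sum(a * xi for a, xi in zip(row, x)) - bi for row, bi in zip(A, b)]
    r[0] *= N
    return sum(ri * ri for ri in r)

# Equivalent: scale the first row of A and first entry of b by N,
# then use the ordinary (unweighted) squared-error cost.
A2 = [[N * v for v in A[0]]] + A[1:]
b2 = [N * b[0]] + b[1:]

def cost_weighted_rows(x):
    r = [sum(a * xi for a, xi in zip(row, x)) - bi for row, bi in zip(A2, b2)]
    return sum(ri * ri for ri in r)

x_try = [2.0, 3.0]
# Prints the same cost twice (up to float rounding).
print(cost_scaled_residual(x_try), cost_weighted_rows(x_try))
```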
 
Hi, thanks for the reply. I'm using a MATLAB script. I wrote the script, but it's a straight lift from textbook least squares in matrix form.

Basically it's this:

The observations are in vector b, A is the 'design matrix', and x is the vector of fitting variables.

So b = Ax + v (v contains the errors/residuals), and x is computed as

x = inv(A'*A) * A' * b (where ' denotes transpose)

I think what you're suggesting is that I manipulate the design matrix with some weighting factors - is that correct?
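One compact way to write that manipulation is the weighted normal equations x = inv(A'*W*A) * A'*W*b, where W is a diagonal matrix of per-equation weights and you put a large weight on the equations you want fitted tightly. A sketch below, in pure Python standing in for the MATLAB one-liner (the system and weights are made up for illustration):

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def solve(M, y):
    """Gaussian elimination with partial pivoting for M x = y."""
    n = len(M)
    aug = [row[:] + [y[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(aug[r][c]))
        aug[c], aug[p] = aug[p], aug[c]
        for r in range(c + 1, n):
            f = aug[r][c] / aug[c][c]
            for k in range(c, n + 1):
                aug[r][k] -= f * aug[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (aug[r][n] - sum(aug[r][k] * x[k] for k in range(r + 1, n))) / aug[r][r]
    return x

def weighted_lstsq(A, b, w):
    """Minimise sum_i w_i*(A_i.x - b_i)^2 via x = inv(A'WA) * A'W b."""
    WA = [[w[i] * v for v in row] for i, row in enumerate(A)]   # W*A
    Wb = [w[i] * bi for i, bi in enumerate(b)]                  # W*b
    At = transpose(A)
    AtWA = matmul(At, WA)                                       # A'*W*A
    AtWb = [sum(At[r][i] * Wb[i] for i in range(len(Wb))) for r in range(len(At))]
    return solve(AtWA, AtWb)

# Made-up overdetermined system: 4 equations, 2 unknowns, consistent with x = [2, 3].
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]]
b = [2.0, 3.0, 5.0, 7.0]
w = [100.0, 1.0, 1.0, 1.0]   # weight the first equation heavily
print(weighted_lstsq(A, b, w))
```

In MATLAB the same effect comes from scaling each row of A and b by sqrt(w_i) and reusing the existing inv(A'*A)*A'*b line, so the original script barely changes.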
 
