Ax + b Least Squares Minimization Standard Form

SUMMARY

The discussion focuses on using the simplex method to minimize the Minimum Squares Percent Error (MSPE) equation, S(a,b) = Σ[(yi - (a + bxi))/(a + bxi)]^2, for a given dataset. Participants clarify that the simplex method is primarily designed for linear optimization problems and may not be suitable for this nonlinear regression task. Alternative methods, such as variations of Newton's Method, are recommended for minimizing nonlinear equations. The goal is to eventually implement a program that effectively utilizes these optimization techniques.

PREREQUISITES
  • Understanding of regression analysis and the MSPE equation.
  • Familiarity with the simplex method and its applications in linear optimization.
  • Knowledge of nonlinear optimization techniques, particularly Newton's Method.
  • Basic programming skills for implementing optimization algorithms.
NEXT STEPS
  • Research nonlinear optimization methods, focusing on variations of Newton's Method.
  • Explore the limitations of the simplex method in nonlinear contexts.
  • Learn about alternative algorithms for minimizing nonlinear equations, such as gradient descent.
  • Investigate programming libraries that facilitate optimization, such as SciPy in Python.
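As a concrete sketch of the SciPy route mentioned above, the objective S(a,b) from the thread can be minimized directly with `scipy.optimize.minimize` using the derivative-free Nelder-Mead method. The dataset and starting guess below are invented for illustration only:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data; y roughly follows 2 + 3x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([5.1, 7.9, 11.2, 13.8, 17.1])

def mspe(params):
    """S(a, b) = sum(((y_i - (a + b*x_i)) / (a + b*x_i))**2)."""
    a, b = params
    f = a + b * x
    return np.sum(((y - f) / f) ** 2)

# Nelder-Mead needs no derivatives; start from a rough guess.
result = minimize(mspe, x0=[1.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = result.x
```

Note that Nelder-Mead is sometimes called the "downhill simplex" method, but it is unrelated to the linear-programming simplex method discussed in the thread.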
USEFUL FOR

Data scientists, statisticians, and software developers interested in regression analysis and optimization techniques for nonlinear equations.

cook11
All -

Given a set of data {(xi, yi)| i = 1,2,...,m} and the regression equation f(x) = ax + b, I want to use the simplex method to minimize the equation Sigma [(yi - f(xi))/f(xi)]^2. However, I am stuck on how to initially organize the problem. I am not sure whether the equation, Sigma [(yi - f(xi))/f(xi)]^2, needs to be put into some sort of standard form or not. Also, I am having trouble comprehending how to turn the individual [(y - f(x))/f(x)]^2 equations into constraints.

The end goal for this is to turn it into a program. The simplex method should run sufficiently fast for the type of data I will be feeding the program. However, I first need to understand the logic before any code gets written.

Let me know if I'm not stating the problem clearly enough. Thank you for helping.
 
You are trying to minimize

$$S(a,b) = \sum_{i=1}^m \left(\frac{y_i - (a + bx_i)}{a + bx_i}\right)^2$$

is this correct? I'm not sure what this will give you - certainly not a regression result.

I'm not aware of any way to use the simplex method, which is designed for linear optimization problems, on a problem that is this far from being linear. (L1 regression problems can be reformulated as linear programs, but that is the closest case I know of.)
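To illustrate the parenthetical point about L1 regression, here is a sketch (not from the thread) of the standard reformulation as a linear program: introduce one slack variable t_i per data point with t_i >= |y_i - (a + b x_i)| expressed as two inequalities, and minimize the sum of the t_i with `scipy.optimize.linprog`. The data are made up for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data; y roughly follows 2x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])
m = len(x)

# Variables: [a, b, t_1, ..., t_m]; objective: minimize sum(t_i)
c = np.concatenate([[0.0, 0.0], np.ones(m)])

# t_i >= y_i - (a + b*x_i)   =>  -a - b*x_i - t_i <= -y_i
# t_i >= (a + b*x_i) - y_i   =>   a + b*x_i - t_i <=  y_i
A_ub = np.vstack([
    np.column_stack([-np.ones(m), -x, -np.eye(m)]),
    np.column_stack([np.ones(m), x, -np.eye(m)]),
])
b_ub = np.concatenate([-y, y])

# a and b are free; slacks are nonnegative
bounds = [(None, None), (None, None)] + [(0, None)] * m
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
a_hat, b_hat = res.x[:2]
```

At the optimum, each t_i equals the absolute residual, so `res.fun` is the minimal L1 error. This trick does not carry over to the squared, ratio-form objective in the thread, which is why a nonlinear method is needed there.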
 
statdad -

Correct. That's the right equation. It's the Minimum Squares Percent Error (MSPE) equation used in econometrics. I figured out there are a handful of algorithms/methods to minimize nonlinear equations. Variations of Newton's Method are fairly popular for this task. Thank you for your input!
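For readers landing here later: the Newton-type family mentioned above includes Gauss-Newton and its damped variant Levenberg-Marquardt, which `scipy.optimize.least_squares` provides. It only needs the residual vector r_i = (y_i - (a + b x_i))/(a + b x_i); S(a,b) is then the sum of squared residuals. A minimal sketch with made-up data:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical data; y roughly follows 2 + 3x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([5.1, 7.9, 11.2, 13.8, 17.1])

def residuals(params):
    """r_i = (y_i - (a + b*x_i)) / (a + b*x_i); S(a,b) = sum(r_i**2)."""
    a, b = params
    f = a + b * x
    return (y - f) / f

# method="lm" is Levenberg-Marquardt, a damped Gauss-Newton variant.
fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
a_hat, b_hat = fit.x
```

As with any local method, a reasonable starting guess matters; an ordinary least-squares fit of a and b is a common way to initialize.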
 
