Ax + b Least Squares Minimization Standard Form

In summary, the original poster wants to use the simplex method to minimize the MSPE objective for a set of data and a linear regression equation. They are unsure how to organize the problem and how to convert the equations into constraints. They also mention their end goal of turning this into a program, and note that there are alternative methods for minimizing nonlinear equations.
  • #1
cook11

Given a set of data {(xi, yi)| i = 1,2,...,m} and the regression equation f(x) = ax + b, I want to use the simplex method to minimize the equation Sigma [(yi - f(xi))/f(xi)]^2. However, I am stuck on how to initially organize the problem. I am not sure whether the equation, Sigma [(yi - f(xi))/f(xi)]^2, needs to be put into some sort of standard form or not. Also, I am having trouble comprehending how to turn the individual [(y - f(x))/f(x)]^2 equations into constraints.

The end goal for this is to turn it into a program. The simplex method should run sufficiently fast for the type of data I will be feeding the program. However, I first need to understand the logic before any code gets written.

Let me know if I'm not stating the problem clearly enough. Thank you for helping.
 
  • #2
You are trying to minimize

[tex]
S(a,b) = \sum_{i=1}^m {\left(\frac{y_i - (a x_i + b)}{a x_i + b}\right)^2}
[/tex]

is this correct? I'm not sure what this will give you - certainly not a regression result.

I'm not aware of any way to use the simplex method, which is designed for linear optimization problems, on a problem that is this far from linear. (L1 regression problems can be solved as linear programs, but that is the closest case I know of.)
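As a sketch of that L1 connection (assuming the same model f(x) = ax + b as above), least absolute deviations regression becomes a linear program once you introduce one slack variable e_i per residual:

[tex]
\min_{a,\,b,\,e} \; \sum_{i=1}^m e_i \quad \text{subject to} \quad -e_i \le y_i - (a x_i + b) \le e_i, \quad i = 1, \dots, m
[/tex]

At the optimum each e_i equals the absolute residual |y_i - (a x_i + b)|, since the minimization pushes each e_i down onto whichever bound is active.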
 
  • #3
statdad -

Correct. That's the right equation. It's the minimum squared percentage error (MSPE) objective used in econometrics. I found that there are a handful of algorithms/methods for minimizing nonlinear equations; variations of Newton's method are fairly popular for this task. Thank you for your input!
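As a concrete sketch of that kind of nonlinear minimization, the snippet below minimizes S(a, b) with plain gradient descent and central-difference gradients — a simpler stand-in for the Newton-type methods mentioned above. The step size, iteration count, and the assumption that f(x_i) = a·x_i + b stays nonzero along the search path are all illustrative choices, not part of the thread:

```python
def mspe(a, b, data):
    # S(a, b) = sum_i ((y_i - f(x_i)) / f(x_i))^2 with f(x) = a*x + b.
    # Assumes a*x + b != 0 at every data point.
    return sum(((y - (a * x + b)) / (a * x + b)) ** 2 for x, y in data)

def minimize_mspe(data, a0=1.0, b0=1.0, lr=0.02, steps=20000, h=1e-6):
    # Plain gradient descent with central-difference gradients; Newton-type
    # methods would converge faster but need the exact derivatives.
    a, b = a0, b0
    for _ in range(steps):
        ga = (mspe(a + h, b, data) - mspe(a - h, b, data)) / (2 * h)
        gb = (mspe(a, b + h, data) - mspe(a, b - h, data)) / (2 * h)
        a -= lr * ga
        b -= lr * gb
    return a, b

# Points drawn exactly from y = 2x + 1, so S has its minimum (zero) at (2, 1).
data = [(1, 3), (2, 5), (3, 7), (4, 9)]
a, b = minimize_mspe(data)  # should land near a = 2, b = 1
```

Note the division by f(x_i) in the objective: unlike ordinary least squares, this problem has no closed-form solution, which is why an iterative method is needed at all.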
 

1. What is the Ax + b Least Squares Minimization Standard Form?

The Ax + b Least Squares Minimization Standard Form is a mathematical technique used to find the best-fit line for a set of data points. This form is often used in regression analysis and is based on the principle of minimizing the sum of the squared differences between the actual data points and the predicted values on the line. It is expressed as y = ax + b, where a is the slope of the line and b is the y-intercept.

2. How is the least squares minimization method used in this form?

The least squares minimization method calculates the values of a and b in the Ax + b Least Squares Minimization Standard Form by finding the pair that minimizes the sum of the squared residuals, i.e., the squared differences between the observed y values and the values predicted by the line. The result is the line that is closest, in this squared-error sense, to all of the data points.
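For the linear model the minimizing a and b have a well-known closed form, obtained by setting the partial derivatives of the squared-error sum to zero. A minimal sketch (function name and sample numbers are illustrative):

```python
def least_squares_fit(xs, ys):
    # Ordinary least squares for y = a*x + b:
    #   a = cov(x, y) / var(x),  b = mean(y) - a * mean(x)
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    cov = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    var = sum((x - xbar) ** 2 for x in xs)  # assumes the xs are not all equal
    a = cov / var
    b = ybar - a * xbar
    return a, b

# Points that lie exactly on y = 2x + 1 are recovered exactly.
a, b = least_squares_fit([1, 2, 3, 4], [3, 5, 7, 9])  # a = 2.0, b = 1.0
```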

3. What types of data sets is this form suitable for?

The Ax + b Least Squares Minimization Standard Form is suitable for data sets that have a linear relationship between the independent variable, x, and the dependent variable, y. This means that the data points can be approximately plotted on a straight line, and the form can be used to find the best-fit line for these data points. It is commonly used in fields such as economics, finance, and engineering.

4. Are there any limitations to using this form?

One limitation of the Ax + b Least Squares Minimization Standard Form is that it assumes a linear relationship between the variables, so it may not be suitable for data sets with more complex, nonlinear relationships. In addition, because squaring the errors weights large deviations heavily, the best-fit line can be pulled far from the bulk of the data by outliers or influential points.
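The outlier sensitivity is easy to see with a toy example (the numbers below are made up for illustration): compare the fitted slope with and without a single extreme point.

```python
def fit_slope(xs, ys):
    # Least-squares slope: cov(x, y) / var(x).
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    cov = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    var = sum((x - xbar) ** 2 for x in xs)
    return cov / var

# Four points exactly on y = x, then the same trend plus one extreme outlier.
clean = fit_slope([0, 1, 2, 3], [0, 1, 2, 3])            # 1.0
skewed = fit_slope([0, 1, 2, 3, 4], [0, 1, 2, 3, 100])   # ≈ 20.2
```

One extreme point moves the slope by an order of magnitude, which is why checking for outliers (or using a robust fit such as L1 regression) is worthwhile before trusting the line.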

5. How is this form different from other regression techniques?

The Ax + b Least Squares Minimization Standard Form is just one type of regression technique used to find the best-fit line for a set of data points. It differs from other techniques in that it specifically minimizes the sum of squared residuals, while other techniques may use different methods to determine the best-fit line. Additionally, this form is suitable for linear data sets, while other techniques may be more appropriate for different types of data, such as quadratic or exponential relationships.
