Quadratic Regression calculation

SUMMARY

The discussion focuses on calculating quadratic regression by hand, specifically finding a parabola of the form f(x) = ax² + bx + c that minimizes total square errors for a given dataset (x, y). Participants emphasize the importance of understanding the derivation of linear least squares regression formulas, which involves minimizing the function G(A, B) = Σ(Axᵢ + B - yᵢ)². The quadratic regression follows a similar approach, requiring the minimization of a function G(A, B, C) involving three variables.

PREREQUISITES
  • Understanding of linear regression concepts and formulas
  • Familiarity with the method of least squares
  • Knowledge of partial derivatives and simultaneous equations
  • Ability to work with summation notation (Σ)
NEXT STEPS
  • Study the derivation of linear least squares regression formulas
  • Learn how to minimize functions of multiple variables in calculus
  • Explore the application of quadratic regression in statistical analysis
  • Practice calculating quadratic regression using sample datasets
USEFUL FOR

Students learning statistics, data analysts, and anyone interested in understanding regression analysis techniques.

pyfgcr
Hi, I'm learning statistics. Does anyone know how to calculate quadratic regression by hand? That is: given a data set (x, y), find a parabola f(x) = ax^2 + bx + c that minimizes the total squared error.
I already know how to calculate linear regression.
Thanks in advance.
 
Write out the function that is to be minimized and then write its partial derivatives. You get a system of simultaneous linear equations. If you cannot visualize this using the summation notation Σ, then try making up 4 (x, y) data pairs and working it out.
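The suggestion above can be sketched numerically. Here is a minimal example for the linear case (the four data pairs are made up for illustration): it forms the 2×2 system that results from setting the partial derivatives of the squared-error sum to zero, then solves it.

```python
import numpy as np

# Four made-up (x, y) data pairs, as the post suggests trying.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])
n = len(x)

# Minimizing G(A, B) = sum((A*x_i + B - y_i)^2) and setting the two
# partial derivatives to zero gives the 2x2 "normal equations":
#   A*sum(x^2) + B*sum(x) = sum(x*y)
#   A*sum(x)   + B*n      = sum(y)
M = np.array([[np.sum(x**2), np.sum(x)],
              [np.sum(x),    n        ]])
rhs = np.array([np.sum(x * y), np.sum(y)])

A, B = np.linalg.solve(M, rhs)
print(A, B)  # slope and intercept of the least-squares line
```

The same answer can be cross-checked against `np.polyfit(x, y, 1)`, which fits the same least-squares line.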
 
A system of simultaneous linear equation: ax^2 + bx + c , derivative: 2ax + b ?
I don't really understand
 
pyfgcr said:
A system of simultaneous linear equation: ax^2 + bx + c , derivative: 2ax + b ?
I don't really understand

Perhaps you haven't studied how the formulas for linear least squares regression are derived.

In linear regression there are n data points {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)}. The function to be minimized is G(A,B) = Σ_{i=1}^n (A x_i + B - y_i)^2, and deriving the formulas involves taking the partial derivatives of G(A,B) with respect to each of A and B and setting them equal to zero, which yields two simultaneous linear equations. Look up how that is done.
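Carrying out those partial derivatives explicitly (a standard derivation, spelled out here for reference) gives the two simultaneous linear equations in A and B:

```latex
\begin{aligned}
\frac{\partial G}{\partial A} &= \sum_{i=1}^n 2 x_i (A x_i + B - y_i) = 0
&&\Rightarrow\quad A\sum_{i=1}^n x_i^2 + B\sum_{i=1}^n x_i = \sum_{i=1}^n x_i y_i,\\
\frac{\partial G}{\partial B} &= \sum_{i=1}^n 2 (A x_i + B - y_i) = 0
&&\Rightarrow\quad A\sum_{i=1}^n x_i + Bn = \sum_{i=1}^n y_i.
\end{aligned}
```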

The method for the quadratic is similar. It involves minimizing a function of three variables, G(A,B,C) = Σ_{i=1}^n (A x_i^2 + B x_i + C - y_i)^2, whose three partial derivatives give three simultaneous linear equations.
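The quadratic case can be sketched the same way (the data below is made up for illustration): setting the three partial derivatives of G(A, B, C) to zero gives a 3×3 linear system, which can be solved directly and cross-checked against NumPy's built-in polynomial fit.

```python
import numpy as np

# Made-up sample data for illustration.
x = np.array([-1.0, 0.0, 1.0, 2.0, 3.0])
y = np.array([ 3.2, 1.1, 0.9, 3.1, 7.0])
n = len(x)

# Setting the three partial derivatives of
# G(A, B, C) = sum((A*x_i^2 + B*x_i + C - y_i)^2) to zero
# gives three simultaneous linear equations (the normal equations):
M = np.array([
    [np.sum(x**4), np.sum(x**3), np.sum(x**2)],
    [np.sum(x**3), np.sum(x**2), np.sum(x)  ],
    [np.sum(x**2), np.sum(x),    n          ],
])
rhs = np.array([np.sum(x**2 * y), np.sum(x * y), np.sum(y)])

A, B, C = np.linalg.solve(M, rhs)
print(A, B, C)              # coefficients of the least-squares parabola
print(np.polyfit(x, y, 2))  # same coefficients from NumPy's built-in fit
```

`np.polyfit(x, y, 2)` returns the coefficients in the same order (highest degree first), so the two printed lines should agree.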
 
