Regressions without calculator

  • Context: High School
  • Thread starter: jmsdg7
  • Tags: Calculator
SUMMARY

This discussion focuses on performing regression analysis without a calculator, specifically in the context of AP Calculus. The original poster asks how to derive a line of best fit from more than two points, and how to fit higher-degree polynomials. Key methods mentioned include the least squares method and the matrix approach for calculating the parameters a and b in the linear model y = a + bx + u. The discussion emphasizes minimizing the sum of squared differences between the observed values and the model predictions, ∑[f(x) - q(x)]².

PREREQUISITES
  • Understanding of linear equations and their derivation from two points
  • Familiarity with polynomial degrees: linear, quadratic, cubic, and quartic
  • Knowledge of the Least Squares method for regression analysis
  • Basic matrix operations and concepts in linear algebra
NEXT STEPS
  • Study the Least Squares method in detail for multiple regression
  • Learn about polynomial regression and its applications
  • Explore matrix algebra, specifically the calculation of inverses and determinants
  • Investigate statistical software tools like R or Python's NumPy for regression analysis
USEFUL FOR

High school students studying AP Calculus, educators teaching regression analysis, and anyone interested in understanding the mathematical foundations of regression without relying on calculators.

jmsdg7
I'm a senior in high school taking AP Calculus. I've done regressions here and there with the calculator, but I was wondering how to do it without the calculator. I obviously know how to get a linear equation out of 2 points, but how is it done with more points and higher degrees?

I understand that you need 2 points for a line, 3 for a quadratic, 4 for a cubic, and 5 for a quartic, but I was hoping someone could show me how it is done.
 
The simplest method I know is least squares; there are many others.

I can show you the general form for any number of points and any type of equation, but it takes a little hard work and algebra.

In short, you have to find the parameters that minimize the sum \sum [f(x)-q(x)]^2, where q(x) is the model function you are fitting and f(x) are the given data values.

I'll try to post something more elaborate this week.
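
In the meantime, here is a minimal sketch of the idea in Python; the data values and the line q(x) = a + bx are made up for illustration, not taken from this thread. Setting the partial derivatives of the sum to zero and solving gives the usual normal equations:

```python
def sse(a, b, xs, ys):
    """The sum of squared differences sum[f(x) - q(x)]^2 for q(x) = a + b*x."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

xs = [1.0, 2.0, 3.0, 4.0]   # sample x values (hypothetical)
ys = [2.1, 3.9, 6.2, 7.8]   # observed f(x) values (hypothetical)

# Setting d(SSE)/da = 0 and d(SSE)/db = 0 yields the normal equations,
# whose hand-solvable solution in terms of raw sums is:
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a = (sy - b * sx) / n

print(a, b, sse(a, b, xs, ys))
```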
 
Let the model be y = a + b x + u. Parameters of the model are a and b, u is the error term.

Variables y, x (and u) are each N-by-1 vectors.

Let X = [1 x] be an N-by-2 matrix. The first column of X is a vector of 1's. The second column of X is identical to vector x.

Let Z be the inverse of X'X. Z is a 2-by-2 matrix.

Then we can write [a b]' = ZX'y, which is 2-by-1. Parameter a is the first (top) element of ZX'y. Parameter b is the second element of ZX'y.

For higher order polynomials, substitute [x x^2 x^3 ...] for x, and [b1 b2 b3 ...] for b.
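
A minimal NumPy sketch of this matrix recipe (the data values here are hypothetical):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical data
y = np.array([2.1, 3.9, 6.2, 7.8])

# X = [1 x]: first column all 1's, second column identical to x.
X = np.column_stack([np.ones_like(x), x])

# [a b]' = ZX'y with Z the inverse of X'X.  Solving the linear system
# directly is numerically safer than forming Z explicitly.
a, b = np.linalg.solve(X.T @ X, X.T @ y)
print(a, b)

# Higher-order polynomial: substitute [x x^2 x^3 ...] for x.
X3 = np.column_stack([np.ones_like(x), x, x**2, x**3])
print(np.linalg.solve(X3.T @ X3, X3.T @ y))   # [a b1 b2 b3]'
```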
 
Of course, for simple regression, the matrix approach mentioned above is equivalent to the following equations:
The slope is given by

b = \frac{\sum (x - \bar x)(y - \bar y)}{\sum (x - \bar x)^2}

and the y-intercept is given by

a = \bar y - b \bar x
 
