
Regressions without calculator

  1. Feb 9, 2010 #1
    I'm a senior in high school taking AP Calculus. I've done regressions here and there with the calculator, but I was wondering how to do it without the calculator. I obviously know how to get a linear equation out of 2 points, but how is it done with more points and higher degrees?

    I understand that you need 2 points for a line, 3 for a quadratic, 4 for a cubic and 5 for a quartic, but I was hoping someone could show me how it is done. :confused:
     
  2. Feb 9, 2010 #2
    The simplest method I know is least squares; there are many others.

    I can show you the general form for any number of points and any type of equation, but it takes a little hard work and algebra.

    In short terms, you have to find the parameters that minimize the sum of squared differences [tex]\sum_i [f(x_i)-q(x_i)]^2[/tex], where [tex]q(x)[/tex] is the function you are fitting (the one whose parameters you adjust), and [tex]f(x_i)[/tex] are the data values you are given.
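
    As a concrete illustration (not part of the original post), here is a minimal Python sketch of this criterion using made-up data points; the regression coefficients are whichever parameters make this sum smallest:

    [code]
    # Least-squares criterion: sum of squared differences between the given
    # points f(x_i) and a candidate fitting function q(x) = a + b*x.
    # The data and the two candidate lines below are made up for illustration.
    xs = [1.0, 2.0, 3.0, 4.0]
    fs = [2.1, 3.9, 6.2, 7.8]

    def sum_sq_error(a, b):
        """Sum of [f(x_i) - q(x_i)]^2 for the candidate line q(x) = a + b*x."""
        return sum((f - (a + b * x)) ** 2 for x, f in zip(xs, fs))

    print(sum_sq_error(0.1, 1.95))  # a line close to the trend -> small sum
    print(sum_sq_error(1.0, 1.0))   # a worse line -> larger sum
    [/code]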

    I'll try to post something more elaborate this week.
     
  3. Feb 9, 2010 #3

    EnumaElish

    Science Advisor
    Homework Helper

    Let the model be y = a + bx + u. The parameters of the model are a and b; u is the error term.

    Variables y, x (and u) are each N-by-1 vectors.

    Let X = [1 x] be the N-by-2 matrix. The first column of X is a vector of 1's. The second column of X is identical to vector x.

    Let Z be the inverse of X'X. Z is a 2-by-2 matrix.

    Then we can write [a b]' = ZX'y, which is 2-by-1. Parameter a is the first (top) element of ZX'y. Parameter b is the second element of ZX'y.

    For higher order polynomials, substitute [x x^2 x^3 ...] for x, and [b1 b2 b3 ...] for b.
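
    As a sketch (not from the original post), the matrix formula above can be checked numerically with NumPy. The data below are made up; in practice np.linalg.solve or np.linalg.lstsq is preferred over explicitly inverting X'X, but the explicit form mirrors the steps described here:

    [code]
    import numpy as np

    # Made-up data for illustration.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    X = np.column_stack([np.ones_like(x), x])  # N-by-2 matrix [1 x]
    Z = np.linalg.inv(X.T @ X)                 # Z = (X'X)^{-1}, 2-by-2
    a, b = Z @ X.T @ y                         # [a b]' = Z X' y
    print(a, b)

    # Higher-order polynomial: append columns x^2, x^3, ... to X.
    Xq = np.column_stack([np.ones_like(x), x, x**2])  # quadratic model
    coeffs = np.linalg.inv(Xq.T @ Xq) @ Xq.T @ y      # [a b1 b2]'
    print(coeffs)
    [/code]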
     
  4. Feb 10, 2010 #4

    statdad

    Homework Helper

    Of course, for simple regression, the matrix approach mentioned above is equivalent to the following equations:
    The slope is given by

    [tex]
    b = \frac{\sum{(x-\bar x)(y - \bar y)}}{\sum (x-\bar x)^2}
    [/tex]

    and the y-intercept is given by

    [tex]
    a = \bar y - b \bar x
    [/tex]
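
    As an illustration (not from the original post), these two formulas are easy to evaluate directly by hand or in plain Python; the data below are made up, and with the same numbers they give the same a and b as the matrix formula above:

    [code]
    # Made-up data for illustration.
    xs = [1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [2.1, 3.9, 6.2, 7.8, 10.1]

    x_bar = sum(xs) / len(xs)
    y_bar = sum(ys) / len(ys)

    # Slope: b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
    b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))

    # Intercept: a = y_bar - b * x_bar
    a = y_bar - b * x_bar

    print(a, b)
    [/code]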
     