
Least squares - cubic?

  1. Nov 13, 2007 #1
    Hi.

    I know how to find the coefficients of a quadratic with a data set using least squares.
    I would now like to do the same to find polynomials of 3rd and 4th degree.

    Could someone help me with this?
    The data set I am using:
    (-4.5,0.7)
    (-3.2,2.3)
    (-1.4,3.8)
    (0.8,5)
    (2.5,5.5)
    (4.1,5.6)

    Thanks
     
  3. Nov 13, 2007 #2

    Ben Niehoff

    Science Advisor
    Gold Member

    For a general polynomial interpolating function f of degree D, you have

    [tex]f(x) = \sum_{k=0}^D a_k x^k[/tex]

    Now, for your N data points [itex](x_i, y_i)[/itex], you want to minimize the squares of the difference between your interpolating function and the true [itex]y_i[/itex] values; i.e., you want to minimize the following:

    [tex]S = \sum_{i=1}^N(f(x_i) - y_i)^2[/tex]

    Now, f(x) can be considered to be a function of the parameters [itex]a_k[/itex], and thus S can also be considered to be a function of [itex]a_k[/itex]. We want to find the [itex]a_k[/itex] that minimize S. We note that S is positive and quadratic in each of the [itex]a_k[/itex], so we know that we can minimize S by setting all of its first partial derivatives equal to zero:

    [tex]{\partial S \over \partial a_n} = 0 \quad \text{for all } n[/tex]

    Expanding the partial derivative:

    [tex]{\partial S \over \partial a_n} = {\partial \over \partial a_n} \left[ \sum_{i=1}^N(f(x_i) - y_i)^2 \right] = \sum_{i=1}^N {\partial \over \partial a_n} \left[(f(x_i) - y_i)^2\right] = \sum_{i=1}^N\left[2(f(x_i) - y_i)\left.{\partial f \over \partial a_n}\right|_{x_i}\right] = 0[/tex]

    We note that for a particular n,

    [tex]{\partial f \over \partial a_n} = x^n[/tex]

    And so,

    [tex]{\partial S \over \partial a_n} = \sum_{i=1}^N\left[2(f(x_i) - y_i)\left.{\partial f \over \partial a_n}\right|_{x_i}\right] = \sum_{i=1}^N\left[ 2\left(\sum_{k=0}^D a_k x_i^k - y_i\right)x_i^n \right] = 0[/tex]

    If you expand out this sum, you will obtain D+1 linear equations in the D+1 unknowns [itex]a_k[/itex] (i.e., if D=3 for a 3rd-degree polynomial, then you'll get 4 equations for your 4 unknown polynomial coefficients). Then you just have to solve this system of linear equations.
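A minimal sketch of this system in Python/NumPy (my own illustration; the thread itself contains no code), using the data set from the first post. Set D = 4 for the quartic fit:

```python
import numpy as np

# The six data points from the original post.
x = np.array([-4.5, -3.2, -1.4, 0.8, 2.5, 4.1])
y = np.array([0.7, 2.3, 3.8, 5.0, 5.5, 5.6])

D = 3  # degree of the fitting polynomial

# Build the (D+1) x (D+1) system dS/da_n = 0 described above:
#   sum_i (sum_k a_k x_i^k - y_i) x_i^n = 0   for n = 0..D
# which rearranges to  M a = b  with
#   M[n, k] = sum_i x_i^(n+k)   and   b[n] = sum_i y_i x_i^n.
M = np.array([[np.sum(x**(n + k)) for k in range(D + 1)]
              for n in range(D + 1)])
b = np.array([np.sum(y * x**n) for n in range(D + 1)])

a = np.linalg.solve(M, b)  # a[k] is the coefficient of x^k
```

Here `a[0]` is the constant term and `a[D]` the leading coefficient.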
     
    Last edited: Nov 13, 2007
  4. Nov 13, 2007 #3
    Wow, it has been a while since I did my A-Level Statistics.
    Could you help me up to the point of solving the 4 linear equations?

    Could you show me an example / point to a link that shows how to get to the point of having the 4 linear equations?

    Thanks
     
  5. Nov 13, 2007 #4

    Ben Niehoff

    Science Advisor
    Gold Member

    For a third-degree polynomial, these are your equations:

    [tex]\sum_{i=1}^N 2 \left( \sum_{k=0}^3 a_kx_i^k - y_i \right) = 0[/tex]

    [tex]\sum_{i=1}^N \left[ 2 \left( \sum_{k=0}^3 a_kx_i^k - y_i \right) x_i \right] = 0[/tex]

    [tex]\sum_{i=1}^N \left[ 2 \left( \sum_{k=0}^3 a_kx_i^k - y_i \right) x_i^2 \right] = 0[/tex]

    [tex]\sum_{i=1}^N \left[ 2 \left( \sum_{k=0}^3 a_kx_i^k - y_i \right) x_i^3 \right] = 0[/tex]

    Now, merely expand out each sum for your N data points, plug in each of your [itex]x_i, y_i[/itex], and solve for [itex]a_0, a_1, a_2, a_3[/itex]. Then your polynomial will be given by

    [tex]f(x) = a_3 x^3 + a_2 x^2 + a_1 x + a_0[/tex]

    Note that you can get rid of the extra factors of 2 by dividing them out of each equation. :P
     
  6. Nov 15, 2007 #5
    Hi, I am really sorry. I am not following.
    I don't quite understand the equations.

    Could you put my given data set into one of these 4 equations and show me the working?

    Thanks
     
  7. Nov 20, 2007 #6
    Can anyone help me with this?
     
  8. Nov 20, 2007 #7

    Ben Niehoff

    Science Advisor
    Gold Member

    All you do is plug in numbers. I haven't done it for you, because it's tedious. What part of it don't you understand?
     
  9. Nov 21, 2007 #8
    I just don't understand how I plug the numbers in.
    I don't remember how the sigma sign works.

    Could you show me how you get to the first linear equation?
     
  10. Nov 21, 2007 #9

    Ben Niehoff

    Science Advisor
    Gold Member

    [tex]\sum_{k=M}^N g(k)[/tex]

    means that you take the sum of g(k) for all k between M and N:

    [tex]g(M) + g(M+1) + g(M+2) + \ldots + g(N-1) + g(N)[/tex]
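If it helps to see the sigma sign operationally, here is a small Python sketch (my own, not from the thread) that spells it out as a plain loop:

```python
# Sigma notation as a loop: the sum of g(k) for all k from M to N inclusive.
def sigma(g, M, N):
    total = 0
    for k in range(M, N + 1):  # range(M, N + 1) covers k = M, M+1, ..., N
        total += g(k)
    return total

# Example: sum of k^2 for k = 1..4 is 1 + 4 + 9 + 16 = 30.
result = sigma(lambda k: k**2, 1, 4)
```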
     
  11. Nov 21, 2007 #10

    uart

    Science Advisor

    Maybe the double summation is confusing you, Ian. The equations might look a little simpler if you note that the inner sum is nothing more than the polynomial evaluated at [itex]x_i[/itex].

    That is,

    [tex]P(x_i) = \sum_{k=0}^3 a_kx_i^k[/tex]

    If you make this substitution you can think of it as simply forming the set of four linear simultaneous equations given by:

    [tex]\sum_{i=1}^N x_i^j \left(P(x_i) - y_i \right) = 0[/tex]

    for each j in 0..3
     
    Last edited: Nov 21, 2007
  12. Nov 21, 2007 #11
    Sorry, I am really confused by this - without any examples with data sets I just can't get my head round it.
     
  13. Nov 21, 2007 #12

    rcgldr

    Homework Helper

    You can avoid having to solve linear equations by using orthogonal polynomials.

    Link to a rich text document with sample code:

    opls.rtf
     
  14. Nov 22, 2007 #13

    uart

    Science Advisor

    Well, this is pretty tedious, so I don't think anyone is too keen to write out a big long numerical example, but here's a simplified example for D=1 and N=3 (first-order polynomial and only three data points). Hopefully it will help you understand how to apply the above equations in general.

    Take the three points as:
    (1.0, 2.0)
    (2.0, 3.1)
    (3.0, 3.9).


    We are using a first order fit so, P(x) = ax + b and we need to determine parameters "a" and "b".

    So we'll have 2 linear simultaneous equations, which we get from writing

    [tex]\sum_{i=1}^N x_i^j \left(P(x_i) - y_i \right) = 0[/tex]

    for each j in 0..D (remember that D=1 in this example because it's first order), so that is one equation for each of j=0 and j=1.

    Eqn 1. (j=0) :

    (a*1 + b - 2) + (a*2 + b -3.1) + (a*3 + b -3.9) = 0

    Which simplifies to : 6a + 3b = 9


    Eqn 2. (j=1)

    1(a*1 + b - 2) + 2(a*2 + b -3.1) + 3(a*3 + b -3.9) = 0

    Which simplifies to : 14a + 6b = 19.9


    So you find the straight line of best fit (first-order polynomial) by solving those two simultaneous equations.

    6a + 3b = 9
    14a + 6b = 19.9


    These give a=0.95 and b=1.1, so the equation is y = 0.95x + 1.1
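As a sanity check (my own sketch, not part of the original post), the same pair of normal equations can be solved in Python/NumPy:

```python
import numpy as np

# The three data points from the worked example:
# (1.0, 2.0), (2.0, 3.1), (3.0, 3.9), fitted with P(x) = a*x + b.

# The two simultaneous equations derived above:
#    6a + 3b = 9
#   14a + 6b = 19.9
M = np.array([[6.0, 3.0],
              [14.0, 6.0]])
rhs = np.array([9.0, 19.9])

a, b = np.linalg.solve(M, rhs)  # expect a = 0.95, b = 1.1
```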

    BTW, do you have access to any programs like Matlab or Octave (freeware) to help manipulate large amounts of data? You can make this process very easy, even for higher-order examples, with such programs.
     
    Last edited: Nov 22, 2007
  15. Nov 22, 2007 #14
    So if I were doing this for 2nd degree, where would my c coefficient come from?
     
  16. Nov 22, 2007 #15

    uart

    Science Advisor

    It would come quite naturally when you write P(x) for a given x during the above procedure.

    When you write P(x) with known values of the coefficients but with x as the variable, it's a polynomial (quadratic, cubic, etc.) in x. However, when you follow the above procedure you are writing P(x) with unknown coefficients but a given value of x; in this case it is nothing other than a linear function of the coefficients! The solution above shows you how to set up the correct number of these linear equations in order to solve for the unknown coefficients. Just bite the bullet and have a try; I'm sure it will start to make sense once you start doing it.
     
    Last edited: Nov 22, 2007
  17. Nov 24, 2007 #16
    The above example makes perfect sense for determining a straight regression line.
    I can't see how to expand this to get a quadratic or even 3rd or 4th degree polynomial regression line.
     
  18. Nov 24, 2007 #17
    An example I found:
    Example: find the least-squares quadratic of best fit for the following data.
    x: 1 2 3 4
    y: 1 2 2 2

    n = 4
    Sum of x = 10
    Sum of x^2 = 30
    Sum of x^3 = 100
    Sum of x^4 = 354
    Sum of y = 7
    Sum of xy = 19
    Sum of x^2 y = 59

    so..

    4a + 10b + 30c =7
    10a + 30b + 100c = 19
    30a + 100b + 354c = 59.

    solving gives:
    y= a+bx+cx^2
    y = -0.25 + 1.55x - 0.25x^2
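The same system can be checked in Python/NumPy (a sketch of my own, not from the thread). For a cubic or quartic the matrix simply grows: a 4x4 system with power sums up to Σx^6 and right-hand side entries up to Σx^3 y, or a 5x5 system with sums up to Σx^8 and Σx^4 y.

```python
import numpy as np

# Data from the example: x = 1..4, y = 1, 2, 2, 2,
# fitted with y = a + b*x + c*x^2.

# The three normal equations built from the power sums above:
M = np.array([[4.0, 10.0, 30.0],
              [10.0, 30.0, 100.0],
              [30.0, 100.0, 354.0]])
rhs = np.array([7.0, 19.0, 59.0])

a, b, c = np.linalg.solve(M, rhs)  # expect a = -0.25, b = 1.55, c = -0.25
```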

    Can anyone help me deduce how I would be able to get values for d and e, so that I can have equations in the form:
    y = a + bx + cx^2 + dx^3
    y = a + bx + cx^2 + dx^3 + ex^4
     
  19. Nov 26, 2007 #18

    rcgldr

    Homework Helper

    You have 4 values, which will determine a cubic exactly: just plug in the 4 sets of values for x and y, then solve the 4 linear equations. You need more than 4 points for a quartic equation. I've already posted a link to a document that includes an algorithm and code for implementing polynomial curve fitting based on orthogonal polynomials (the orthogonal polynomials are an intermediate step; an ordinary polynomial is generated at the end).
     
  20. Nov 26, 2007 #19

    Ben Niehoff

    Science Advisor
    Gold Member

    I think I see the problem here...Ian has been taught a cookie-cutter method in class without any of the theory behind it, and rather predictably, he is not able to see how to extend the method to other situations.

    Ian, let's start with Uart's equation, because I think it is easier to read than mine:

    [tex]\sum_{i=1}^N x_i^j \left(P(x_i) - y_i \right) = 0[/tex]

    (Remember, this represents D+1 different equations, one for each value of j from 0 to D; here D=3, so there are 4 equations.)

    Now, watch as I multiply some things and rearrange terms:

    [tex]\sum_{i=1}^N \left(x_i^j P(x_i) - x_i^j y_i \right) = 0[/tex]

    [tex]\sum_{i=1}^N x_i^j P(x_i) - \sum_{i=1}^N x_i^j y_i = 0[/tex]

    [tex]\sum_{i=1}^N x_i^j P(x_i) = \sum_{i=1}^N x_i^j y_i[/tex]

    Now, recall that

    [tex]P(x_i) = \sum_{k=0}^3 a_kx_i^k[/tex]

    So, we can just make this substitution for P(x_i):

    [tex]\sum_{i=1}^N x_i^j \left( \sum_{k=0}^3 a_kx_i^k \right) = \sum_{i=1}^N x_i^j y_i[/tex]

    Now, doing some more algebra:

    [tex]\sum_{i=1}^N \sum_{k=0}^3 \left(x_i^j a_k x_i^k \right) = \sum_{i=1}^N x_i^j y_i[/tex]

    [tex]\sum_{i=1}^N \sum_{k=0}^3 \left(a_k x_i^{j+k} \right) = \sum_{i=1}^N x_i^j y_i[/tex]

    Finally, we can interchange the order of the summations on the left side, to get:

    [tex]\sum_{k=0}^3 \sum_{i=1}^N \left(a_k x_i^{j+k} \right) = \sum_{i=1}^N x_i^j y_i[/tex]

    [tex]\sum_{k=0}^3 \left(a_k \sum_{i=1}^N x_i^{j+k} \right) = \sum_{i=1}^N x_i^j y_i[/tex]

    This should be in a form similar to the equations you've been using for the linear and quadratic cases.
     
  21. Nov 28, 2007 #20

    hotvette

    Homework Helper

    If I understand correctly, you want to fit a cubic polynomial to six data points [itex](x_i, y_i)[/itex]. The equation to be solved (for f) is [itex]A^TAf = A^Tg[/itex], where:

    [tex]y = a + bx + cx^2 + dx^3[/tex] is the fitting function.

    [tex]A = \left[\begin{matrix}1 & x_1 & x_1^2 & x_1^3 \\ 1 & x_2 & x_2^2 & x_2^3 \\ 1 & x_3 & x_3^2 & x_3^3 \\ 1 & x_4 & x_4^2 & x_4^3 \\ 1 & x_5 & x_5^2 & x_5^3 \\ 1 & x_6 & x_6^2 & x_6^3 \end{matrix}\right] \ \ \ f = \left[\begin{matrix}a \\ b \\ c \\ d \end{matrix}\right] \ \ \ g = \left[\begin{matrix}y_1 \\ y_2 \\ y_3 \\ y_4 \\ y_5 \\ y_6 \end{matrix}\right][/tex]
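In code, this matrix formulation is very compact. Here is a NumPy sketch (my own illustration, not from the post) using the six data points from the first post; `np.vander` builds the design matrix A directly:

```python
import numpy as np

# Six data points from the original post.
x = np.array([-4.5, -3.2, -1.4, 0.8, 2.5, 4.1])
g = np.array([0.7, 2.3, 3.8, 5.0, 5.5, 5.6])

# Design matrix with columns 1, x, x^2, x^3 (one row per data point).
A = np.vander(x, 4, increasing=True)

# Solve the normal equations  A^T A f = A^T g  for f = (a, b, c, d).
f = np.linalg.solve(A.T @ A, A.T @ g)
```

For real-world problems a least-squares solver such as `np.linalg.lstsq(A, g)` is usually preferred over forming A^T A explicitly, since it is numerically better conditioned, but the result here is the same.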
     
    Last edited: Nov 28, 2007