Minimizing square of deviation / curve fitting

exmachina

Homework Statement



Given some data set, (x, y), fit it to the curve y = bx^2 + a by minimizing the square of the deviation. The use of matrices is preferred.

Homework Equations


The squared deviation for the ith data point is simply:

d_i^2 = (y_i - y(x_i))^2 = (y_i - bx_i^2 - a)^2

The Attempt at a Solution



If I understand correctly, I want to minimize \sum{d_i^2} by differentiating with respect to a and b and setting the derivatives to zero to find the minimum. So far I have differentiated to yield the following system of equations:

\sum_{i=1}^n y_i = an + b\sum_{i=1}^n x_i^2

\sum_{i=1}^n x_i^2 y_i = a\sum_{i=1}^n x_i^2 + b\sum_{i=1}^n x_i^4

Now I'm deciding between Cramer's Rule, LU decomposition, or Gaussian elimination to solve it. Cramer's Rule is the easiest to implement, but I don't know how much slower it will be compared to LU/GE. I have around 300 data points.
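A minimal sketch of what I mean, assuming Python with NumPy; the data arrays and variable names (x, y, Sx2, etc.) are placeholders for my actual 300 points:

```python
import numpy as np

# Synthetic example data (stand-in for the ~300 measured points).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 300)
y = 0.5 + 2.0 * x**2 + rng.normal(scale=1.0, size=x.size)

n = x.size
Sx2  = np.sum(x**2)      # sum of x_i^2
Sx4  = np.sum(x**4)      # sum of x_i^4
Sy   = np.sum(y)         # sum of y_i
Sx2y = np.sum(x**2 * y)  # sum of x_i^2 * y_i

# Normal equations as a 2x2 linear system  M @ [a, b] = v
M = np.array([[n,   Sx2],
              [Sx2, Sx4]])
v = np.array([Sy, Sx2y])

a, b = np.linalg.solve(M, v)  # LU-based solver; trivial work for a 2x2 system
print(a, b)
```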
 
It doesn't really matter which method you use: there are only two unknowns, a and b, so it's a 2x2 system. Once you have computed the summations of the various powers of your data points, it's easy.
 
Solving the system using Cramer's rule is very simple.

a = \frac{\sum x_i^4 \sum y_i - \sum x_i^2 \sum x_i^2 y_i}{n \sum x_i^4 - \left(\sum x_i^2\right)^2}


b = \frac{n \sum x_i^2 y_i - \sum x_i^2 \sum y_i}{n \sum x_i^4 - \left(\sum x_i^2\right)^2}

and then just calculate the sums using a spreadsheet.
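For completeness, here is a sketch of the same fit using those explicit Cramer's-rule formulas, so no matrix routine is needed at all (Python assumed; the sum variable names are mine, and the synthetic data stands in for the real points):

```python
import numpy as np

# Example data (replace with the actual measurements).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 300)
y = 0.5 + 2.0 * x**2 + rng.normal(scale=1.0, size=x.size)

n = x.size
Sx2, Sx4 = np.sum(x**2), np.sum(x**4)
Sy, Sx2y = np.sum(y), np.sum(x**2 * y)

det = n * Sx4 - Sx2**2           # determinant of the 2x2 system
a = (Sx4 * Sy - Sx2 * Sx2y) / det
b = (n * Sx2y - Sx2 * Sy) / det
print(a, b)
```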
 