exmachina
## Homework Statement

Given some data set (x_i, y_i), fit the curve [tex]y=bx^2+a[/tex] by minimizing the sum of the squared deviations. Using matrices is preferred.

## Homework Equations

The squared deviation for the ith data point is simply:

[tex]d_i^2=(y_i-y(x_i))^2=(y_i-bx_i^2-a)^2[/tex]

## The Attempt at a Solution

If I understand correctly, I want to minimize [tex]\sum{d_i^2}[/tex] by differentiating with respect to a and b and setting the derivatives to zero to find the minimum. So far, differentiating yields the following system of equations:

[tex]\sum_{i}^n y_i = an + b\sum_{i}^n x_i^2 [/tex]

[tex]\sum_{i}^n{x_i^2y_i}=a\sum_{i}^n x_i^2 + b \sum_{i}^n x_i^4[/tex]
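In matrix form this system is M [a, b]^T = v, where M holds n and the sums of powers of x_i. A minimal sketch of setting it up and solving it numerically, assuming Python with NumPy and using made-up sample data (a = 1, b = 2 plus small noise) just to illustrate:

```python
import numpy as np

# Hypothetical sample data standing in for the real (x, y) data set:
# y = 2x^2 + 1 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 300)
y = 2.0 * x**2 + 1.0 + rng.normal(scale=0.01, size=x.size)

n = x.size
Sx2 = np.sum(x**2)   # sum of x_i^2
Sx4 = np.sum(x**4)   # sum of x_i^4
Sy = np.sum(y)       # sum of y_i
Sxy = np.sum(x**2 * y)  # sum of x_i^2 * y_i

# Normal equations in matrix form: M @ [a, b] = v
M = np.array([[n,   Sx2],
              [Sx2, Sx4]])
v = np.array([Sy, Sxy])

a, b = np.linalg.solve(M, v)
```

The key point is that the data only enter through the four sums, so the linear system stays 2x2 no matter how many data points there are.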

Now I'm stuck choosing between Cramer's Rule, LU decomposition, and Gaussian elimination (GE). Cramer's Rule is the easiest to implement, but I don't know how much slower it will be compared to LU/GE. I have around 300 data points.
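One observation that may help with the solver choice: the linear system here is 2x2 regardless of the number of data points, since the 300 points only enter through the accumulated sums (an O(n) pass that every method needs). For a 2x2 system, Cramer's Rule is just two divisions by the determinant, so the performance difference versus LU/GE is negligible. A sketch in plain Python (function name is mine, not from the problem):

```python
def fit_quadratic_cramer(x, y):
    """Fit y = b*x^2 + a by least squares, solving the 2x2
    normal equations with Cramer's Rule."""
    n = len(x)
    Sx2 = sum(xi**2 for xi in x)                     # sum x_i^2
    Sx4 = sum(xi**4 for xi in x)                     # sum x_i^4
    Sy = sum(y)                                      # sum y_i
    Sxy = sum(xi**2 * yi for xi, yi in zip(x, y))    # sum x_i^2 y_i

    # System:  [n    Sx2] [a]   [Sy ]
    #          [Sx2  Sx4] [b] = [Sxy]
    det = n * Sx4 - Sx2 * Sx2
    a = (Sy * Sx4 - Sx2 * Sxy) / det
    b = (n * Sxy - Sx2 * Sy) / det
    return a, b
```

For example, exact data from y = 2x^2 + 1 (x = 0, 1, 2, 3; y = 1, 3, 9, 19) should recover a = 1, b = 2.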
