
WarPhalange

assignment said: Linear least squares fitting. Choose some odd-ball function, say g(x). Create a set of "data" by choosing x_i and some sigma_i, generating y_i normally distributed about g(x_i). Choose a set of functions that might plausibly fit the data as y = Sum[a_j f_j(x)]. Perform a least squares fit by solving the normal equations in matrix form. You should determine the condition number and use SVD to determine whether there is any redundancy in your choice of f_j(x) and fix the fit. Finally, you should evaluate chi-squared to see whether the fit is adequate.
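(For context, the assignment's first steps can be sketched like this. This is just a rough illustration, assuming numpy; the particular g(x), noise level, and cubic basis are placeholder choices of mine, not part of the assignment.)

```python
import numpy as np

rng = np.random.default_rng(0)

# An "odd-ball" target function g(x) -- an arbitrary choice for illustration.
def g(x):
    return np.sin(3 * x) + 0.5 * x**2

# Generate "data": y_i normally distributed about g(x_i) with widths sigma_i.
x = np.linspace(-1, 1, 50)
sigma = 0.1 * np.ones_like(x)
y = g(x) + rng.normal(0.0, sigma)

# Basis functions f_j(x): here a cubic polynomial basis, one column per f_j.
A = np.vander(x, 4, increasing=True)   # columns 1, x, x^2, x^3

# Weighted normal equations: (A^T W A) a = A^T W y, with W = diag(1/sigma^2).
W = np.diag(1.0 / sigma**2)
a = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)

# Chi-squared of the fit, to judge adequacy (roughly N - n_params if the model is good).
chi2 = np.sum(((y - A @ a) / sigma) ** 2)
print(a, chi2)
```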

The part I am having trouble with is using SVD to determine redundancy. My f(x) is an n'th-order polynomial (I get to decide what n is). I successfully found *a* fit to my generated data, but I don't know where to go from there.

What I found online was to take the matrix A from Ax=b, do SVD on that, and go from there in order to solve the system of equations. I already have a solution though, so I don't know what to do.

One idea I had was to make a new matrix like so:

| a0    a1 x1    a2 x1^2    a3 x1^3 |
| a0    a1 x2    a2 x2^2    a3 x2^3 |
| a0    a1 x3    a2 x3^2    a3 x3^3 |

et cetera, with actual numbers instead of the a's and x's of course, take the SVD of *that*, trim the three matrices down to keep only the nonzero singular values (so drop any '0' elements in the diagonal matrix), transform back, and then divide by the various x's to get different values for the a's, but I'm not sure if that would do anything at all.
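(For comparison, the "trimming" step is usually applied to the design matrix A itself rather than to a matrix of a_j f_j(x_i) products: when forming the pseudoinverse, the reciprocals of any singular values below a tolerance are set to zero. A hedged numpy sketch; the duplicated basis column is an artificial example of redundancy.)

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)

# Redundant basis for illustration: the last column duplicates the second.
A = np.column_stack([np.ones_like(x), x, x**2, x])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Truncated-SVD solve: invert only singular values above a tolerance,
# which is how the 'zero elements of the diagonal matrix' get dropped safely.
tol = s[0] * 1e-10
s_inv = np.where(s > tol, 1.0 / s, 0.0)
a = Vt.T @ (s_inv * (U.T @ y))

resid = np.linalg.norm(y - A @ a)
print(a, resid)
```

Because the two duplicate columns are indistinguishable, the minimum-norm solution splits their combined coefficient evenly between them; the fit itself is unaffected.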

Thanks in advance for the help.