# Using SVD to determine the redundancy of a fit.

WarPhalange
I have a project I am doing for a professor and unfortunately I cannot get ahold of him to help me out, so I figured I'd ask you guys. Of course, I tried to ask The Google about this first and didn't get anywhere. Here is what I am trying to do:

assignment said:

> Linear least squares fitting. Choose some odd-ball function, say g(x). Create a set of "data" by choosing x_i and some σ_i, generating y_i normally distributed about g(x_i). Choose a set of functions that might plausibly fit the data as y = Σ_j a_j f_j(x). Perform a least squares fit by solving the normal equations in matrix form. You should determine the condition number and use SVD to determine whether there is any redundancy in your choice of f_j(x) and fix the fit. Finally, you should evaluate chi-squared to see whether the fit is adequate.
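For what it's worth, the data-generation step of the assignment might look something like this in NumPy. The particular g(x), the x_i grid, and the σ_i here are placeholders I picked for illustration, not anything from the assignment:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary "odd-ball" true function g(x) -- my choice for illustration.
def g(x):
    return np.sin(3 * x) * np.exp(-0.2 * x)

xi = np.linspace(0.0, 5.0, 50)           # sample points x_i
sigma = 0.1 * np.ones_like(xi)           # per-point noise levels sigma_i
yi = rng.normal(loc=g(xi), scale=sigma)  # y_i normally distributed about g(x_i)
```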

The part I am having trouble with is using SVD to determine redundancy. My fit function is an nth-order polynomial (I get to decide what n is). I successfully found a fit to my generated data, but I don't know where to go from there.

What I found online was to take the matrix A from Ax=b, do SVD on that, and use that to solve the system of equations. I already have a solution, though, so I don't know what to do.
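In case a concrete version helps: here is roughly what "take the SVD of A" looks like in practice, as a NumPy sketch. The cubic basis and the fake data are stand-ins I made up; the point is that A contains only the basis functions evaluated at the x_i, and its singular values give the condition number directly:

```python
import numpy as np

rng = np.random.default_rng(1)
xi = np.linspace(0.0, 5.0, 50)
yi = np.sin(3 * xi) + rng.normal(scale=0.1, size=xi.size)

# Design matrix for the basis f_j(x) = x^j, j = 0..3 (a Vandermonde matrix).
A = np.vander(xi, N=4, increasing=True)

# Normal equations: (A^T A) a = A^T y
a = np.linalg.solve(A.T @ A, A.T @ yi)

# SVD of A itself; the condition number is sigma_max / sigma_min.
s = np.linalg.svd(A, compute_uv=False)
cond = s[0] / s[-1]
```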

One idea I had was to make a new matrix like so:

| a0   a1*x1   a2*x1^2   a3*x1^3 |
| a0   a1*x2   a2*x2^2   a3*x2^3 |
| a0   a1*x3   a2*x3^2   a3*x3^3 |

et cetera, with actual numbers in place of the a's and x's, of course; take the SVD of that, trim the three matrices down to only the nonzero singular values (so drop any '0' elements in the diagonal matrix), transform back, and then divide out the various x's to get different values for the a's. But I'm not sure whether that would do anything at all.
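Something I tried while thinking about this, in case it's useful to compare against: a sketch where the SVD is taken of the plain design matrix (basis functions at the x_i, no a's in it), using a basis I deliberately made redundant. A (numerically) zero singular value is the redundancy signal, and truncating it gives a usable pseudoinverse. The tolerance and the example basis are my own choices:

```python
import numpy as np

xi = np.linspace(0.0, 5.0, 50)

# A deliberately redundant basis: the last column, 2x + 3x^2, is a linear
# combination of the x and x^2 columns, so A is rank-deficient (rank 3).
A = np.column_stack([np.ones_like(xi), xi, xi**2, 2 * xi + 3 * xi**2])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)  # one singular value is (numerically) zero -> redundancy

# Truncated-SVD pseudoinverse: keep only singular values above a tolerance.
tol = s[0] * 1e-10
keep = s > tol

yi = np.sin(3 * xi)  # stand-in data
a = (Vt[keep].T * (1.0 / s[keep])) @ (U[:, keep].T @ yi)
```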

Thanks in advance for the help.