Linear polynomial least squares

AI Thread Summary
The discussion centers on constructing the normal equations for a linear polynomial least-squares fit to given data points. The problem asks for the regression parameters via QR decomposition, the eigenvalues of the normal-equation matrix from its characteristic polynomial, a quadratic interpolation polynomial in Lagrange form for the three points, and the conditions under which the normal-equation matrix has a zero eigenvalue. Replies point the original poster toward definitions and worked examples as a starting point.
PhysicPhanatic
Construct the normal equations for the linear polynomial least-squares fit to the data x = [1 0 -1], y = [3; 2; -1].
(a) Find the parameters u1, u2 of the linear regression using QR decomposition, and plot the data and the fitted line in a graph (paper and pencil).
(b) Calculate the eigenvalues of the normal-equation matrix A'*A for the above data from the characteristic polynomial.
(c) Write down the quadratic interpolation polynomial in Lagrange form that interpolates the three data points.
(d) Assume that the normal-equation matrix A'*A is generated by a matrix A with m rows and n columns, m >= n. Explain under which conditions there would be a zero eigenvalue among the eigenvalues of the matrix of the normal equations.
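For anyone who wants to check their hand calculation, here is a minimal numerical sketch of parts (a) and (b) using numpy, assuming the linear model y ≈ u1*x + u2 (the assignment of u1 to the slope and u2 to the intercept is an assumption; the problem does not fix the ordering):

```python
import numpy as np

x = np.array([1.0, 0.0, -1.0])
y = np.array([3.0, 2.0, -1.0])

# Design matrix for the assumed model y ~ u1*x + u2 (columns: [x, 1])
A = np.column_stack([x, np.ones_like(x)])

# Part (a): least squares via QR decomposition, A = Q R, then solve R u = Q^T y
Q, R = np.linalg.qr(A)
u = np.linalg.solve(R, Q.T @ y)
print("u1, u2 =", u)          # slope and intercept of the fitted line

# Part (b): normal-equation matrix and its eigenvalues
M = A.T @ A
print("A'*A =\n", M)
print("eigenvalues =", np.linalg.eigvals(M))
```

The printed eigenvalues should match what comes out of the characteristic polynomial of the 2x2 matrix A'*A worked by hand.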
 
Anyone have any idea how to do this?
 
A more important question is whether you have any ideas.
 
The answer is NO. Anyone else?
 
Definitions are always a good place to start. There might even be an example in your book!
 
Is the normal equation A^T*A*u = A^T*b, with A^T being the transpose of A?
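That is indeed the standard normal-equation system for minimizing ||A*u - b||. As a worked sketch for this data set, again assuming the model y ≈ u1*x + u2 with columns [x, 1] in A:

```latex
A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \\ -1 & 1 \end{bmatrix},
\qquad
A^{T}A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix},
\qquad
A^{T}b = \begin{bmatrix} 4 \\ 4 \end{bmatrix},
\qquad
A^{T}A\,u = A^{T}b \;\Rightarrow\; u_1 = 2,\; u_2 = \tfrac{4}{3}.
```

The same u1, u2 should come out of the QR route in part (a), since QR solves the same least-squares problem in a better-conditioned way.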
 
OK, part (a) is done, if anyone cares.
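Part (c) has not come up in the thread, so here is a sketch of the Lagrange form for the three points (nodes x = 1, 0, -1); this is the standard construction, not something from the original discussion:

```latex
p_2(x) = 3\,\frac{(x-0)(x+1)}{(1-0)(1+1)}
       + 2\,\frac{(x-1)(x+1)}{(0-1)(0+1)}
       + (-1)\,\frac{(x-1)(x-0)}{(-1-1)(-1-0)}
```

If no sign has been dropped, this simplifies to p_2(x) = -x^2 + 2x + 2, which reproduces all three data points.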
 