Linear polynomial least squares

In summary, the thread constructs the normal equations for a linear polynomial least-squares fit to data, finds the regression parameters via QR decomposition and plots the fit, computes the eigenvalues of the normal-equation matrix from its characteristic polynomial, writes the quadratic interpolation polynomial in Lagrange form, and discusses the conditions under which the normal-equation matrix has a zero eigenvalue.
  • #1
PhysicPhanatic
Construct the normal equations for the linear polynomial least squares to fit the data x = [1 0 -1], y = [3; 2; -1]. (a) Find the parameters of the linear regression u1, u2 using QR decomposition, and plot the data and the fit curve in a graph (paper and pencil). (b) Calculate the eigenvalues of the normal-equation matrix A'*A for the above data from the characteristic polynomial. (c) Write down the quadratic interpolation polynomial in Lagrange form to interpolate the three data points. (d) Assume that the normal-equation matrix A'*A is generated by A with m rows and n columns, m >= n. Explain under which conditions there would be a zero eigenvalue among the eigenvalues of the matrix of the normal equations.
 
  • #2
anyone have any idea on how to do this?
 
  • #3
A more important question is if you have any ideas.
 
  • #4
the answer is NO, anyone else?
 
  • #5
Definitions are always a good place to start. There might even be an example in your book!
 
  • #6
Is the normal equation A^T * A * u = A^T * b, with A^T being the transpose of A?
 
  • #7
ok, part 'a' done, if anyone cares
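For anyone checking their pencil-and-paper work on parts (a) and (b), here is a minimal numerical sketch in NumPy. It assumes the model y ≈ u1 + u2·x, i.e. a design matrix with columns [1, x]; the basis ordering is an assumption, the original post does not fix it.

```python
import numpy as np

x = np.array([1.0, 0.0, -1.0])
y = np.array([3.0, 2.0, -1.0])

# Design matrix for the linear model y ≈ u1 + u2*x (columns: constant, x)
A = np.column_stack([np.ones_like(x), x])

# (a) Least squares via QR: A = QR, then solve the triangular system R u = Q^T y
Q, R = np.linalg.qr(A)
u = np.linalg.solve(R, Q.T @ y)
print(u)  # u ≈ [4/3, 2], i.e. the fit line y = 4/3 + 2x

# (b) Eigenvalues of the normal-equation matrix A^T A
M = A.T @ A                      # here [[3, 0], [0, 2]], already diagonal
print(np.linalg.eigvals(M))      # eigenvalues 3 and 2
```

Because A'A is diagonal for this data (the x values sum to zero), the characteristic polynomial factors immediately and the eigenvalues can be read off the diagonal.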
 

1. What is a linear polynomial least squares?

Linear polynomial least squares is a statistical method for finding the best-fitting line for a set of data points. It minimizes the sum of the squared vertical differences between the data points and the line, which allows values to be predicted within the range of the data.

2. How is a linear polynomial least squares calculated?

The line is modeled as y = mx + b, where m is the slope and b is the y-intercept. The calculation finds the values of m and b that minimize the sum of the squared errors; setting the partial derivatives of that sum with respect to m and b to zero yields the normal equations, which determine both parameters.
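The closed-form solution obtained from the normal equations can be sketched as follows (a minimal illustration in plain Python; the function name `fit_line` is just for this example):

```python
# Closed-form least-squares slope and intercept, derived by setting the
# derivatives of the squared-error sum to zero.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    # Intercept: the fit line passes through the mean point
    b = mean_y - m * mean_x
    return m, b

m, b = fit_line([1, 0, -1], [3, 2, -1])
print(m, b)  # slope 2, intercept 4/3 for the data in this thread
```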

3. What is the purpose of using linear polynomial least squares?

The purpose of using linear polynomial least squares is to find the best fitting line for a set of data points. This can be useful for making predictions or analyzing trends in the data.

4. What are the assumptions made in linear polynomial least squares?

The main assumptions are that the relationship between the variables is linear and that the errors are independent and, for statistical inference about the fit, normally distributed.

5. What are the limitations of linear polynomial least squares?

Linear polynomial least squares can only be used to find the best fitting line for linear relationships between variables. It also assumes that the errors are normally distributed, which may not always be the case in real-world data. Additionally, the method may not be suitable for datasets with outliers or extreme values.
