Least squares problem matrix

In summary, if a matrix A has linearly dependent columns and a vector b is given, then the least-squares solution is not unique: the dependent columns leave too few independent constraints to pin down a single solution. The projection of b onto the column space of A is unique, but the least-squares solution itself is not. Moreover, the inverse of (X'X) exists only when X has linearly independent columns.
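A minimal numerical sketch of that claim (the matrix A and vector b below are invented for illustration): adding any multiple of a null-space vector of A to a least-squares solution changes the solution but not the projection A x, so the residual is unchanged and many solutions are equally good.

```python
import numpy as np

# Invented example: column 2 is 2 * column 1, so the columns of A
# are linearly dependent and A has a non-trivial null space.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
b = np.array([1.0, 2.0, 2.0])

# Minimum-norm least-squares solution via the Moore-Penrose pseudoinverse.
x_min = np.linalg.pinv(A) @ b

# Any null-space vector n (A @ n = 0) gives another equally good solution.
n = np.array([2.0, -1.0])
x_other = x_min + 5.0 * n

# Same projection onto the column space, hence the same residual:
# both are least-squares solutions, so the solution is not unique.
print(np.allclose(A @ x_min, A @ x_other))        # True
print(np.linalg.norm(b - A @ x_min),
      np.linalg.norm(b - A @ x_other))            # equal residuals
```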
  • #1
td21
Gold Member

Homework Statement

If a matrix A has linearly dependent columns and b is a vector, then the least-squares solution is not unique.

Homework Equations

The Attempt at a Solution

I know that the projection of b onto the column space of A is unique, but why isn't the least-squares solution unique?
 
  • #2
help?
 
  • #3
If you have linearly dependent columns, you have too few independent equations to get a unique solution. It's like trying to describe a single point with a whole line of candidates.
Or am I wrong? It's been two years since I took the course.
 
  • #4
http://en.wikipedia.org/wiki/Linear_least_squares_(mathematics)

B = (X'X)^-1 X'y

For (X'X)^-1 to exist, doesn't X have to have linearly independent columns?

Again I'm only guessing
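A quick way to check that guess (the data below are assumed for illustration, using numpy): when X has linearly dependent columns, X'X is singular, so the inverse in the formula above does not exist, and something like the pseudoinverse has to pick one solution out of the infinite family.

```python
import numpy as np

# Assumed example data: the second column of X is 2 * the first.
X = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
y = np.array([1.0, 2.0, 2.0])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))     # 1, not 2: X'X is singular

try:
    B = np.linalg.inv(XtX) @ X.T @ y  # the textbook formula fails here
except np.linalg.LinAlgError:
    print("X'X is not invertible")

# The pseudoinverse still returns one (minimum-norm) solution.
print(np.linalg.pinv(X) @ y)
```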
 

1. What is the least squares problem matrix?

The least squares problem matrix is a mathematical tool used in statistics and data analysis to find the best-fitting line or curve for a set of data points. It works by minimizing the sum of squared distances between the data points and the fitted line or curve.
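As a small illustration of this answer (the data points below are made up), here is a straight-line fit with numpy's least-squares solver:

```python
import numpy as np

# Made-up data points.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])

# Design matrix: a column of x values and a column of ones for the
# intercept; its two columns are linearly independent here.
A = np.column_stack([x, np.ones_like(x)])

# lstsq minimizes the sum of squared distances ||A @ [m, c] - y||^2.
(m, c), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"best-fit line: y = {m:.3f} x + {c:.3f}")
```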

2. How is the least squares problem matrix used in regression analysis?

In regression analysis, the least squares problem matrix is used to find the coefficients of the regression equation, which represents the relationship between the independent and dependent variables in a dataset. The coefficients are chosen to minimize the squared error between the predicted and actual values, making the fitted model as accurate as possible in the least-squares sense.

3. What are the assumptions of the least squares problem matrix?

The main assumptions are linearity, constant variance (homoscedasticity), independence of errors, and normality of errors. These assumptions ensure that the fitted line or curve is a good representation of the relationship between the variables and that the errors are randomly distributed.

4. Can the least squares problem matrix be used for non-linear data?

Only if the model remains linear in its coefficients. A polynomial such as y = ax^2 + bx + c is nonlinear in x but linear in a, b, and c, so it can still be fit with ordinary linear least squares. Models that are nonlinear in their parameters require a different method, such as nonlinear least squares.
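A brief sketch of that distinction (invented data): the quadratic model below is nonlinear in x but linear in its coefficients, so ordinary linear least squares applies via a design matrix with columns x^2, x, and 1.

```python
import numpy as np

# Invented data roughly following a parabola.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.4, 1.1, 3.9, 9.2])

# y = a*x^2 + b*x + c is linear in (a, b, c), so linear least
# squares still works with the right design matrix.
A = np.column_stack([x**2, x, np.ones_like(x)])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("a, b, c =", coeffs)
```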

5. How do you calculate the least squares problem matrix?

The coefficients are found by taking the partial derivatives of the sum of squared errors with respect to each coefficient of the regression equation and setting them to zero. This yields the normal equations (X'X)B = X'y, whose solution gives the coefficient values that minimize the sum of squared errors.
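A minimal sketch of that procedure (assumed example data): solving the normal equations directly, without forming the explicit inverse from post #4.

```python
import numpy as np

# Assumed example data: an intercept column plus one predictor.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
y = np.array([1.0, 2.0, 2.5])

# Normal equations (X'X) B = X'y, obtained by setting the partial
# derivatives of the squared error to zero. np.linalg.solve assumes
# X'X is invertible, i.e. X has linearly independent columns.
B = np.linalg.solve(X.T @ X, X.T @ y)
print(B)    # [intercept, slope]
```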
