Least Squares Solutions for Inconsistent Linear Systems

In summary: the thread asks for the least squares solutions of Ax = b, where A is an 8×5 matrix of rank 3 and b is a nonzero vector in N(A^T). The system is shown to be inconsistent, and its least squares solutions (the solutions of A^T A x = A^T b) form a 2-D subspace, so there are infinitely many of them.
  • #1
StandardBasis

Homework Statement


Hi there! First time user, so I hope I do this right. The question is: Let A be an 8×5 matrix of rank 3, and let b be a nonzero vector in N(A^T). First, prove that the system Ax = b must be inconsistent. Then, how many least squares solutions will the system have?


Homework Equations





The Attempt at a Solution


I got the first part fine (I think...). I assume that Ax = b is consistent and look for a contradiction. There is one: for Ax = b to be consistent, b must be in the column space of A. But we are told b is in the nullspace of A^T, which is the orthogonal complement of R(A). So b isn't actually in R(A), and thus Ax = b is inconsistent, since b isn't a linear combination of A's columns.

Now for the second part, I know I need to find all solutions to A^T A x = A^T b. In class we proved that if A is m×n with rank n, then there is a unique solution to the normal equations. Obviously the rank is not n here, so I'm not sure what to do. I'm used to systems having either 0, 1, or infinitely many solutions... so I'm tempted to say infinitely many? But I can't justify that.
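
For a concrete sanity check of the first part, here is a minimal NumPy sketch; the particular construction of A and b is an illustrative assumption, not part of the problem. It builds an 8×5 matrix of rank 3, picks a nonzero b in N(A^T), and confirms that the best least squares fit still leaves a residual of ||b||, i.e. Ax = b has no exact solution.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative 8x5 matrix of rank 3 (any such A would do): product of 8x3 and 3x5 factors.
    A = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 5))

    # Orthonormal basis for R(A): the first 3 left singular vectors of A.
    U = np.linalg.svd(A)[0][:, :3]

    # A nonzero b in N(A^T) = R(A)-perp: strip the R(A)-component from a random vector.
    v = rng.standard_normal(8)
    b = v - U @ (U.T @ v)
    print(np.allclose(A.T @ b, 0))                         # True: b lies in N(A^T)

    # The closest Ax can get to b is the projection of b onto R(A), which is 0 here,
    # so the minimal residual equals ||b|| and Ax = b is inconsistent.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    print(np.linalg.norm(A @ x - b), np.linalg.norm(b))    # numerically equal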
 
  • #2
A^T A will be 5×5, rank 3, yes? And you want all solutions of A^T A x = 0. Won't that be a 2-D subspace?
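
The dimension count can also be checked numerically; the A below is an arbitrary rank-3 example, chosen only for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 5))   # 8x5, rank 3

    G = A.T @ A                        # the 5x5 matrix A^T A
    r = np.linalg.matrix_rank(G)
    print(G.shape, r)                  # (5, 5) 3
    print(G.shape[1] - r)              # 2: dim N(A^T A) by rank-nullity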
 
  • #3
But I don't want to find the solutions to A^T A x = 0; I want to find solutions to A^T A x = A^T b.

Second, unless I misunderstand what it means to find a least squares solution, I thought I was looking for the vector(s) that produce a projection of b (which isn't in the range, aka the column space, of A) onto the range of A. If two vectors did this, wouldn't linear combos of them also do it?
 
  • #4
StandardBasis said:
But I don't want to find the solutions to A^T A x = 0; I want to find solutions to A^T A x = A^T b.
You wrote that b is in the nullspace of A^T, so isn't A^T b zero?
Second, unless I misunderstand what it means to find a least squares solution, I thought I was looking for the vector(s) that produce a projection of b (which isn't in the range, aka the column space, of A) onto the range of A. If two vectors did this, wouldn't linear combos of them also do it?
I don't understand how a vector can 'produce' a projection of another vector. But we agree you're looking for a subspace.
 
  • #5
Ah, yes, my mistake for forgetting that.

Probably not the best choice of words on my part, to say 'produce'. I meant it in the sense that if z is the particular least squares solution we're talking about, then p = Az, where p is the 'closest' vector to b. So I chose 'produce' because it isn't z itself but the product of A and z that gives the vector we are searching for.

So I now understand we're looking for a 2-D subspace. But why would that mean there are 2 least squares solutions, and not infinitely many? After all, there are infinitely many solutions to the normal equations (the entire 2-D subspace in question!)
 
  • #6
StandardBasis said:
So I now understand we're looking for a 2-D subspace. But why would that mean there are 2 least squares solutions, and not infinitely many?
It would indeed mean infinitely many. Who's saying there are two?
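
A sketch of why infinitely many x can be equally good (the rank-deficient A and generic b below are made up purely for the example): adding any vector from N(A) to a least squares solution changes x but leaves Ax, and hence the residual, untouched.

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 5))    # 8x5, rank 3
    b = rng.standard_normal(8)                                       # generic b, not in R(A)

    x0 = np.linalg.lstsq(A, b, rcond=None)[0]    # one least squares solution (minimum norm)
    n = np.linalg.svd(A)[2][-1]                  # a vector in N(A): last right singular vector
    x1 = x0 + 7.0 * n                            # a different least squares solution

    print(np.allclose(A @ x0, A @ x1))                               # True: same projection Ax
    print(np.linalg.norm(A @ x0 - b), np.linalg.norm(A @ x1 - b))    # identical residuals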
 
  • #7
Apologies, I thought your reference to a 2-D subspace meant that the two basis vectors were the solutions.

Is there a way to visualize how infinitely many vectors can all lead to the same "closest vector" to b?
 
  • #8
StandardBasis said:
Is there a way to visualize how infinitely many vectors can all lead to the same "closest vector" to b?
Yes, I was wondering about that. Clearly it is what will happen if A^T A is singular, so we need to construct an example like that. Suppose some column of A is all zeros. That means we have no information about that dimension, so we can vary the solution in that dimension without changing the sum of squares.
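
That zero-column picture, made concrete in a tiny sketch (the particular numbers are arbitrary): the second coordinate of x multiplies a zero column, so sliding it changes neither Ax nor the sum of squares.

    import numpy as np

    # The second column of A is all zeros, so the data say nothing about x[1].
    A = np.array([[1.0, 0.0],
                  [0.0, 0.0],
                  [2.0, 0.0]])
    b = np.array([1.0, 1.0, 1.0])

    # x[0] = 0.6 minimizes the sum of squares; x[1] can be anything at all.
    for t in (0.0, 3.0, -100.0):
        x = np.array([0.6, t])
        print(t, np.linalg.norm(A @ x - b))      # the same residual for every t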
 
  • #9
Wait, one more question: how did I know that the product of A transpose and A has rank 3?
 
  • #10
StandardBasis said:
Wait, one more question: how did I know that the product of A transpose and A has rank 3?
I was rather hoping that was a standard result you could appeal to. It's 30 years since I did any of this stuff.
 
  • #11
Well, I guess I know that since A has rank 3, the transpose of A also has rank 3. The rank of a product is less than or equal to the minimum of the ranks of the two factors. So the rank of A^T A is at most 3; in any case, the product is not a matrix of full rank.

Is that enough? Perhaps someone with more immediate experience can provide a hint?
 
  • #12
StandardBasis said:
Well, I guess I know that since A has rank 3, the transpose of A also has rank 3. The rank of a product is less than or equal to the minimum of the ranks of the two factors. So the rank of A^T A is at most 3; in any case, the product is not a matrix of full rank.
Yes, that suffices for this question. More generally, I would think A^T A would have the same rank as A, but I don't see how to prove it easily.
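
For reference, one standard argument that A^T A and A always have the same rank:

    \[
    A^{T}Ax = 0 \;\Longrightarrow\; x^{T}A^{T}Ax = \lVert Ax \rVert^{2} = 0 \;\Longrightarrow\; Ax = 0,
    \]

so N(A^T A) is contained in N(A); the reverse inclusion is immediate, hence N(A^T A) = N(A). Since A and A^T A have the same number of columns, rank-nullity then gives rank(A^T A) = rank(A).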
 

1. What is the purpose of Least Squares Linear Algebra?

The purpose of least squares is to find the line or curve that best fits a set of data points, in the sense of minimizing the sum of squared residuals. It is commonly used in statistics and data analysis to estimate the relationship between variables and to make predictions.

2. How does Least Squares Linear Algebra differ from other regression methods?

Least squares is distinguished by the loss it minimizes: the sum of the squared distances between the data points and the fitted line or curve, rather than, say, the sum of absolute deviations. The squared loss leads to the normal equations, which can be solved in closed form, and the method can also fit curves as long as the model is linear in its parameters.
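
As a minimal illustration of such a fit (the data points below are made up purely for the example), a straight line can be fitted by least squares in NumPy like this:

    import numpy as np

    # Toy data: y roughly follows a line in x.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

    # Design matrix with an intercept column; solve min ||X c - y||^2.
    X = np.column_stack([x, np.ones_like(x)])
    slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]

    # Correlation coefficient as a measure of the strength/direction of the fit.
    r = np.corrcoef(x, y)[0, 1]
    print(slope, intercept, r)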

3. What is the difference between simple and multiple linear regression?

Simple linear regression involves finding the relationship between one predictor and one response variable, while multiple linear regression involves several predictors. Least squares handles both; multiple regression simply leads to a larger system of normal equations, with one equation per coefficient.

4. How do you interpret the results of Least Squares Linear Algebra?

The results of a least squares fit include the slope, the intercept, and usually a measure of fit such as the correlation coefficient. The slope is the estimated rate of change of the response with respect to the predictor, the intercept is the predicted value when the predictor is zero, and the correlation coefficient measures the strength and direction of the linear relationship between the variables.

5. What are some common applications of Least Squares Linear Algebra?

Least Squares Linear Algebra has various applications in fields such as economics, engineering, and social sciences. It can be used to analyze trends and make predictions in stock market data, weather patterns, and population growth, among others. It is also commonly used in machine learning and artificial intelligence algorithms.
