Using Least Squares to Find an Orthogonal Projection

In summary, the student sets up a matrix V whose columns are v1, v2, v3 and solves the normal equations V^T V x = V^T u (or the equivalent QR system) for the least squares coefficients x1, x2, x3. The four-component answer the problem asks for is the projection vector itself, P = x1 v1 + x2 v2 + x3 v3, a linear combination of the four-dimensional spanning vectors.
  • #1
ver_mathstats
Homework Statement
Use least squares to find the orthogonal projection of u onto the subspace of R^4 spanned by the vectors v1, v2, and v3.
Relevant Equations
u = (0,5,4,0) v1 = (6,0,0,1) v2 = (0,1,-1,0) v3 = (1,1,0,-6)
I'm a little confused about how to do this homework problem; I can't seem to obtain the correct answer. I took my vectors v1, v2, and v3 and set up a matrix:

V = [ (6,0,0,1)^T, (0,1,-1,0)^T, (1,1,0,-6)^T ] and then I had
u = (0,5,4,0)^T.

I then went to solve using least squares, so I set up the normal equations V^T V x = V^T u.

I took that and obtained the augmented matrix:

[ 37   0    0 |  0 ]
[  0   2    1 |  1 ]
[  0   1   38 |  5 ]

I solved it and everything worked fine, I got the answers: x1 = 0, x2 = 11/25, x3 = 3/25.

I noticed too that I could use a QR factorization of V: the least squares solution x' satisfies R x' = Q^T u, and we can then obtain our answer from that. But again this x' only has three components.

However, the given solution has four components. I am unsure where I am going wrong with this problem; any help would be appreciated.
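For reference, here is a minimal numerical sketch of the computation above (numpy assumed; variable names are illustrative), checking both the normal-equation route and the QR route:

```python
import numpy as np

# Columns of V are v1, v2, v3; u is the vector to project.
V = np.column_stack([(6, 0, 0, 1), (0, 1, -1, 0), (1, 1, 0, -6)]).astype(float)
u = np.array([0, 5, 4, 0], dtype=float)

# Normal equations: (V^T V) x = V^T u
x = np.linalg.solve(V.T @ V, V.T @ u)
print(x)                      # [0.   0.44 0.12]  i.e. x1 = 0, x2 = 11/25, x3 = 3/25

# QR route: V = QR, then solve R x = Q^T u
Q, R = np.linalg.qr(V)
x_qr = np.linalg.solve(R, Q.T @ u)
print(np.allclose(x, x_qr))   # True
```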
 
  • #2
ver_mathstats said:
... but again this x' only has three components. However, the given solution has four components.
You have to find the orthogonal projection of u onto the subspace of R^4 spanned by the vectors v1, v2, and v3. The x1, x2, x3 you got are the components of this projection vector P with respect to the basis v1, v2, v3; that is,
P = x1 v1 + x2 v2 + x3 v3,
a linear combination of three four-dimensional vectors. That is why the answer has four components: it is P itself.
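As a small sketch of this step (numpy assumed; names illustrative), assembling P from the coefficients found above and checking that the residual u - P is orthogonal to each spanning vector:

```python
import numpy as np

v1 = np.array([6, 0, 0, 1], dtype=float)
v2 = np.array([0, 1, -1, 0], dtype=float)
v3 = np.array([1, 1, 0, -6], dtype=float)
u  = np.array([0, 5, 4, 0], dtype=float)

x1, x2, x3 = 0, 11/25, 3/25          # coefficients from the normal equations
P = x1 * v1 + x2 * v2 + x3 * v3      # the projection: a vector in R^4
print(P)                             # [ 0.12  0.56 -0.44 -0.72]

# Sanity check: u - P should be orthogonal to each of v1, v2, v3.
r = u - P
print(np.allclose([np.dot(r, v) for v in (v1, v2, v3)], 0))   # True
```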
 

What is the Least Squares method?

The Least Squares method is a mathematical technique used to find the best fit line or curve for a set of data points. It minimizes the sum of the squared differences between the actual data points and the predicted values.
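For concreteness, a small sketch (made-up data points, numpy assumed) of fitting a line y ≈ ax + b by least squares:

```python
import numpy as np

# Made-up data points, purely for illustration.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.1, 2.9, 5.2, 6.8])

# Design matrix with columns [x, 1]; lstsq minimizes ||A @ coef - ys||^2.
A = np.column_stack([xs, np.ones_like(xs)])
coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
a, b = coef
print(a, b)   # slope and intercept of the best-fit line
```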

How is the Least Squares method used to find an Orthogonal Projection?

Using the Least Squares method, we can find the coefficients that best express a vector as a linear combination of a set of spanning vectors. These coefficients can then be used to form the projection of the vector onto the subspace spanned by those vectors, giving us the orthogonal projection.
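A rough sketch of that recipe (numpy assumed; the helper name project_onto_columns is illustrative), reusing the thread's vectors as a quick check:

```python
import numpy as np

def project_onto_columns(V, u):
    """Orthogonal projection of u onto the column space of V (columns assumed independent)."""
    x = np.linalg.solve(V.T @ V, V.T @ u)   # least squares coefficients
    return V @ x                            # the projection, same dimension as u

# Quick check with the thread's vectors.
V = np.column_stack([(6, 0, 0, 1), (0, 1, -1, 0), (1, 1, 0, -6)]).astype(float)
u = np.array([0, 5, 4, 0], dtype=float)
print(project_onto_columns(V, u))   # [ 0.12  0.56 -0.44 -0.72]
```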

Why is Orthogonal Projection important?

Orthogonal Projection is important because it allows us to find the closest approximation of a vector in a subspace. This is useful in many applications such as data analysis, signal processing, and machine learning.

What assumptions are made in using the Least Squares method for Orthogonal Projection?

The main assumptions made in using the Least Squares method for Orthogonal Projection are that the data points are independent, the errors are normally distributed, and the errors have equal variance. Violation of these assumptions can lead to inaccurate results.

Are there any limitations to using the Least Squares method for Orthogonal Projection?

Yes, there are some limitations to using the Least Squares method for Orthogonal Projection. It assumes that the data points lie in a linear subspace and that the errors are normally distributed. It may not work well for non-linear data or data with outliers. Additionally, it can be computationally expensive for large datasets.
