Decomposing Vectors Using Row Reduction: A Practical Approach

  • Context: Undergrad
  • Thread starter: dman12
  • Tags: Decomposition, Vector

Discussion Overview

The discussion revolves around the decomposition of a vector into a best fit linear superposition of given vectors, specifically using techniques such as least squares and row reduction. The context includes theoretical and practical applications in linear algebra.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant seeks a method to decompose a vector into a linear combination of other vectors to achieve the closest fit.
  • Another participant suggests using the least squares solution, explaining the formulation of the problem in terms of matrix products and minimization of the norm.
  • A different participant expresses a concern about the dimensionality, questioning whether the three vectors can adequately represent the target vector and whether projections can be applied to the remaining dimensions.
  • Another reply indicates that the problem can be approached by solving the linear system in Reduced Row Echelon Form (RREF), emphasizing the importance of understanding row reduction in linear algebra.

Areas of Agreement / Disagreement

Participants present multiple viewpoints on the approach to vector decomposition, with no consensus reached on the best method or the implications of dimensionality in the context of the problem.

Contextual Notes

There are unresolved assumptions regarding the dimensionality of the vector space and the applicability of projections, as well as the specific conditions under which the least squares method is effective.

dman12
Hello,

I am trying to figure out the best way to decompose a vector into a best-fit linear superposition of other, given vectors.

For instance, is there a way of finding the linear combination of

(3,5,7,0,1)
(0,0,4,5,7)
(8,9,2,0,4)

that most closely gives you (1,2,3,4,5)?

My actual problem involves more vectors of higher dimension, so if there is a general statistical way of doing a decomposition like this, that would be great.

Thanks!
 
You can use the least squares solution. First, realize that you can express a linear combination of ##n## ##m\times 1## column vectors as a matrix product between a matrix formed by placing those ##n## columns next to each other and an ##n \times 1## column vector consisting of the coefficients of each vector in the sum. Denoting the first matrix by ##A## and the coefficient column by ##x##, you are to find ##x## such that ##\|Ax-b\|## is minimized, where ##b## is the ##m \times 1## column vector you want to fit.
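A minimal sketch of this approach, using the thread's example vectors. In practice you would call a library routine such as `numpy.linalg.lstsq`; the pure-Python version below just solves the normal equations ##(A^TA)x = A^Tb## by Gaussian elimination so the mechanics are visible.

```python
# Least-squares fit of b = (1,2,3,4,5) as a combination of the three
# given vectors, via the normal equations (A^T A) x = A^T b.
# Illustrative sketch only; numpy.linalg.lstsq does this directly.

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(M, N):
    return [[sum(a * c for a, c in zip(row, col)) for col in zip(*N)]
            for row in M]

def solve(M, v):
    """Solve the square system M x = v by Gaussian elimination
    with partial pivoting."""
    n = len(M)
    aug = [row[:] + [v[i]] for i, row in enumerate(M)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(aug[r][i]))
        aug[i], aug[p] = aug[p], aug[i]
        for r in range(i + 1, n):
            f = aug[r][i] / aug[i][i]
            for c in range(i, n + 1):
                aug[r][c] -= f * aug[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i][n] - sum(aug[i][c] * x[c]
                                for c in range(i + 1, n))) / aug[i][i]
    return x

# The given vectors become the columns of A.
cols = [[3, 5, 7, 0, 1], [0, 0, 4, 5, 7], [8, 9, 2, 0, 4]]
A = transpose(cols)            # 5x3 matrix
b = [1, 2, 3, 4, 5]

AtA = matmul(transpose(A), A)                                  # 3x3
Atb = [sum(a * bi for a, bi in zip(col, b)) for col in cols]   # A^T b
x = solve(AtA, Atb)
print(x)  # coefficients of the best-fit combination
```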
 
My hunch was that the three vectors span a 3D subspace in which you can express the part of (1,2,3,4,5) that lies in that subspace exactly (by projection). For the two other dimensions there's nothing you can do. Am I deceiving myself?
 
Hey dman12.

This is equivalent to solving a linear system by reducing it to reduced row echelon form (RREF).

Understanding this process of row reduction and why it works will help you understand a lot of linear algebra in a practical capacity.
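A minimal sketch of that route for this thread's example: since ##Ax = b## itself has no exact solution here, one can instead row-reduce the augmented normal equations ##[A^TA \mid A^Tb]##, which always have a solution, and read off the least-squares coefficients. The `rref` helper below is illustrative, not a library function; the augmented matrix entries were computed by hand from the three vectors and ##b = (1,2,3,4,5)##.

```python
# Row-reduce [A^T A | A^T b] to RREF and read off the exact
# least-squares coefficients. Fractions keep the arithmetic exact.
from fractions import Fraction

def rref(M):
    """Return the reduced row echelon form of matrix M
    (a list of rows of Fractions)."""
    M = [row[:] for row in M]
    lead = 0
    for r in range(len(M)):
        if lead >= len(M[0]):
            break
        i = r
        while M[i][lead] == 0:
            i += 1
            if i == len(M):
                i, lead = r, lead + 1
                if lead == len(M[0]):
                    return M
        M[i], M[r] = M[r], M[i]
        pivot = M[r][lead]
        M[r] = [v / pivot for v in M[r]]
        for j in range(len(M)):
            if j != r:
                f = M[j][lead]
                M[j] = [a - f * c for a, c in zip(M[j], M[r])]
        lead += 1
    return M

# Augmented normal equations [A^T A | A^T b] for the thread's vectors.
aug = [[Fraction(v) for v in row] for row in
       [[84, 35, 87, 39],
        [35, 90, 36, 67],
        [87, 36, 165, 52]]]
R = rref(aug)
x = [row[-1] for row in R]  # exact least-squares coefficients
```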
 
