Dimension and solution to matrix-vector product

  • Context: Undergrad
  • Thread starter: DumpmeAdrenaline
  • Tags: equations, linear algebra
SUMMARY

The discussion centers on the matrix-vector product represented by the equation $$ X\beta=y $$, where $$ X \in \mathbb{R}^{m \times n} $$ and rank(X) < m. Participants explore the implications of having fewer independent equations than variables, leading to infinitely many solutions. The geometric interpretation of independent equations as hyperplanes in n-dimensional space is emphasized, with the conclusion that the solution space is determined by the dimensionality of the row space and the relationships between the row vectors.

PREREQUISITES
  • Understanding of linear algebra concepts, particularly matrix rank and vector spaces.
  • Familiarity with the geometric interpretation of linear equations and hyperplanes.
  • Knowledge of matrix notation and operations, specifically dot products.
  • Experience with solving systems of linear equations and understanding solution spaces.
NEXT STEPS
  • Study the concept of matrix rank and its implications on solution uniqueness in linear systems.
  • Learn about the geometric interpretation of linear transformations and their effect on vector spaces.
  • Explore the relationship between row space and column space in matrices, particularly in relation to infinite solutions.
  • Investigate the properties of hyperplanes in n-dimensional space and their intersections.
USEFUL FOR

Mathematicians, data scientists, and engineers involved in linear algebra, particularly those working with systems of equations and matrix computations.

DumpmeAdrenaline
Let $$ X \in \mathbb{R}^{m \times n} $$ where m = n and rank(X) < m; then at least one equation can be written as a linear combination of the other equations. Let $$ \beta \in \mathbb{R}^{n} $$.
$$ X\beta=y $$
Suppose we have x < m independent equations (the equations are consistent), formed by taking the dot product of x row vectors of X with the column vector β. Each independent equation represents a geometrical object of dimension n-1 (n-1 degrees of freedom). We have x geometrical objects in n dimensions, and we are trying to find the intersection of all these geometrical objects that satisfies the RHS represented by y. The dimension of the row space is x, which corresponds to the number of independent equations. Can we say that we are reducing the problem to finding a vector β that is perpendicular to all x row vectors, which all lie on some geometric object of dimension n-1? If this is the case, why are there infinitely many solutions?

I understand why we have infinitely many solutions if we think of X in terms of its column vectors. If y is in the column space, which we can check by comparing the rank of X with the rank of X augmented with y, then there are infinitely many choices of scalars in the linear combinations of columns that yield y, since the columns are dependent.
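This column-space check can be sketched numerically. A minimal NumPy illustration with made-up numbers (not from the thread): a 3×3 matrix of rank 2 with y chosen inside the column space, so equal ranks confirm consistency, and two distinct solutions exhibit the infinite solution set.

```python
# Hypothetical example: rank(X) = 2 < 3, y chosen in the column space,
# so the system is consistent but has infinitely many solutions.
import numpy as np

X = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])    # row 3 = row 1 + row 2, so rank is 2
y = X @ np.array([1.0, 1.0, 1.0])  # guarantees y is in the column space

rank_X = np.linalg.matrix_rank(X)
rank_aug = np.linalg.matrix_rank(np.column_stack([X, y]))
consistent = (rank_X == rank_aug)  # equal ranks <=> y is in the column space

# Two distinct solutions: a particular solution plus a null-space
# direction obtained from the SVD.
beta_p = np.linalg.lstsq(X, y, rcond=None)[0]
_, _, Vt = np.linalg.svd(X)
null_dir = Vt[-1]                  # right singular vector for sigma ~ 0
beta_q = beta_p + 2.5 * null_dir   # any multiple of null_dir works

print(consistent, np.allclose(X @ beta_p, y), np.allclose(X @ beta_q, y))
```

Both beta_p and beta_q solve the system even though they differ, which is exactly the infinitude being discussed.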
 
DumpmeAdrenaline said:
Let $$ X \in R^{m*n} $$
Your notation is unusual, so difficult to follow.
Many textbooks would write something like this:
Let ##A \in M^{n \times n}## where the coefficients of A are real.
DumpmeAdrenaline said:
where m = n and rank(X) < m; then at least one equation can be written as a linear combination of the other equations. Let $$ \beta \in \mathbb{R}^{n} $$.
$$ X\beta=y $$
Or more clearly,
Let ##\vec x \in \mathbb R^n## and let ##\vec y = A\vec x##.
DumpmeAdrenaline said:
Suppose we have x < m independent equations (the equations are consistent), formed by taking the dot product of x row vectors of X with the column vector β. Each independent equation represents a geometrical object of dimension n-1 (n-1 degrees of freedom). We have x geometrical objects in n dimensions, and we are trying to find the intersection of all these geometrical objects that satisfies the RHS represented by y. The dimension of the row space is x, which corresponds to the number of independent equations. Can we say that we are reducing the problem to finding a vector β that is perpendicular to all x row vectors, which all lie on some geometric object of dimension n-1? If this is the case, why are there infinitely many solutions?
I don't see how this reduces the problem. The matrix equation (in my notation) ##\vec y = A\vec x## represents n - 1 equations in n unknowns. Each equation represents a hyperplane in n-dimensional space. If the n - 1 equations are linearly independent, then the solution space is a line in n-dimensional space; thus there are infinitely many solutions.
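The "independent hyperplanes intersecting in a line" picture can be checked concretely. A sketch with my own numbers (not from the thread): two independent equations in ##\mathbb R^3##, i.e. two planes whose intersection is a line, so every point beta0 + t*d solves the system.

```python
# Hypothetical example: 2 independent equations in 3 unknowns.
# Two planes in R^3 intersect in a line of solutions.
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])    # 2 independent rows, n = 3 unknowns
y = np.array([3.0, 2.0])

beta0 = np.linalg.lstsq(A, y, rcond=None)[0]  # one particular solution
_, _, Vt = np.linalg.svd(A)
d = Vt[-1]                          # line direction: A @ d = 0

# Every point beta0 + t*d on the line solves the system, for any t.
for t in (-1.0, 0.0, 7.0):
    assert np.allclose(A @ (beta0 + t * d), y)
print("solution set is the line beta0 + t*d")
```

The one-parameter family beta0 + t*d is where the "infinitely many solutions" come from: one free parameter per dimension the rank falls short of n.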
DumpmeAdrenaline said:
I understand why we have infinitely many solutions if we think of X in terms of its column vectors. If y is in the column space, which we can check by comparing the rank of X with the rank of X augmented with y, then there are infinitely many choices of scalars in the linear combinations of columns that yield y, since the columns are dependent.

 
Mark44 said:
I don't see how this reduces the problem. The matrix equation (in my notation) ##\vec y = A\vec x## represents n - 1 equations in n unknowns. Each equation represents a hyperplane in n-dimensional space. If the n - 1 equations are linearly independent, then the solution space is a line in n-dimensional space; thus there are infinitely many solutions.
Unfortunately this is the notation used by the author of the textbook I am studying from. How do we get n - 1 equations when we take the dot product of each of the n rows of matrix A with the n entries of the column vector x? Can we think of it this way:

Given a 3×3 matrix A where only 2 of the row vectors are linearly independent, the 2 independent row vectors span a plane in 3D and the 3rd row vector lies in that plane.

In a similar way, suppose we have x < n linearly independent row vectors. The x independent row vectors span some geometrical object in n-dimensional space in which all the row vectors lie. We are trying to find a column vector which, when dotted with the n row vectors, yields the entries of the column vector y.
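The 3×3 picture above can be made concrete with illustrative numbers of my own choosing: rows r1, r2 are independent and r3 is deliberately built as a combination of them, so r3 lies in the plane they span and the row space has dimension 2.

```python
# Hypothetical 3x3 example: r3 = 2*r1 - r2 lies in the plane spanned
# by r1 and r2, so the row space has dimension 2.
import numpy as np

r1 = np.array([1.0, 0.0, 2.0])
r2 = np.array([0.0, 1.0, 1.0])
r3 = 2.0 * r1 - r2                 # deliberately dependent third row
A = np.vstack([r1, r2, r3])

print(np.linalg.matrix_rank(A))    # dimension of the row space

# Whatever beta we dot the rows with, the third entry of A @ beta is
# forced to be the same combination of the first two entries:
beta = np.array([4.0, -1.0, 0.5])  # arbitrary test vector
lhs = A @ beta
assert np.isclose(lhs[2], 2.0 * lhs[0] - lhs[1])
```

This is why the third equation adds no new constraint: its RHS must satisfy the same linear relation as its row, or the system is inconsistent.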
 

Attachments

  • ff6a1de3-6929-44a5-972b-071923886b55.jpg
DumpmeAdrenaline said:
Given a 3×3 matrix A where only 2 of the row vectors are linearly independent, the 2 independent row vectors span a plane in 3D and the 3rd row vector lies in that plane.
Again, I don't see how this reduces the problem. With a 3 x 3 matrix, if all three rows are linearly independent, then there is a unique solution. If two of the rows are linearly independent, and the third row is a linear combination of the other two rows, then the solution set is all points on the line of intersection of the two planes represented by the independent rows. If two of the rows are multiples of the remaining row, then the solution set is all points on the plane represented by that row.

In what sense does this reduce the problem?
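The three cases above follow one pattern: for a consistent system, the solution set has dimension n - rank(A). A sketch with hypothetical matrices of each rank:

```python
# Hypothetical 3x3 matrices of rank 3, 2, and 1: the solution set of a
# consistent system A @ beta = y has dimension n - rank(A), i.e. a
# point, a line, or a plane respectively.
import numpy as np

def solution_dim(A):
    """Dimension of the solution set of A @ beta = y, assuming y is
    chosen in the column space so the system is consistent."""
    n = A.shape[1]
    return n - np.linalg.matrix_rank(A)

full  = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1]])     # rank 3 -> point
rank2 = np.array([[1.0, 0, 0], [0, 1, 0], [1, 1, 0]])     # rank 2 -> line
rank1 = np.array([[1.0, 2, 3], [2, 4, 6], [-1, -2, -3]])  # rank 1 -> plane

print([solution_dim(M) for M in (full, rank2, rank1)])
```

This is the rank-nullity theorem in action: each independent row removes one degree of freedom from the n available.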
 
I thought it reduces the problem in the sense that all the independent row vectors that span the row space lie in a geometric object of dimension x.
 
