Dimension and solution to matrix-vector product

  • Context: Undergrad
  • Thread starter: DumpmeAdrenaline
  • Tags: equations, linear algebra

Discussion Overview

The discussion revolves around the properties of matrix-vector products, specifically focusing on the implications of rank and dimensionality in the context of linear equations. Participants explore the geometric interpretation of solutions to the equation $$X\beta=y$$, where $$X$$ is a matrix and $$\beta$$ is a vector. The conversation touches on concepts such as independent equations, geometric objects in n-dimensional space, and the conditions under which infinite solutions may arise.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that if $$X$$ has rank less than $$m$$, then there exists at least one equation that can be expressed as a linear combination of others, leading to infinite solutions when $$y$$ is in the column space.
  • Others argue that the matrix equation represents $$n-1$$ equations in $$n$$ unknowns, and if these equations are linearly independent, the solution space is a line in $$n$$-dimensional space, suggesting infinite solutions.
  • A later reply questions the reduction of the problem, stating that with a 3×3 matrix, if all rows are linearly independent, there is a unique solution, while if only two rows are independent, the solution lies along the line of intersection of the corresponding planes.
  • Participants discuss the geometric interpretation of independent row vectors spanning a plane or higher-dimensional objects, and how this relates to finding a vector that satisfies the matrix equation.

Areas of Agreement / Disagreement

Participants express differing views on whether the problem can be reduced based on the dimensionality of the row space and the implications of linear independence. There is no consensus on the nature of the solution space or the reduction of the problem.

Contextual Notes

Participants note that the notation used may vary, leading to potential confusion. The discussion also highlights the dependence on the definitions of linear independence and the geometric interpretation of the equations involved.

DumpmeAdrenaline
Let $$ X \in \mathbb{R}^{m \times n} $$ with m = n and rank(X) < m; then there is at least one equation which can be written as a linear combination of the other equations. Let $$ \beta \in \mathbb{R}^{n} $$.
$$ X\beta=y $$
Suppose we have x < m independent equations (the equations are consistent), formed by taking the dot product of x row vectors of X with the column vector $$ \beta $$. Each independent equation represents a geometrical object of dimension n-1 (n-1 degrees of freedom). We have x geometrical objects in n dimensions, and we are trying to find the intersection of all these geometrical objects that satisfies the RHS represented by y. The dimension of the row space is x, which corresponds to the number of independent equations. Can we say that we are reducing the problem to finding a vector that is perpendicular to all x row vectors, which themselves lie on some geometric object of dimension n-1? If this is the case, why are there infinitely many solutions?
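A minimal NumPy sketch of the row picture described above (my illustration, not from the textbook; the matrix and RHS are hypothetical): each row gives one hyperplane, and with x independent, consistent equations in n unknowns, the intersection is an affine set of dimension n - x, hence infinitely many solutions whenever x < n.

```python
import numpy as np

# Row picture: each row of X gives one equation row . beta = y_i,
# i.e. a hyperplane (dimension n - 1) in R^n. With x independent,
# consistent equations, the solution set has dimension n - x.
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2 * (row 1): a dependent equation
              [0.0, 1.0, 1.0]])
y = np.array([6.0, 12.0, 2.0])   # consistent, e.g. beta = (1, 1, 1) works

x_indep = np.linalg.matrix_rank(X)   # number of independent equations
n = X.shape[1]
print(x_indep, n - x_indep)          # 2 independent equations, 1 free direction
```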

I understand why we have infinite solutions if we think of X in terms of its column vectors. If y is in the column space (we can check this by comparing the rank of X with the rank of X augmented with y), there are infinitely many choices of scalars for the independent column vectors in the combination of independent columns that yields y.
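The rank test mentioned above can be checked directly; a sketch, reusing the hypothetical X and y from the previous block:

```python
import numpy as np

X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])
y = np.array([6.0, 12.0, 2.0])

# y is in the column space of X exactly when augmenting X with y
# does not raise the rank.
rank_X = np.linalg.matrix_rank(X)
rank_aug = np.linalg.matrix_rank(np.column_stack([X, y]))
print(rank_X == rank_aug)    # True: the system is consistent
print(rank_X < X.shape[1])   # True: infinitely many solutions
```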
 
DumpmeAdrenaline said:
Let $$ X \in \mathbb{R}^{m \times n} $$
Your notation is unusual, so difficult to follow.
Many textbooks would write something like this:
Let ##A \in M^{n \times n}## where the coefficients of A are real.
DumpmeAdrenaline said:
with m = n and rank(X) < m; then there is at least one equation which can be written as a linear combination of the other equations. Let $$ \beta \in \mathbb{R}^{n} $$.
$$ X\beta=y $$
Or more clearly,
Let ##\vec x \in \mathbb R^n## and let ##\vec y = A\vec x##.
DumpmeAdrenaline said:
Suppose we have x < m independent equations (the equations are consistent), formed by taking the dot product of x row vectors of X with the column vector $$ \beta $$. Each independent equation represents a geometrical object of dimension n-1 (n-1 degrees of freedom). We have x geometrical objects in n dimensions, and we are trying to find the intersection of all these geometrical objects that satisfies the RHS represented by y. The dimension of the row space is x, which corresponds to the number of independent equations. Can we say that we are reducing the problem to finding a vector that is perpendicular to all x row vectors, which themselves lie on some geometric object of dimension n-1? If this is the case, why are there infinitely many solutions?
I don't see how this reduces the problem. The matrix equation (in my notation) ##\vec y = A\vec x## represents n - 1 equations in n unknowns. Each equation represents a hyperplane in n-dimensional space. If the n - 1 equations are linearly independent, then the solution space is a line in n-dimensional space, thus there are an infinite number of solutions.
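A sketch of this claim for n = 3, with my own hypothetical numbers: two independent equations in three unknowns whose solution set is a line, parametrized as one particular solution plus any multiple of a null-space direction.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],    # two independent equations
              [0.0, 1.0, 1.0]])   # in three unknowns
y = np.array([2.0, 3.0])

x_p, *_ = np.linalg.lstsq(A, y, rcond=None)  # one particular solution
_, _, Vt = np.linalg.svd(A)
line_dir = Vt[-1]                            # spans the null space of A

# Every point x_p + t * line_dir on the line solves A x = y:
for t in (0.0, 1.0, -2.0):
    print(np.allclose(A @ (x_p + t * line_dir), y))   # True each time
```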
DumpmeAdrenaline said:
I understand why we have infinite solutions if we think of X in terms of its column vectors. If y is in the column space (we can check this by comparing the rank of X with the rank of X augmented with y), there are infinitely many choices of scalars for the independent column vectors in the combination of independent columns that yields y.

 
Mark44 said:
I don't see how this reduces the problem. The matrix equation (in my notation) ##\vec y = A\vec x## represents n - 1 equations in n unknowns. Each equation represents a hyperplane in n-dimensional space. If the n - 1 equations are linearly independent, then the solution space is a line in n-dimensional space, thus there are an infinite number of solutions.
Unfortunately, this is the notation used by the author of the textbook I am studying from. How do we have n - 1 equations when we take the dot product of each of the n rows of matrix A with the n entries of the column vector x? Can we think of it this way:

Given a 3×3 matrix A where 2 of the row vectors are linearly independent, the 2 row vectors span a plane in 3D and the 3rd row vector lies in that plane.

In a similar way, suppose we have x < n linearly independent row vectors. The x independent row vectors span some geometrical object in n-dimensional space in which all the row vectors lie. We are trying to find a column vector which, when dotted with the n row vectors, yields the entries of the column vector y.
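A sketch of the 3×3 case being described, with hypothetical numbers: the third row is a combination of the first two, so the rows span only a plane, and any consistent system is solved by a whole line of vectors.

```python
import numpy as np

X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])   # row 3 = row 1 + row 2

# The third row adds nothing to the row space:
print(np.linalg.matrix_rank(X[:2]), np.linalg.matrix_rank(X))   # 2 2

beta = np.array([1.0, 2.0, 3.0])  # pick a beta to manufacture a consistent y
y = X @ beta
_, _, Vt = np.linalg.svd(X)
null_dir = Vt[-1]                 # direction along which solutions are free
print(np.allclose(X @ (beta + 5.0 * null_dir), y))   # True: another solution
```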
 

DumpmeAdrenaline said:
Given a 3×3 matrix A where 2 of the row vectors are linearly independent, the 2 row vectors span a plane in 3D and the 3rd row vector lies in that plane.
Again, I don't see how this reduces the problem. With a 3 × 3 matrix, if all three rows are linearly independent, then there is a unique solution. If two of the rows are linearly independent, and the third row is a linear combination of the other two, then the solution is all points on the line of intersection of the two planes represented by the independent rows. If two of the rows are multiples of the remaining row, then the solution is all points on the plane represented by that row.

In what sense does this reduce the problem?
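The three cases above can be checked numerically; a sketch with hypothetical matrices, using the fact that a consistent system's solution set has dimension n - rank:

```python
import numpy as np

cases = {
    "all rows independent -> a point":
        np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.]]),
    "two independent rows -> a line":
        np.array([[1., 0., 0.], [0., 1., 1.], [1., 1., 1.]]),
    "rows all multiples of one row -> a plane":
        np.array([[1., 1., 1.], [2., 2., 2.], [3., 3., 3.]]),
}
for label, A in cases.items():
    r = np.linalg.matrix_rank(A)
    print(f"{label}: rank = {r}, solution-set dimension = {3 - r}")
```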
 
I thought it reduces the problem in the sense that all the independent vectors that span the row space lie on a geometric object of dimension x.
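That claim is checkable: the row space has dimension x = rank(X), and every row reconstructs exactly from its projection onto an x-dimensional basis. A sketch, reusing the hypothetical rank-2 matrix from above:

```python
import numpy as np

X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
x = np.linalg.matrix_rank(X)        # x = 2
_, _, Vt = np.linalg.svd(X)
basis = Vt[:x]                      # orthonormal basis of the row space
# All three rows lie in this 2-dimensional subspace:
print(np.allclose(X, (X @ basis.T) @ basis))   # True
```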
 
