Can any matrix be expressed as the product of two vectors?

SUMMARY

The discussion centers on expressing a matrix as the product of two vectors, i.e. as an outer product. It is established that only matrices of rank at most one can be written as a single outer product of two vectors; any matrix of higher rank cannot. More generally, a matrix of rank n can be written as a sum of n rank-1 matrices, which is closely related to the singular value decomposition. Additionally, the determinant of a square rank-1 matrix (of size 2x2 or larger) is always zero, because its columns are multiples of one another.

PREREQUISITES
  • Understanding of matrix rank and properties
  • Familiarity with outer products and vector multiplication
  • Knowledge of singular value decomposition (SVD)
  • Basic concepts of determinants in linear algebra
NEXT STEPS
  • Study the properties of matrix rank and its implications
  • Learn about singular value decomposition (SVD) in detail
  • Explore the concept of outer products and their applications
  • Investigate determinants and their significance in linear algebra
USEFUL FOR

Mathematicians, data scientists, and students of linear algebra seeking to deepen their understanding of matrix representations and properties.

DuckAmuck
For example, does this always hold true?

##M_{ab} = v_a w_b##

If not, where does it break down?
 
Let's say there are a column vector ##A = \begin{bmatrix}a \\ b\end{bmatrix}## and a row vector ##B = \begin{bmatrix}c & d\end{bmatrix}## such that

##AB = \begin{bmatrix}ac & ad \\ bc & bd\end{bmatrix} = \begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}##.

Is this possible, knowing that ##xy = 0## for real numbers ##x,y## implies that either ##x=0## or ##y=0## ?
 
hilbert2 said:
Let's say there are a column vector ##A = \begin{bmatrix}a \\ b\end{bmatrix}## and a row vector ##B = \begin{bmatrix}c & d\end{bmatrix}## such that

##AB = \begin{bmatrix}ac & ad \\ bc & bd\end{bmatrix} = \begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}##.

Is this possible, knowing that ##xy = 0## for real numbers ##x,y## implies that either ##x=0## or ##y=0## ?

Thanks. I was trying to think of a counterexample; this makes it very obvious. For the swap matrix, ##ad = 1## and ##bc = 1## force ##a, b, c, d## to all be nonzero, which contradicts ##ac = 0##.
 
No, such a matrix has rank 1. Writing ##\mathbf w^T## for the row vector corresponding to ##\mathbf w##, row 1 of ##M## is ##v_1 \mathbf w^T##, row 2 is ##v_2 \mathbf w^T##, and so on. Every row is a multiple of every other row, and every column is a multiple of every other column, since each column has the form ##w_j \mathbf v##.

And in particular, any invertible matrix 2x2 or larger is going to be a counterexample.

What is true is that you can express any matrix M of rank n as a sum of n rank-1 matrices ##\sum_{i=1}^n {\mathbf v_i \mathbf w_i^T}##.
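
As a concrete check of this sum-of-rank-1 claim, here is a minimal NumPy sketch (NumPy is assumed to be available; the test matrix and tolerance are just illustrative, not from the thread). It uses the SVD to rebuild a matrix as a sum of rank-1 outer products:

```python
import numpy as np

# An arbitrary rank-2 matrix (the swap matrix from the 2x2 example above).
M = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# SVD: M = U @ diag(s) @ Vt, with singular values s.
U, s, Vt = np.linalg.svd(M)

# Rebuild M as a sum of rank-1 matrices s_i * (u_i outer v_i).
rank = int(np.sum(s > 1e-12))
M_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(rank))

print(rank)                       # 2: this matrix needs two rank-1 terms
print(np.allclose(M, M_rebuilt))  # True
```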
 
DuckAmuck said:
For example, does this always hold true?

##M_{ab} = v_a w_b##

If not, where does it break down?
It can always be done with a linear combination of such products, and ##\operatorname{rank}M## is the minimal number of terms needed.
 
DuckAmuck said:
For example, does this always hold true?

##M_{ab} = v_a w_b##

If not, where does it break down?

You didn't say how you define the "product of two vectors". Let's assume you are thinking of this type of example:
##v_a = (a_1, a_2)##
##v_b = (b_1, b_2, b_3)##
The "product" is a table of data with 2 rows and 3 columns. The ##(i,j)## entry of the table is ##a_i b_j##.

It would be nice if all data tables were so simple! A person who could do multiplication wouldn't need the body of the table.

Thinking about the various complicated data tables that we encounter makes it clear that not all data tables (matrices) have such a simple structure.

However, as @fresh_42 indicates in post #6, all data tables can be written as linear combinations of such simple data tables. This is a remarkable and important fact. It's a concrete interpretation of the "singular value decomposition" of a matrix.
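
A minimal NumPy sketch of this "table" view of the outer product (the sample values are made up purely for illustration):

```python
import numpy as np

# The "table" view of an outer product: entry (i, j) is a_i * b_j.
v_a = np.array([2.0, 3.0])           # illustrative values, not from the thread
v_b = np.array([1.0, 4.0, 5.0])

table = np.outer(v_a, v_b)           # 2x3 matrix with entries a_i * b_j
print(table)
print(np.linalg.matrix_rank(table))  # 1: an outer product always has rank <= 1
```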
 
Also, I guess the determinant of this kind of matrix (if it's square) is always zero, as the columns are multiples of each other. This is a very limiting property for a matrix.
 
hilbert2 said:
Also, I guess the determinant of this kind of matrix (if it's square) is always zero, as the columns are multiples of each other. This is a very limiting property for a matrix.
Sure, they are rank one matrices.

The question gets interesting if we ask for the minimal length of a linear combination of simple tensors ##x\otimes y \otimes z## needed to represent a given bilinear mapping, e.g. matrix multiplication. If we define the matrix multiplication exponent ##\omega := \min\{\,\gamma\,\mid\,(A,B) \longmapsto A\cdot B = \sum_{i=1}^{R} x_i(A) \otimes y_i(B) \otimes Z_i \,\wedge\, R=O(n^\gamma)\,\}##, then ##2\leq \omega \leq 2.3727##, and we do not know how close we can come to the lower bound.

Two is the lower bound and three the trivial upper bound. I find this fascinating, as it somehow contains the question of whether there are intrinsically difficult problems out there, or whether we just haven't found the right clue yet, similar to NP=P and the ERH.
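
To make the idea of a short bilinear decomposition concrete: Strassen's classical 2x2 scheme multiplies two 2x2 matrices with only 7 scalar multiplications instead of 8, which gives the upper bound ##\omega \le \log_2 7 \approx 2.807##. A minimal plain-Python sketch (the function name and test values are just illustrative):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (lists of lists) using Strassen's 7 products."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # Strassen's 7 products: each is one multiplication of linear combinations.
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # Recombine the 7 products into the entries of C = A @ B.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```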
 
fresh_42 said:
The question gets interesting if we ask for the minimal length of a linear combination of simple tensors ##x\otimes y \otimes z## needed to represent a given bilinear mapping, e.g. matrix multiplication. If we define the matrix multiplication exponent ##\omega := \min\{\,\gamma\,\mid\,(A,B) \longmapsto A\cdot B = \sum_{i=1}^{R} x_i(A) \otimes y_i(B) \otimes Z_i \,\wedge\, R=O(n^\gamma)\,\}##, then ##2\leq \omega \leq 2.3727##, and we do not know how close we can come to the lower bound.

Thanks, this is a really interesting topic. I found a thesis on it with a Google search, and it doesn't require too many concepts that are unfamiliar to me.
 
