Can any matrix be expressed as the product of two vectors?

  • #1

Main Question or Discussion Point

For example, does this always hold true?

##M_{ab} = v_a w_b##

If not, where does it break down?
 

Answers and Replies

  • #3
hilbert2
Science Advisor
Insights Author
Gold Member
Let's say we have a column vector ##A = \begin{bmatrix}a \\ b\end{bmatrix}## and a row vector ##B = \begin{bmatrix}c & d\end{bmatrix}## such that

##AB = \begin{bmatrix}ac & ad \\ bc & bd\end{bmatrix} = \begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}##.

Is this possible, knowing that ##xy = 0## for real numbers ##x,y## implies that either ##x=0## or ##y=0## ?
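
A minimal SymPy sketch of that check, with the symbols mirroring the entries above (the code is just an illustration of the argument, not part of it):

Code:
import sympy as sp

# Entries of the column vector A = [a, b]^T and the row vector B = [c, d]
a, b, c, d = sp.symbols('a b c d', real=True)

# The outer product AB would have to match [[0, 1], [1, 0]] entry by entry
equations = [a*c, a*d - 1, b*c - 1, b*d]

# An empty solution list confirms no real a, b, c, d satisfy all four equations
print(sp.solve(equations, [a, b, c, d]))   # -> []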
 
  • #4
Let's say we have a column vector ##A = \begin{bmatrix}a \\ b\end{bmatrix}## and a row vector ##B = \begin{bmatrix}c & d\end{bmatrix}## such that

##AB = \begin{bmatrix}ac & ad \\ bc & bd\end{bmatrix} = \begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}##.

Is this possible, knowing that ##xy = 0## for real numbers ##x,y## implies that either ##x=0## or ##y=0## ?
Thanks. I was trying to think of a counterexample; this one makes it very obvious.
 
  • #5
RPinPA
Science Advisor
Homework Helper
No; such a matrix has rank 1. Writing ##\mathbf w^T## for the row vector corresponding to ##\mathbf w##, row 1 of ##M = \mathbf v \mathbf w^T## is ##v_1 \mathbf w^T##, row 2 is ##v_2 \mathbf w^T##, etc. Every row is a multiple of every other row, and every column is a multiple of every other column, since every column has the form ##w_j \mathbf v##.

In particular, any invertible matrix of size ##2\times 2## or larger is a counterexample.

What is true is that you can express any matrix ##M## of rank ##n## as a sum of ##n## rank-1 matrices: ##\sum_{i=1}^n \mathbf v_i \mathbf w_i^T##.
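
A short NumPy sketch of that decomposition, using the SVD to produce the rank-1 terms (the ##3\times 3## random matrix is just an arbitrary example):

Code:
import numpy as np

M = np.random.rand(3, 3)                  # a generic matrix, almost surely of full rank
r = np.linalg.matrix_rank(M)

# SVD: M = U @ diag(s) @ Vt, i.e. a sum of r rank-1 outer products
U, s, Vt = np.linalg.svd(M)
terms = [s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r)]

assert np.allclose(sum(terms), M)                          # the terms add up to M
assert all(np.linalg.matrix_rank(T) == 1 for T in terms)   # each term has rank 1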
 
  • #6
fresh_42
For example, does this always hold true?

##M_{ab} = v_a w_b##

If not, where does it break down?
It can always be done with a linear combination of such products; ##\operatorname{rank}M## is the minimal number of terms needed.
 
  • #7
Stephen Tashi
Science Advisor
For example, does this always hold true?

##M_{ab} = v_a w_b##

If not, where does it break down?
You didn't say how you define the "product of two vectors". Let's assume you are thinking of this type of example:
##v_a = (a_1, a_2)##
##v_b = (b_1, b_2, b_3)##
The "product" is a table of data with 2 rows and 3 columns; the ##(i,j)## entry of the table is ##a_i b_j##.

It would be nice if all data tables were so simple! A person who could do multiplication wouldn't need the body of the table.

Thinking about the various complicated data tables that we encounter makes it clear that not all data tables (matrices) have such a simple structure.

However, as @fresh_42 indicates in post #6, all data tables can be written as linear combinations of such simple data tables. This is a remarkable and important fact. It's a concrete interpretation of the "singular value decomposition" of a matrix.
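
A small NumPy illustration of the "multiplication table" picture (the header values are arbitrary):

Code:
import numpy as np

row_headers = np.array([2, 5])        # the entries of v_a
col_headers = np.array([3, 4, 7])     # the entries of v_b

# The whole 2x3 table is determined by the headers:
# entry (i, j) is simply row_headers[i] * col_headers[j]
table = np.outer(row_headers, col_headers)
print(table)
# [[ 6  8 14]
#  [15 20 35]]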
 
  • #8
hilbert2
Science Advisor
Insights Author
Gold Member
Also, I guess the determinant of this kind of matrix (if it's square) is always zero, as the columns are multiples of each other. This is a very limiting property for a matrix.
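
A quick numerical check with random vectors (any size ##\geq 2## behaves the same way):

Code:
import numpy as np

v = np.random.rand(4)
w = np.random.rand(4)

# The outer product v w^T has rank 1, so its 4x4 determinant vanishes
print(np.linalg.det(np.outer(v, w)))   # ~ 0, up to floating-point round-off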
 
  • #9
fresh_42
Also, I guess the determinant of this kind of matrix (if it's square) is always zero, as the columns are multiples of each other. This is a very limiting property for a matrix.
Sure, they are rank one matrices.

The question gets interesting if we ask for the minimal length of linear combinations of ##x\otimes y \otimes z## needed to represent a given bilinear mapping, e.g. matrix multiplication. If we define the matrix multiplication exponent ##\omega := \min\{\,\gamma\,|\,(A,B) \longmapsto A\cdot B = \sum_{i=1}^R x_i(A) \otimes y_i(B) \otimes Z_i \,\wedge \, R=O(n^\gamma)\,\}##, then ##2\leq \omega \leq 2.3727## and we do not know how close we can come to the lower bound.

Two is the lower and three the trivial upper bound. I find this fascinating, as it somehow contains the question whether there are intrinsically difficult problems out there, or whether we just haven't found the right clue - similar to NP=P and ERH.
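
As a concrete illustration of a shorter-than-trivial bilinear representation, here is a Python sketch of Strassen's 1969 scheme, which multiplies ##2\times 2## matrices with 7 scalar multiplications instead of 8 and thereby already gives ##\omega \leq \log_2 7 \approx 2.807## (the function name and test values are just for illustration):

Code:
import numpy as np

def strassen_2x2(A, B):
    # Strassen's 7 products for 2x2 matrix multiplication
    a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
    e, f, g, h = B[0, 0], B[0, 1], B[1, 0], B[1, 1]
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Recombine the 7 products into the 4 entries of A @ B
    return np.array([[m1 + m4 - m5 + m7, m3 + m5],
                     [m2 + m4, m1 - m2 + m3 + m6]])

A, B = np.random.rand(2, 2), np.random.rand(2, 2)
assert np.allclose(strassen_2x2(A, B), A @ B)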
 
  • #10
hilbert2
Science Advisor
Insights Author
Gold Member
The question gets interesting if we ask for the minimal length of linear combinations of ##x\otimes y \otimes z## needed to represent a given bilinear mapping, e.g. matrix multiplication. If we define the matrix multiplication exponent ##\omega := \min\{\,\gamma\,|\,(A,B) \longmapsto A\cdot B = \sum_{i=1}^R x_i(A) \otimes y_i(B) \otimes Z_i \,\wedge \, R=O(n^\gamma)\,\}##, then ##2\leq \omega \leq 2.3727## and we do not know how close we can come to the lower bound.
Thanks, this turned out to be a really interesting topic; judging from a thesis I found with a Google search, it doesn't require too many concepts that are unfamiliar to me.
 
