Decompose matrix into outer product of vectors

Discussion Overview

The discussion revolves around techniques for decomposing a 3D matrix into an outer product of vectors. Participants explore the conditions under which such a decomposition is possible, particularly focusing on the implications of the matrix's rank.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant inquires about methods to decompose a 3D matrix into the form M_{i,j,k} = a_i b_j c_k, emphasizing the need for clarity on the conditions under which such a decomposition is possible.
  • Another participant notes that a matrix of rank one can be expressed as an outer product, but questions the uniqueness of the decomposition and the implications of higher ranks.
  • A different perspective cautions that near-misses are easy to mistake for rank one: a matrix whose rows differ by an additive constant can look nearly proportional yet actually has rank two, and so cannot be decomposed in the desired form.
  • One participant proposes that Singular Value Decomposition (SVD) could provide insights into approximating matrices using outer products, though they express uncertainty regarding its application to 3D arrays.

Areas of Agreement / Disagreement

Participants express differing views on the conditions required for matrix decomposition, particularly regarding the implications of rank one versus higher ranks. There is no consensus on whether a decomposition is possible for matrices of rank greater than one.

Contextual Notes

Participants discuss the limitations of rank one matrices and the challenges posed by matrices of higher rank, indicating that the discussion remains open to further exploration of these concepts.

Pacopag
Hi. I'm wondering if anyone can point me to any information on techniques to decompose a matrix (actually a 3D matrix) into an outer product of vectors. Particularly, given M_{i,j,k}, I want to find vectors a, b and c such that

M_{i,j,k} = a_i b_j c_k

where the multiplication on the right is an outer product.

I've read that this is only possible if the matrix M has a rank of one, but I can't find anything on how to actually decompose the matrix, only that it CAN be done. Also, if M has rank one, does that mean there is a "unique" decomposition? What if the rank is something other than one? In that case, would it be possible to find a family of solutions?

Thanks for any help.
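For concreteness, the decomposition being asked about can be stated in NumPy terms (an illustrative sketch, not from the thread): `np.einsum('i,j,k->ijk', a, b, c)` builds exactly an array with M[i, j, k] = a[i] * b[j] * c[k].

```python
import numpy as np

# Illustrative sketch: construct a 3D array that is, by construction,
# the outer product M[i, j, k] = a[i] * b[j] * c[k].
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0, 5.0])
c = np.array([6.0, 7.0])

M = np.einsum('i,j,k->ijk', a, b, c)

# Every entry satisfies the defining relation.
assert M.shape == (2, 3, 2)
assert M[1, 2, 0] == a[1] * b[2] * c[0]  # 2 * 5 * 6 = 60
```

The question in the thread is the inverse problem: given such an M, recover a, b and c.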
 
Edit: strike all this, you meant a different outer product. Lemme think on this a bit more.
 
Consider the matrix M_{ij} = a_i b_j. This matrix is rank 1, since only one row and one column of it are independently determined; the other rows (columns) bear constant ratios to that one row (column). Now if you take the ratio of corresponding elements in successive rows and columns, it is easy to calculate a_1/a_2, b_2/b_3, etc.

Now write these ratios in terms of, say, a_1 and b_1. We name the ratios with the letter k, e.g. a_2 = k_2^a a_1 (the superscript is a label, not an exponent). Once you substitute all the k^a's and the k^b's, you get a matrix from which a_1 b_1 can be factored out, and its value is known.

At this point, we cannot determine a_1 and b_1 individually. You can select a_1 = p and b_1 = (a_1 b_1)/p for any nonzero p, and all the resulting vector pairs will work.

With a little extra work, this approach can be extended to higher-order matrices.
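A minimal NumPy sketch of the procedure just described, assuming M is exactly rank 1 with M[0, 0] ≠ 0 (the function name `rank1_factor` and the choice of pinning a_1 = p are our illustrative assumptions):

```python
import numpy as np

def rank1_factor(M, p=1.0):
    """Factor an exactly rank-1 matrix M into vectors a, b with
    M[i, j] == a[i] * b[j]. The scale split is arbitrary: a[0] is
    pinned to p, mirroring the a_1 = p, b_1 = (a_1 b_1)/p freedom
    described above. Sketch assumes M[0, 0] != 0."""
    if M[0, 0] == 0:
        raise ValueError("sketch assumes M[0, 0] != 0")
    a = M[:, 0] / M[0, 0] * p   # ratios a_i / a_1, scaled so a[0] = p
    b = M[0, :] / p             # then b[j] = M[0, j] / a[0]
    return a, b

# Usage: rebuild a known outer product.
a0 = np.array([2.0, -1.0, 3.0])
b0 = np.array([1.0, 4.0])
M = np.outer(a0, b0)
a, b = rank1_factor(M, p=5.0)
assert np.allclose(np.outer(a, b), M)
```

Note that (a, b) differ from (a0, b0) by a reciprocal pair of scale factors, which is exactly the non-uniqueness discussed above.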
 
That's great, Bavid! Is there ANYTHING that can be done if the matrix is not of rank 1? Or is it the case that no such decomposition exists, even up to undetermined coefficients, if the rank is anything other than 1?
 
It appears to me that rank 1 is the essential condition for a matrix to be represented as a_i b_j, and near-misses are easy to mistake for it. For example, a matrix whose rows differ by an additive constant, as in (row_i) = m*(row_j) + c with c ≠ 0, can look nearly proportional but actually has rank 2, so it cannot be decomposed as a_i b_j.

A vector outer product ALWAYS produces a rank 1 matrix, at least among 3×3 matrices. Think about it: there are 6 components in the two constituent vectors but 9 components in the resulting matrix, so there is a well-defined interdependence among the matrix components. You cannot hope for any random matrix to be expressible as an outer product.
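A quick numerical check of this counting argument (illustrative sketch): outer products of random vectors always come out rank 1, while a generic 3×3 matrix is full rank and therefore cannot be an outer product.

```python
import numpy as np

rng = np.random.default_rng(0)

# Outer products of (almost surely nonzero) random vectors are always rank 1.
for _ in range(100):
    u = rng.standard_normal(3)
    v = rng.standard_normal(3)
    assert np.linalg.matrix_rank(np.outer(u, v)) == 1

# A generic 3x3 Gaussian matrix is full rank, hence not an outer product.
A = rng.standard_normal((3, 3))
assert np.linalg.matrix_rank(A) == 3
```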
 
That's great. Thanks.
 
One way to look at the Singular Value Decomposition (SVD) of a matrix is that it tells you how to express the matrix as a linear combination of outer products of vectors. Thus it may suggest a way to approximate a matrix by using only some of the outer products in the linear combination.

I don't know what the SVD tells us about 3D arrays. We'd have to think about that.
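To make the SVD remark concrete, here is a sketch for an ordinary 2D matrix: `np.linalg.svd` returns factors with M = Σ_r s_r · outer(u_r, v_r), and keeping only the leading terms yields a low-rank approximation (nothing here addresses 3D arrays).

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))

# The SVD expresses M as a linear combination of rank-1 outer products.
U, s, Vt = np.linalg.svd(M)
M_rebuilt = sum(s[r] * np.outer(U[:, r], Vt[r, :]) for r in range(len(s)))
assert np.allclose(M, M_rebuilt)

# Keeping only the largest singular value gives a rank-1 approximation of M.
M1 = s[0] * np.outer(U[:, 0], Vt[0, :])
assert np.linalg.matrix_rank(M1) == 1
```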
 
