Decompose matrix into outer product of vectors

In summary, the conversation discusses decomposing a 3D array into an outer product of vectors. Such a decomposition exists only if the array has rank one, and conversely an outer product of vectors always yields a rank-one result. The Singular Value Decomposition (SVD) is mentioned as a way to approximate a matrix by a sum of outer products, though it is noted that it is not clear what the SVD tells us about 3D arrays.
  • #1
Pacopag
Hi. I'm wondering if anyone can point me to any information on techniques to decompose a matrix (actually a 3D matrix) into an outer product of vectors. Particularly, given [tex]M_{i,j,k}[/tex], I want to find vectors [tex]a_{i}[/tex], [tex]b_{j}[/tex] and [tex]c_{k}[/tex] such that

[tex]
M_{i,j,k} = a_{i}b_{j}c_{k}
[/tex]
where the multiplication on the right is an outer product.

I've read that this is only possible if the matrix M has a rank of one, but I can't find anything on how to actually decompose the matrix, only that it CAN BE done. Also, if M has rank one, does that mean that there is a "unique" decomposition? What if the rank is something other than one? In that case would it be possible to find a family of solutions?

Thanks for any help.
 
  • #2
Edit: strike all this, you meant a different outer product. Lemme think on this a bit more.
 
  • #3
Consider the matrix [tex]a_ib_j[/tex]. This matrix is rank 1 since only one row and one column of this matrix is independently determined; the other rows (columns) bear constant ratios to it. Now if you take the ratio of successive elements in each row you get [tex]b_1/b_2,b_2/b_3,\ldots[/tex], and down each column [tex]a_1/a_2,a_2/a_3,\ldots[/tex].

Now write these ratios in terms of, say, [tex]a_1[/tex] and [tex]b_1[/tex]. We name the ratios with the letter [tex]k[/tex], e.g., [tex]a_2=k^a_2a_1[/tex] (the superscript does not denote an exponent). Once you substitute all the [tex]k^a[/tex]s and the [tex]k^b[/tex]s, you get a matrix from which [tex]a_1b_1[/tex] can be factored out, and its value is known: it is just the (1,1) entry of the original matrix.

At this point, we cannot determine [tex]a_1[/tex] and [tex]b_1[/tex] individually. You can select [tex]a_1=p[/tex] and [tex]b_1=a_1b_1/p[/tex] and the resulting vector pairs will all work.

With a little extra work, this approach can be extended to higher order matrices.
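As a concrete sketch of how the ratio idea extends to a 3D array, the following assumes the (0,0,0) entry is nonzero and uses NumPy purely for illustration; the fibers through that entry are proportional to the three vectors:

```python
import numpy as np

# Build a known rank-1 3D array M[i,j,k] = a[i]*b[j]*c[k]
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0])
c = np.array([6.0, 7.0, 8.0, 9.0])
M = np.einsum('i,j,k->ijk', a, b, c)

# The fibers through the (0,0,0) entry are proportional to a, b, c:
a_hat = M[:, 0, 0]   # = a * b[0]*c[0]
b_hat = M[0, :, 0]   # = b * a[0]*c[0]
c_hat = M[0, 0, :]   # = c * a[0]*b[0]
# Their outer product overcounts by (a[0]*b[0]*c[0])^2 = M[0,0,0]^2
M_rec = np.einsum('i,j,k->ijk', a_hat, b_hat, c_hat) / M[0, 0, 0] ** 2
assert np.allclose(M_rec, M)
```

As in the 2D case, the individual vectors are only determined up to reciprocal rescalings; only their outer product is fixed.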
 
  • #4
That's great Bavid! Is there ANYTHING that can be done if the matrix is not of rank 1? Or is it the case that no such decomposition exists, even up to undetermined coefficients, if the rank is anything other than 1?
 
  • #5
In fact rank 1 is both necessary and sufficient for a matrix to be represented as a_ib_j: if the rank is 1, every row is a scalar multiple of one row, those scalars form the vector a, and that row is b. Note that rows differing by an additive constant, as in (row_i)=m*(row_j)+c with c nonzero, generally give rank 2, not rank 1, so such a matrix is not decomposable to a_ib_j.

A vector outer product ALWAYS produces a matrix of rank at most 1 (exactly 1 when both vectors are nonzero), whatever the dimensions. Think about it for 3*3 matrices: the constituent vectors have 6 components (really 5 independent parameters, since scaling one vector up and the other down by the same factor leaves the product unchanged) but the resulting matrix has 9 components. So there is a well-defined interdependence among the matrix components, and you cannot hope for an arbitrary matrix to be expressed as a single outer product.
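This counting argument can be sanity-checked numerically; a small sketch (the random data and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# An outer product of nonzero vectors has rank 1
u = rng.standard_normal(3)
v = rng.standard_normal(3)
assert np.linalg.matrix_rank(np.outer(u, v)) == 1

# A generic 3x3 matrix has rank 3, so it cannot be a single outer product
A = rng.standard_normal((3, 3))
assert np.linalg.matrix_rank(A) == 3
```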
 
  • #6
That's great. Thanks.
 
  • #7
One way to look at the Singular Value Decomposition (SVD) of a matrix is that it tells you how to express the matrix as a linear combination of outer products of vectors. Thus it may suggest a way to approximate a matrix by using only some of the outer products in the linear combination.

I don't know what the SVD tells us about 3D arrays. We'd have to think about that.
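A sketch of this view of the SVD (the matrix shape and random seed are arbitrary): the matrix equals the sum of rank-1 outer products weighted by its singular values, and truncating the sum gives a low-rank approximation.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 5))

U, s, Vt = np.linalg.svd(M, full_matrices=False)

# M equals the sum of rank-1 outer products s[r] * u_r * v_r^T
M_sum = sum(s[r] * np.outer(U[:, r], Vt[r]) for r in range(len(s)))
assert np.allclose(M_sum, M)

# Keeping only the k largest terms gives a rank-k approximation
k = 2
M_k = sum(s[r] * np.outer(U[:, r], Vt[r]) for r in range(k))
assert np.linalg.matrix_rank(M_k) == k
```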
 

What does it mean to "decompose matrix into outer product of vectors"?

Decomposing a matrix into the outer product of vectors means expressing the matrix as a sum of outer products of column vectors with row vectors. This decomposition is useful for understanding the structure and properties of the matrix.

How do you decompose a matrix into outer product of vectors?

To decompose a matrix into outer products of vectors, you can use a technique called Singular Value Decomposition (SVD). This involves finding the singular values and singular vectors of the matrix (equivalently, the eigenvalues and eigenvectors of M^T M and M M^T) and using them to construct the outer product representation.

What are the benefits of decomposing a matrix into outer product of vectors?

Decomposing a matrix into outer product of vectors can provide insights into the structure and properties of the matrix, such as its rank, determinant, and eigenvalues. It can also be used for data compression, dimensionality reduction, and solving systems of linear equations.

Can any matrix be decomposed into outer product of vectors?

Any matrix can be decomposed into a sum of outer products of vectors; the SVD gives one such sum, with as many terms as the rank. A single outer product suffices only when the rank is at most one. The vectors used in the decomposition are not unique, since each pair can be rescaled reciprocally without changing the product. Non-square matrices pose no difficulty: the outer product of an m-vector and an n-vector is an m-by-n matrix.
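To illustrate with a rectangular rank-one matrix (NumPy, with illustrative values): the leading singular triple alone reconstructs such a matrix exactly.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0])
M = np.outer(a, b)          # a 3x2 matrix of rank 1

assert np.linalg.matrix_rank(M) == 1

# The leading singular triple recovers M exactly
U, s, Vt = np.linalg.svd(M)
M_rec = s[0] * np.outer(U[:, 0], Vt[0])
assert np.allclose(M_rec, M)
```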

Are there other methods for decomposing a matrix besides outer product of vectors?

Yes, there are other methods for decomposing a matrix, such as LU decomposition, QR decomposition, and Cholesky decomposition. Each method has its own advantages and uses, and the choice of method depends on the specific properties and structure of the matrix.
