Singular values of a matrix times a diagonal matrix

The discussion concerns the singular values and vectors of the matrix A = DB, where D is a diagonal matrix and B is a singular circulant matrix with known singular value decomposition. The reply suggests using the eigenvalue decomposition (EVD) of B rather than the SVD, since every circulant matrix is diagonalized by the discrete Fourier basis. The EVD shows that the squared singular values of B are the eigenvalues of BB^H, i.e. the squared magnitudes of B's eigenvalues. Multiplying B by the diagonal matrix D, however, destroys the circulant structure, and no simple relation between the singular values of A and those of B is found.
jdevita
Hi,

I have been struggling with this problem for a while, and I have not found the answer in textbooks or on Google. Any help would be much appreciated.

Suppose I know the singular value decomposition of a matrix B, which is a singular circulant matrix. That is, I know u_i, v_i, and \sigma_i such that BB^*u_i = \sigma_i^2 u_i and B^*Bv_i = \sigma_i^2 v_i, where B^* denotes the conjugate transpose.

Now let A = DB, where D is a diagonal matrix. Is there any way to determine the singular values and vectors of A from the singular values and vectors of B?
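
For concreteness, the setup can be stated numerically. This is a minimal NumPy sketch; the particular circulant construction and the zero-mean trick used to make B singular are illustrative choices, not part of the original question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# circulant B with first column c: B[j, k] = c[(j - k) % n]
c = rng.standard_normal(n)
c -= c.mean()   # zero-mean first column gives a zero eigenvalue, so B is singular
B = np.array([[c[(j - k) % n] for k in range(n)] for j in range(n)])

# SVD of B: columns of U are the u_i, columns of V = Vh^H are the v_i
U, s, Vh = np.linalg.svd(B)
assert np.allclose(B @ B.conj().T @ U, U * s**2)                 # B B^* u_i = sigma_i^2 u_i
assert np.allclose(B.conj().T @ B @ Vh.conj().T, Vh.conj().T * s**2)  # B^* B v_i = sigma_i^2 v_i

# the matrix in question: A = D B for a diagonal D
D = np.diag(rng.standard_normal(n))
A = D @ B
```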

Thank you,
Jason
 
You should take advantage of the special and desirable properties of circulant matrices: use an eigenvalue decomposition (EVD) of B instead of an SVD. Every n×n circulant matrix B has the EVD

B=W\Lambda W^H

where \Lambda is a diagonal matrix of eigenvalues and the columns of W contain the eigenvectors. W is the complete basis of the length-n complex discrete Fourier transform, the same for every circulant matrix regardless of the entries of B. Since

W^H = W^{-1}

it is easy to show that

B^{-1}=W\Lambda^{-1} W^H

and B is singular if and only if it has one or more zero eigenvalues (in which case \Lambda^{-1} above does not exist). To expand on your result, note that

BB^H=W\Lambda \Lambda^HW^H=W|\Lambda|^2W^H

applied to one of the eigenvectors w_i gives

BB^Hw_i=|\lambda_i|^2w_i.

Your singular values squared {\sigma_i}^2 are known to be the eigenvalues of BB^H, and comparison with the above shows that they are the squared magnitudes of the eigenvalues of B; that is, \sigma_i = |\lambda_i| after reordering.
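
This is easy to check numerically. The sketch below builds the unitary DFT matrix W by hand and uses the fact that the eigenvalues of a circulant matrix are the DFT of its first column; the construction details are illustrative:

```python
import numpy as np

n = 6
rng = np.random.default_rng(1)
c = rng.standard_normal(n)
# circulant B with first column c: B[j, k] = c[(j - k) % n]
B = np.array([[c[(j - k) % n] for k in range(n)] for j in range(n)])

# unitary DFT basis: W[m, k] = exp(2*pi*i*m*k/n) / sqrt(n)
m, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
W = np.exp(2j * np.pi * m * k / n) / np.sqrt(n)
lam = np.fft.fft(c)                                     # eigenvalues of B

assert np.allclose(W.conj().T @ W, np.eye(n))           # W^H = W^{-1}
assert np.allclose(W @ np.diag(lam) @ W.conj().T, B)    # B = W Lambda W^H

# singular values of B are the magnitudes of its eigenvalues
sv = np.linalg.svd(B, compute_uv=False)
assert np.allclose(np.sort(sv), np.sort(np.abs(lam)))
```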

Multiplying B by a diagonal matrix D destroys the circulant symmetry, however, and I don't see a simple relation between the decomposition of A and that of B.
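
The loss of structure is also easy to see numerically. In this sketch (with an arbitrary D of my own choosing), the DFT basis W that diagonalizes every circulant matrix leaves a large off-diagonal residue when applied to A = DB:

```python
import numpy as np

n = 6
rng = np.random.default_rng(2)
c = rng.standard_normal(n)
# circulant B with first column c
B = np.array([[c[(j - k) % n] for k in range(n)] for j in range(n)])
d = rng.standard_normal(n)
A = np.diag(d) @ B

# the DFT basis that diagonalizes every circulant matrix
m, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
W = np.exp(2j * np.pi * m * k / n) / np.sqrt(n)

M = W.conj().T @ A @ W
off_diag = M - np.diag(np.diag(M))
# A = D B is no longer diagonalized by W: the off-diagonal part is not small,
# so the singular values of A must be computed from A itself (np.linalg.svd(A))
assert np.linalg.norm(off_diag) > 1e-6
```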
 
