Optimizing a Rank-p Matrix V for Symmetric Matrices with SVD Using the Frobenius Norm

Thread starter: doodle
Tags: Minimization
I have this matrix problem:

Suppose R_1, R_2, R_3 \in \mathbb{R}^{N\times N} are symmetric matrices of rank p < N, with compact SVDs R_1 = U_1\Sigma_1 U_1^T, R_2 = U_2\Sigma_2 U_2^T and R_3 = U_3\Sigma_3 U_3^T (so each U_i is N \times p and each \Sigma_i is p \times p). I want to find a rank-p matrix V \in \mathbb{R}^{N\times p} such that

J = \|V\Sigma_1 V^T - U_1\Sigma_1 U_1^T\|_F^2 + \|V\Sigma_2 V^T - U_2\Sigma_2 U_2^T\|_F^2 + \|V\Sigma_3 V^T - U_3\Sigma_3 U_3^T\|_F^2

is minimized, subject to the constraint V^T V = I.
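
For concreteness, here is a minimal NumPy sketch (mine, not from the original post) that sets the problem up numerically: it builds three random symmetric rank-p matrices from compact SVD factors and evaluates the objective J at a given N \times p matrix V with orthonormal columns. The helper names make_rank_p and J, and the dimensions N = 8, p = 3, are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 8, 3  # illustrative dimensions

def make_rank_p(rng, N, p):
    # Random compact SVD factors: U is N x p with orthonormal columns
    # (via QR), Sigma is p x p diagonal with positive entries.
    Q, _ = np.linalg.qr(rng.standard_normal((N, p)))
    sigma = np.sort(rng.uniform(1.0, 2.0, p))[::-1]
    return Q, np.diag(sigma)

(U1, S1), (U2, S2), (U3, S3) = (make_rank_p(rng, N, p) for _ in range(3))

def J(V):
    # Objective from the post: sum of squared Frobenius distances
    # between V Sigma_i V^T and R_i = U_i Sigma_i U_i^T.
    return sum(np.linalg.norm(V @ S @ V.T - U @ S @ U.T, 'fro') ** 2
               for U, S in ((U1, S1), (U2, S2), (U3, S3)))
```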

I expanded the Frobenius norms as traces and set the gradient of the Lagrangian to zero, which gave me

2V (\Sigma_1^2 + \Sigma_2^2 + \Sigma_3^2) - 4(R_1 V \Sigma_1 + R_2 V \Sigma_2 + R_3 V \Sigma_3) + V(\Lambda + \Lambda^T) = 0

where \Lambda is the matrix of Lagrange multipliers for the constraint V^T V = I. I have no idea how to continue from here. Any help would be appreciated.
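
One way to make progress without solving for \Lambda explicitly is to minimize J numerically by projected gradient descent: take a step along the Euclidean gradient 4\sum_i (V\Sigma_i V^T - R_i)V\Sigma_i, then re-orthonormalize V. Below is a sketch continuing the setup above; the polar retraction, step size, and iteration count are my own choices, not anything from the thread.

```python
def grad_J(V):
    # Euclidean gradient of J: sum over i of 4 (V S_i V^T - R_i) V S_i.
    g = np.zeros_like(V)
    for U, S in ((U1, S1), (U2, S2), (U3, S3)):
        E = V @ S @ V.T - U @ S @ U.T
        g += 4.0 * E @ V @ S
    return g

def retract(A):
    # Polar retraction: the closest matrix to A with orthonormal columns.
    W, _, Zt = np.linalg.svd(A, full_matrices=False)
    return W @ Zt

V = retract(rng.standard_normal((N, p)))  # random feasible starting point
step = 1e-3
for _ in range(5000):
    V = retract(V - step * grad_J(V))
print("J at the computed point:", J(V))
```

This only finds a stationary point of the constrained problem, of course; it does not answer whether a closed-form solution exists.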
 
I take it that there is no simple solution here?

In the case p = 1, each \Sigma_i reduces to a scalar \sigma_i, and the solution for V (when I tried to work it out) is the unit eigenvector corresponding to the largest eigenvalue of

\sigma_1 R_1 + \sigma_2 R_2 + \sigma_3 R_3
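
A quick numerical sanity check of this p = 1 claim, reusing J, grad_J, make_rank_p and retract from the sketches above (illustrative, not a proof):

```python
# Rebuild the problem with p = 1, so each Sigma_i is a 1x1 scalar sigma_i.
N, p = 8, 1
(U1, S1), (U2, S2), (U3, S3) = (make_rank_p(rng, N, p) for _ in range(3))

# Candidate from the claim: dominant eigenvector of sum_i sigma_i R_i.
M = sum(S[0, 0] * (U @ S @ U.T) for U, S in ((U1, S1), (U2, S2), (U3, S3)))
w, Q = np.linalg.eigh(M)        # eigenvalues in ascending order
v = Q[:, [-1]]                  # eigenvector of the largest eigenvalue

# Compare against projected gradient descent from a random start.
V = retract(rng.standard_normal((N, p)))
for _ in range(5000):
    V = retract(V - 1e-3 * grad_J(V))
print("J at eigenvector candidate:", J(v))
print("J via projected gradient:  ", J(V))
```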
 
