I'm trying to solve the following problem (not homework), which is a strange form of diagonalization problem. Standard references and papers didn't turn up anything for me. Does anyone see a possible approach?

Given n x n full-rank random matrices A1, A2, ..., A9, find length-n unit vectors x1, x2, x3, y1, y2, and y3 such that

[y1^H 0 0; 0 y2^H 0; 0 0 y3^H] [A1 A2 A3; A4 A5 A6; A7 A8 A9] [x1 0 0; 0 x2 0; 0 0 x3]

reduces to a 3 x 3 diagonal matrix. Here ^H is the Hermitian transpose and the 0's denote zero vectors of the appropriate size.

It's like a constrained form of the SVD, but I can't seem to get a handle on it. Thanks in advance for any thoughts!
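To make the structure concrete: entry (i, j) of the 3 x 3 product is the scalar y_i^H A_{ij} x_j, where A_{ij} is the block in row i, column j, so diagonality amounts to the six bilinear constraints y_i^H A_{ij} x_j = 0 for i != j. Here is a minimal NumPy sketch of that setup (the names and the random test data are my own, just mirroring the question; it only builds the product and measures the off-diagonal part, it does not solve the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Nine random complex n x n blocks, arranged as A[i][j] = block (i, j).
A = [[rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
      for _ in range(3)] for _ in range(3)]

def random_unit(n):
    """A random complex unit vector of length n."""
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    return v / np.linalg.norm(v)

x = [random_unit(n) for _ in range(3)]
y = [random_unit(n) for _ in range(3)]

# Entry (i, j) of the 3 x 3 product is y_i^H A_ij x_j
# (np.vdot conjugates its first argument, giving the Hermitian inner product).
M = np.array([[np.vdot(y[i], A[i][j] @ x[j]) for j in range(3)]
              for i in range(3)])

print(M.shape)  # (3, 3)
# The quantity to drive to zero: the six off-diagonal magnitudes.
off_diag = sum(abs(M[i, j]) for i in range(3) for j in range(3) if i != j)
print(off_diag)  # generically nonzero for random x, y
```

This makes the count explicit: six scalar constraints against the free parameters of six unit vectors, which is what makes it feel like a constrained SVD rather than a standard one.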