How to Derive U from L in O(n^2) Complexity?

MRTRACKER
The problem is:

Given a lower triangular matrix L, how can one calculate an upper triangular matrix U such that L'*L = U'*U, where L' and U' are the transposes of L and U respectively?
I am looking for a method with lower computational complexity, i.e., O(n^2), where n is the dimension of L.
 
L'L is symmetric positive definite, so it can be diagonalized with an orthogonal matrix M
(that is, M'L'LM = D where M^-1 = M' and D is diagonal with positive eigenvalues). So:

L'L = U'U
MDM' = U'U
D = M'U'UM = (UM)'(UM)

so, just put

UM = E
U = EM'

where E is the diagonal matrix whose diagonal entries are the square roots of the corresponding eigenvalues of D.
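
For concreteness, here is a minimal MATLAB sketch of this construction (the random test matrix L is only for illustration):

Code:
n = 4;
L = tril(randn(n)) + n*eye(n);  % illustrative nonsingular lower triangular L
A = L'*L;
A = (A + A')/2;                 % enforce exact symmetry so eig returns orthogonal M
[M, D] = eig(A);                % A*M = M*D, with M orthogonal and D diagonal, positive
E = sqrt(D);                    % elementwise sqrt of a diagonal matrix, so E'*E = D
U = E*M';                       % then U'*U = M*E'*E*M' = M*D*M' = A = L'*L
norm(U'*U - A)                  % ~ 0 up to rounding
istriu(U)                       % false: this U is generally not upper triangular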

Hope it's clear :)
 
Petr Mugver said:
L'L is symmetric positive definite, so it can be diagonalized with an orthogonal matrix M ... so, just put UM = E, U = EM', where E is the diagonal matrix whose diagonal entries are the square roots of the corresponding eigenvalues of D.

Thank you very much.
From your reply, I see that the point is to derive M and D by an eigendecomposition of L'L. In general, the computational complexity of an eigendecomposition is O(N^3). Do you have any other ideas that could achieve O(N^2)?
The other issue is that your method cannot guarantee that U is an upper triangular matrix.

P.S. I know that the QR decomposition can solve this problem, but QR requires O(N^3) in MATLAB, i.e., [Q, U] = qr(L).
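
For reference, here is a minimal MATLAB sketch of the two standard O(N^3) routes (the test matrix is only for illustration):

Code:
n = 4;
L = tril(randn(n)) + n*eye(n);  % illustrative nonsingular lower triangular L
[Q, U1] = qr(L);                % L = Q*U1, so L'*L = U1'*Q'*Q*U1 = U1'*U1
U2 = chol(L'*L);                % upper triangular U2 with U2'*U2 = L'*L
norm(U1'*U1 - L'*L)             % ~ 0
norm(U2'*U2 - L'*L)             % ~ 0; U1 and U2 agree up to the signs of their rows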
 
You can do this with essentially the standard Cholesky factorization of a matrix, written for an upper triangular factor (A = U'U), building U row by row from the top.

Let A = L'L = U'U and multiply out the two products.

The first element A_{1,1} = U_{1,1}^2, so you can find U_{1,1} in terms of the elements of L.

The element A_{1,2} depends only on U_{1,1} and U_{1,2}, so you can use it to find U_{1,2}.

You work along the first row to A_{1,n}, then along the next row starting with A_{2,2}, and finish at A_{n,n}.

If you haven't seen this before, it might help to write out the matrix products for 3x3 or 4x4 matrices in full.

This construction strongly suggests that the fastest algorithm is O(n^3): if there were a faster one, the same idea could be applied to other matrix factorizations. (Note also that merely forming the entries of A = L'L already takes O(n^3) operations.)
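
A minimal MATLAB sketch of this recursion, which is just the classical Cholesky loop written out (the test matrix is only for illustration):

Code:
n = 4;
L = tril(randn(n)) + n*eye(n);  % illustrative nonsingular lower triangular L
A = L'*L;
U = zeros(n);
for i = 1:n
    % diagonal entry: A(i,i) = sum_{k<=i} U(k,i)^2
    U(i,i) = sqrt(A(i,i) - U(1:i-1,i)'*U(1:i-1,i));
    for j = i+1:n
        % off-diagonal: A(i,j) = sum_{k<i} U(k,i)*U(k,j) + U(i,i)*U(i,j)
        U(i,j) = (A(i,j) - U(1:i-1,i)'*U(1:i-1,j)) / U(i,i);
    end
end
norm(U'*U - A)                  % ~ 0: U is upper triangular with U'*U = L'*L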
 
AlephZero said:
You can do this with essentially the standard Cholesky factorization of a matrix, written for an upper triangular factor (A = U'U), building U row by row from the top. ... This construction strongly suggests that the fastest algorithm is O(n^3).

Thanks for your suggestion.
Given that, I will probably stop looking for an O(n^2) solution.
 