How to Derive U from L in O(n^2) Complexity?

  • Context: Graduate
  • Thread starter: MRTRACKER
  • Tags: Calculation, Matrix
SUMMARY

The discussion focuses on deriving an upper triangular matrix U from a lower triangular matrix L such that L'*L = U'*U, with a target computational complexity of O(n^2). The method first proposed diagonalizes L'L with an orthogonal matrix M, leading to U = EM', where E is the diagonal matrix of square roots of the eigenvalues of L'L; however, this U is not upper triangular in general. The participants acknowledge that achieving O(n^2) is challenging, as the available methods, including QR decomposition, Cholesky factorization, and eigendecomposition, operate at O(n^3). The consensus leans towards the conclusion that an algorithm faster than O(n^3) for this matrix factorization problem is unlikely.

PREREQUISITES
  • Understanding of lower and upper triangular matrices
  • Familiarity with matrix transposition and multiplication
  • Knowledge of eigendecomposition and its computational complexity
  • Basic concepts of QR decomposition and Cholesky factorization
NEXT STEPS
  • Research the properties of symmetric positive definite matrices
  • Explore advanced matrix factorization techniques beyond QR and Cholesky
  • Study the implications of eigendecomposition in computational complexity
  • Investigate potential O(n^2) algorithms for matrix factorization
USEFUL FOR

Mathematicians, data scientists, and software engineers involved in numerical analysis, particularly those working with matrix computations and optimizations in algorithms.

MRTRACKER
The problem is :

Given a lower triangular matrix L, how can one calculate an upper triangular matrix U such that L'*L = U'*U, where L' and U' are the transposes of L and U respectively?
I am looking for a method with low computational complexity, i.e., O(n^2) if the dimension of L is n.
 
L'L is symmetric positive definite, so it can be diagonalized with an orthogonal matrix M
(that is, M'L'LM = D where M^-1 = M' and D is diagonal with positive eigenvalues). So:

L'L = U'U
MDM' = U'U
D = M'U'UM = (UM)'(UM)

so, just put

UM = E
U = EM'

where E is the diagonal matrix whose diagonal entries are the square roots of the corresponding diagonal entries of D.

Hope it's clear :)
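A minimal NumPy sketch of this construction (the example matrix, seed, and variable names are mine). Note that the resulting U satisfies U'U = L'L but is generally not upper triangular, and the eigendecomposition itself costs O(n^3):

```python
import numpy as np

# Build a well-conditioned lower triangular test matrix L (my choice).
rng = np.random.default_rng(0)
n = 4
L = np.tril(rng.standard_normal((n, n))) + n * np.eye(n)

A = L.T @ L                     # symmetric positive definite
d, M = np.linalg.eigh(A)        # A = M @ diag(d) @ M', with M orthogonal
E = np.diag(np.sqrt(d))         # square roots of the eigenvalues (diagonal of D)
U = E @ M.T                     # then U'U = M E'E M' = M D M' = A

print(np.allclose(U.T @ U, A))  # True
```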
 
Petr Mugver said:
L'L is symmetric positive definite, so it can be diagonalized with an orthogonal matrix M ...

Thank you very much.
From your reply, I see the point is to derive M and D by an eigendecomposition of L'L. In general, the computational complexity of eigendecomposition is O(n^3). Do you have other ideas that could achieve O(n^2)?
The other issue is that your method does not guarantee that U is upper triangular.

P.S.: I know the QR decomposition can solve this problem, but QR requires O(n^3), e.g. [Q, U] = qr(L) in MATLAB.
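For reference, the QR route works the same way in NumPy as the MATLAB call (the example L is mine):

```python
import numpy as np

L = np.array([[2.0, 0.0],
              [1.0, 3.0]])              # example lower triangular matrix
Q, U = np.linalg.qr(L)                  # L = Q @ U, Q orthogonal, U upper triangular
# Since U = Q' @ L and Q'Q = I:  U'U = L' @ Q @ Q' @ L = L'L
print(np.allclose(U.T @ U, L.T @ L))    # True
```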
 
You can do this in a similar way to the standard Cholesky factorization of a matrix, computing U element by element.

Let A = L'L = U'U and multiply out the two products.

The first element A_{1,1} = U_{1,1}^2, so you can find U_{1,1} in terms of the elements of L.

The element A_{1,2} depends only on U_{1,1} and U_{1,2}, so you can use it to find U_{1,2}.

You work along the first row to A_{1,n}, then along the next row starting with A_{2,2}, and finish at A_{n,n}.

If you haven't seen this before, it might help to write out the matrix products for 3x3 or 4x4 matrices in full.

This algorithm suggests very strongly that the fastest approach is O(n^3): even forming A = L'L costs O(n^3) operations, and if there were a faster factorization scheme, the same idea could be used for any matrix factorization.
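A minimal sketch of this element-by-element scheme in Python, written for A = U'U so the recurrence runs row by row from the top-left entry (the function and example are mine); it is the standard O(n^3) Cholesky recurrence:

```python
import numpy as np

def upper_cholesky(A):
    """Return upper triangular U with U'U = A, for A symmetric positive definite.

    Each entry of U is obtained from one entry of A plus previously
    computed entries of U, exactly as described in the post above.
    """
    n = A.shape[0]
    U = np.zeros_like(A, dtype=float)
    for i in range(n):
        # Diagonal: A[i,i] = sum_{k<=i} U[k,i]^2
        U[i, i] = np.sqrt(A[i, i] - U[:i, i] @ U[:i, i])
        for j in range(i + 1, n):
            # Off-diagonal: A[i,j] = sum_{k<=i} U[k,i] * U[k,j]
            U[i, j] = (A[i, j] - U[:i, i] @ U[:i, j]) / U[i, i]
    return U

L = np.array([[2.0, 0.0],
              [1.0, 3.0]])               # example lower triangular L
U = upper_cholesky(L.T @ L)
print(np.allclose(U.T @ U, L.T @ L))     # True
```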
 
AlephZero said:
You can do this in a similar way to the standard Cholesky factorization of a matrix ...

Thanks for your suggestion.
Given this, I will stop looking for an O(n^2) solution.
 
