How to Derive U from L in O(n^2) Complexity?

  • Context: Graduate 
  • Thread starter: MRTRACKER
  • Tags: Calculation, Matrix

Discussion Overview

The discussion revolves around the problem of deriving an upper triangular matrix U from a given lower triangular matrix L, under the condition that L'*L=U'*U, where L' and U' are the transposes of matrices L and U respectively. Participants are exploring methods to achieve this with a computational complexity of O(n^2).

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant suggests that L'L is symmetric positive definite and can be diagonalized using an orthogonal matrix M, leading to a proposed method for deriving U.
  • Another participant points out that the computational complexity of eigendecomposition is O(n^3) and asks for alternative methods that could achieve O(n^2).
  • A different approach is proposed, likening the process to the standard Cholesky factorization: equate A = L'L = U'U and solve for the elements of U one row at a time.
  • Some participants express skepticism about achieving O(n^2) complexity, suggesting that the fastest algorithm is likely O(n^3) based on the nature of matrix factorization.

Areas of Agreement / Disagreement

Participants do not reach a consensus on a method that achieves O(n^2) complexity. There are competing views regarding the feasibility of such a solution, with some suggesting that the problem inherently requires O(n^3) complexity.

Contextual Notes

Participants note that the eigendecomposition approach does not guarantee that U is upper triangular, and that both the eigendecomposition and the Cholesky-style methods cost O(n^3).

MRTRACKER
The problem is:

Given a lower triangular matrix L, how can one calculate an upper triangular matrix U such that L'*L = U'*U, where L' and U' are the transposes of L and U respectively?
I am looking for a method with low computational complexity, ideally O(n^2) if the dimension of L is n.
 
L'L is symmetric positive definite, so it can be diagonalized with an orthogonal matrix M
(that is, M'L'LM = D where M^-1 = M' and D is diagonal with positive eigenvalues). So:

L'L = U'U
MDM' = U'U
D = M'U'UM = (UM)'(UM)

so, just put

UM = E
U = EM'

where E is the diagonal matrix whose entries are the square roots of the corresponding eigenvalues in D.

Hope it's clear :)
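
For concreteness, here is a minimal numerical sketch of this construction. Python/NumPy is my assumption here (the thread itself only mentions MATLAB), and the test matrix is arbitrary. It also shows the caveat raised in the next reply: the resulting U satisfies U'U = L'L but is not upper triangular in general.

    import numpy as np

    # Sketch of the eigendecomposition construction above (NumPy assumed).
    n = 4
    L = np.tril(np.random.rand(n, n)) + n * np.eye(n)  # nonsingular lower triangular
    A = L.T @ L                                        # symmetric positive definite

    w, M = np.linalg.eigh(A)    # M'AM = D with M orthogonal, D = diag(w)
    E = np.diag(np.sqrt(w))     # E'E = D
    U = E @ M.T                 # U'U = M E'E M' = M D M' = A

    print(np.allclose(U.T @ U, L.T @ L))  # True
    print(np.allclose(U, np.triu(U)))     # generally False: U is not triangular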
 
Petr Mugver said:
L'L is symmetric positive definite, so it can be diagonalized with an orthogonal matrix M ... Hope it's clear :)

Thank you very much.
From your reply, I see the point is to derive M and D by eigendecomposition of L'L. In general, the computational complexity of eigendecomposition is O(N^3). Do you have other ideas that could achieve O(N^2)?
The other issue is that your method cannot guarantee U is an upper triangular matrix.

P.S. I know the QR decomposition can solve this problem, but QR requires O(N^3), e.g., [Q, U] = qr(L) in MATLAB.
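
The QR route works because L = QR with Q orthogonal and R upper triangular gives L'L = R'Q'QR = R'R, so U = R. A sketch, again assuming Python/NumPy as a stand-in for MATLAB's qr:

    import numpy as np

    # QR route: U is just the R factor of L = QR. Still O(n^3).
    n = 4
    L = np.tril(np.random.rand(n, n)) + n * np.eye(n)
    Q, U = np.linalg.qr(L)
    print(np.allclose(L.T @ L, U.T @ U))  # True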
 
You can do this in essentially the same way as the standard Cholesky factorization of a matrix.

Let A = L'L = U'U and multiply out the two products.

The first element A_{1,1} = U_{1,1}^2, so you can find U_{1,1} in terms of the elements of L.

The element A_{1,2} depends only on U_{1,1} and U_{1,2}, so you can use it to find U_{1,2}.

You work along the first row to A_{1,n}, then work along the next row starting with A_{2,2}, and finish at A_{n,n}. In other words, A = U'U is the ordinary Cholesky factorization A = GG' with G = U' lower triangular, so U is the transpose of the Cholesky factor of L'L.

If you haven't seen this before, it might help to write out the matrix products for 3x3 or 4x4 matrices in full.

This algorithm suggests very strongly that the fastest algorithm is O(n^3), because if there were a faster algorithm, the same idea could be used for any matrix factorization.
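
A concrete sketch of this recursion, assuming Python/NumPy (the helper name upper_factor is mine; nothing in the thread prescribes a language):

    import numpy as np

    # Row-by-row recursion described above: for i <= j,
    # A_{ij} = sum_{k <= i} U_{ki} U_{kj}, solved one row at a time.
    def upper_factor(A):
        n = A.shape[0]
        U = np.zeros_like(A)
        for i in range(n):
            # diagonal entry: A_{ii} = sum_{k<i} U_{ki}^2 + U_{ii}^2
            U[i, i] = np.sqrt(A[i, i] - U[:i, i] @ U[:i, i])
            # rest of row i: A_{ij} = sum_{k<i} U_{ki} U_{kj} + U_{ii} U_{ij}
            for j in range(i + 1, n):
                U[i, j] = (A[i, j] - U[:i, i] @ U[:i, j]) / U[i, i]
        return U  # ~n^3/3 flops, the same order as standard Cholesky

    L = np.tril(np.random.rand(4, 4)) + 4 * np.eye(4)
    U = upper_factor(L.T @ L)
    print(np.allclose(U.T @ U, L.T @ L))  # True
    print(np.allclose(U, np.triu(U)))     # True: U is upper triangular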
 
AlephZero said:
You can do this in essentially the same way as the standard Cholesky factorization of a matrix ... the same idea could be used for any matrix factorization.

Thanks for your suggestion.
Given this, I might stop looking for an O(n^2) solution.
 
