Number of multiplications and divisions for LU decomposition

  • Context: MHB
  • Thread starter: mathmari
  • Tags: Decomposition

Discussion Overview

The discussion revolves around the computational complexity of obtaining an LU decomposition for a bordered matrix, formed by appending a column and a row to an existing matrix with a known LU decomposition. Participants explore the number of multiplications and divisions required for this process, focusing on the theoretical aspects of matrix decomposition and algorithm complexity.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants suggest that the number of divisions needed for the LU decomposition of the augmented matrix is $n$, based on the elimination process.
  • Others argue that for each elimination step, $n$ multiplications are required, although some of these may involve multiplying by zero.
  • There is a proposal to express the new LU matrices in terms of the original matrices and the added vector, raising questions about the implications of the entries of the vector.
  • Participants discuss the potential conflict in notation when introducing new variables that may overlap with existing ones, particularly regarding the entries of the vector.
  • Some participants express uncertainty about the implications of certain entries being zero and how that affects the overall decomposition.
  • There is a suggestion that the complexity of the algorithm should be $O(n^2)$, but the exact reasoning and steps to demonstrate this remain under discussion.

Areas of Agreement / Disagreement

Participants generally agree on the need to calculate the number of operations involved in the LU decomposition but have differing views on the specifics of the calculations and the implications of certain assumptions. The discussion remains unresolved regarding the exact nature of the entries in the augmented matrix and their impact on the decomposition.

Contextual Notes

Limitations include the potential for confusion in variable notation and the need for clarity on the assumptions regarding the entries of the vector and their relationship to the LU decomposition.

mathmari
Hey! 😊

Let $A$ be an $n\times n$ matrix with a known LU decomposition, and let $u\in \mathbb{R}^n$, $v\in \mathbb{R}^{n+1}$.

Show that the number of multiplications and divisions that are needed to get an LU decomposition of the $(n+1)\times (n+1)$ matrix $$\begin{pmatrix}A & u \\ v^T\end{pmatrix}$$ is at most $O(n^2)$.
To get $U$, we have to eliminate the $n$ terms below the main diagonal (these are the elements of $v^T$ except the last element of that row). Each elimination requires computing a row multiplier, which involves a division by the pivot element.

So we have $n$ divisions, or not? :unsure:
 
Last edited by a moderator:
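For reference, one elimination step on the bottom row can be written out explicitly (a sketch, assuming the first $n$ rows are already in upper triangular form, writing $\tilde v_k$ for the current entry of the bottom row in column $k$, and assuming the pivot $u_{kk}$ is nonzero):

$$m_k=\frac{\tilde v_k}{u_{kk}},\qquad (\text{row } n+1)\ \leftarrow\ (\text{row } n+1)-m_k\cdot (\text{row } k),$$

so each of the $n$ steps costs one division (for the multiplier $m_k$) and at most $n$ further multiplications (for the row update).
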
mathmari said:
So we have $n$ divisions, or not?
Hey mathmari!

Yep.
That leaves counting the number of multiplications that we need for each elimination, doesn't it? 🤔
 
Klaas van Aarsen said:
That leaves counting the number of multiplications that we need for each elimination, doesn't it? 🤔

So for each element of the last row except the $(n+1)$-th one, we have to multiply the respective row by the row multiplier and subtract it from the last row, right?
Therefore we have to do $n$ multiplications, or not? :unsure:
 
mathmari said:
So for each element of the last row except the $(n+1)$-th one, we have to multiply the respective row by the row multiplier and subtract it from the last row, right?
Therefore we have to do $n$ multiplications, or not?
That sounds about right to find a single unknown entry, although a number of those multiplications are actually with zero.
Either way, for the worst case entry in row $n+1$ and column $n$ of the new $L$ matrix, we have indeed $n$ multiplications and $1$ division. 🤔

We still need to repeat for every entry in the extra row below the given L matrix, don't we?
And also for the extra column to the right of the given U matrix? 🤔
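As a rough tally of that worst case (a sketch, not yet the precise algorithm): there are $n+1$ unknown entries in the extra row of the new $L$ and $n+1$ in the extra column of the new $U$, and each costs at most $n$ multiplications and one division, so at most

$$2(n+1)\cdot(n+1)=2(n+1)^2=O(n^2)$$

multiplications and divisions altogether.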
 
Klaas van Aarsen said:
That sounds about right to find a single unknown entry, although a number of those multiplications are actually with zero.
Either way, for the worst case entry in row $n+1$ and column $n$ of the new $L$ matrix, we have indeed $n$ multiplications and $1$ division. 🤔

We still need to repeat for every entry in the extra row below the given L matrix, don't we?
And also for the extra column to the right of the given U matrix? 🤔

I got stuck right now. Could you explain that further to me? I haven't really understood that. :unsure:
 
mathmari said:
I got stuck right now. Could you explain that further to me? I haven't really understood that.
We want to find that the order of complexity is $O(n^2)$ don't we?
To find that, we need an algorithm to find the LU-decomposition of the matrix, don't we? 🤔

Let $A=L\cdot U$ be the given LU-decomposition of $A$.
Let $\tilde A=\begin{pmatrix}A & u \\ u^T\end{pmatrix}$ be the matrix to decompose.
And let $\tilde L,\,\tilde U$ be its LU-decomposition.

Can we write $\tilde L,\,\tilde U$ in terms of $L$, $U$, and $u$? (Wondering)
 
Klaas van Aarsen said:
We want to find that the order of complexity is $O(n^2)$ don't we?
To find that, we need an algorithm to find the LU-decomposition of the matrix, don't we? 🤔

Let $A=L\cdot U$ be the given LU-decomposition of $A$.
Let $\tilde A=\begin{pmatrix}A & u \\ u^T\end{pmatrix}$ be the matrix to decompose.
And let $\tilde L,\,\tilde U$ be its LU-decomposition.

Can we write $\tilde L,\,\tilde U$ in terms of $L$, $U$, and $u$? (Wondering)

We have that $$\tilde{L}=\begin{pmatrix}L\\ 0 & 0 & \ldots & 0& \ell\end{pmatrix}$$ with $\ell>0$, or not? :unsure:
 
mathmari said:
We have that $$\tilde{L}=\begin{pmatrix}L\\ 0 & 0 & \ldots & 0& \ell\end{pmatrix}$$ with $\ell>0$, or not?
I don't think so. (Shake)
If it were, wouldn't we get that $u_1$ is always zero? (Worried)
 
Klaas van Aarsen said:
I don't think so. (Shake)
If it were, wouldn't we get that $u_1$ is always zero? (Worried)

I got stuck right now. Why would $u_1$ have to be zero then? :unsure:
 
  • #10
mathmari said:
I got stuck right now. Why would $u_1$ have to be zero then?

What do we get if we calculate:
$$\tilde{L} \cdot \tilde U =\begin{pmatrix}
& \\ & \\ L\\ 0 & \ldots & 0& \ell
\end{pmatrix}\cdot \begin{pmatrix}
&&U& * \\ &&& \vdots \\ &&& *
\\ 0 & \ldots & 0& *
\end{pmatrix}$$
?
In particular for the entry in the bottom left corner? (Wondering)

Can it be equal to
$$\tilde A=\begin{pmatrix}A & u \\ u^T\end{pmatrix}$$
with $u_1\ne 0$? 🤔
 
  • #11
Klaas van Aarsen said:
What do we get if we calculate:
$$\tilde{L} \cdot \tilde U =\begin{pmatrix}
& \\ & \\ L\\ 0 & \ldots & 0& \ell
\end{pmatrix}\cdot \begin{pmatrix}
&&U& * \\ &&& \vdots \\ &&& *
\\ 0 & \ldots & 0& *
\end{pmatrix}$$
?
In particular for the entry in the bottom left corner? (Wondering)

Can it be equal to
$$\tilde A=\begin{pmatrix}A & u \\ u^T\end{pmatrix}$$
with $u_1\ne 0$? 🤔

We have the following, or not?

$$\tilde{L} \cdot \tilde U =\begin{pmatrix}
& & & 0\\ & & & 0\\ L & & & \vdots\\ \ell_1 & \ldots & \ell_n& \ell_{n+1}
\end{pmatrix}\cdot \begin{pmatrix}
&&U& u_1 \\ &&& \vdots \\ &&& u_n
\\ 0 & \ldots & 0& u_{n+1}
\end{pmatrix}=\begin{pmatrix}
&&A& \ell_{11}u_1 \\ &&& \vdots \\ &&& \displaystyle{\sum_{i=1}^n\ell_{ni}u_i }
\\ \ell_1u_{11} & \ldots & \displaystyle{\sum_{i=1}^n\ell_iu_{in}} & \displaystyle{\sum_{i=1}^{n+1}\ell_iu_i}
\end{pmatrix}$$
where $L=(\ell_{ij})_{i,j\in \{1, 2, \ldots , n\}}$ and $U=(u_{ij})_{i,j\in \{1, 2, \ldots , n\}}$.

:unsure:
 
  • #12
mathmari said:
We have the following, or not?

$$\tilde{L} \cdot \tilde U =\begin{pmatrix}
& & & 0\\ & & & 0\\ L & & & \vdots\\ \ell_1 & \ldots & \ell_n& \ell_{n+1}
\end{pmatrix}\cdot \begin{pmatrix}
&&U& u_1 \\ &&& \vdots \\ &&& u_n
\\ 0 & \ldots & 0& u_{n+1}
\end{pmatrix}=\begin{pmatrix}
&&A& \ell_{11}u_1 \\ &&& \vdots \\ &&& \displaystyle{\sum_{i=1}^n\ell_{ni}u_i }
\\ \ell_1u_{11} & \ldots & \displaystyle{\sum_{i=1}^n\ell_iu_{in}} & \displaystyle{\sum_{i=1}^{n+1}\ell_iu_i}
\end{pmatrix}$$
where $L=(\ell_{ij})_{i,j\in \{1, 2, \ldots , n\}}$ and $U=(u_{ij})_{i,j\in \{1, 2, \ldots , n\}}$.
Yep. (Nod)

It's a bit problematic to introduce $u_1,\ldots,u_{n+1}$ here though.
Doesn't that clash with the given vector $u$? (Worried)

Anyway, we would get that for instance $\ell_1 u_{11}$ must be equal to the first entry of the vector $u$, mustn't it?
If that first entry of $u$ is non-zero, then $\ell_1$ cannot be zero can it? 🤔
 
  • #13
Klaas van Aarsen said:
It's a bit problematic to introduce $u_1,\ldots,u_{n+1}$ here though.
Doesn't that clash with the given vector $u$? (Worried)

Ok, it is better to use other letters.
Klaas van Aarsen said:
Anyway, we would get that for instance $\ell_1 u_{11}$ must be equal to the first entry of the vector $u$, mustn't it?
If that first entry of $u$ is non-zero, then $\ell_1$ cannot be zero can it? 🤔

So we have:
$$\tilde{L} \cdot \tilde U =\begin{pmatrix}
& & & 0\\ & & & 0\\ L & & & \vdots\\ \alpha_1 & \ldots & \alpha_n& \alpha_{n+1}
\end{pmatrix}\cdot \begin{pmatrix}
&&U& \beta_1 \\ &&& \vdots \\ &&& \beta_n
\\ 0 & \ldots & 0& \beta_{n+1}
\end{pmatrix}=\begin{pmatrix}
&&A& \ell_{11}\beta_1 \\ &&& \vdots \\ &&& \displaystyle{\sum_{i=1}^n\ell_{ni}\beta_i }
\\ \alpha_1u_{11} & \ldots & \displaystyle{\sum_{i=1}^n\alpha_iu_{in}} & \displaystyle{\sum_{i=1}^{n+1}\alpha_i\beta_i}
\end{pmatrix}$$
where $L=(\ell_{ij})_{i,j\in \{1, 2, \ldots , n\}}$ and $U=(u_{ij})_{i,j\in \{1, 2, \ldots , n\}}$.

The result must be of the form $\tilde A=\begin{pmatrix}A & u \\ u^T\end{pmatrix}$, so it must hold \begin{align*}&\ell_{11}\beta_1=\alpha_1u_{11} \\ &\ldots \\ & \displaystyle{\sum_{i=1}^n\ell_{ni}\beta_i }=\displaystyle{\sum_{i=1}^n\alpha_iu_{in}} \end{align*}

What do we get from that? :unsure:
 
  • #14
mathmari said:
The result must be of the form $\tilde A=\begin{pmatrix}A & u \\ u^T\end{pmatrix}$, so it must hold \begin{align*}&\ell_{11}\beta_1=\alpha_1u_{11} \\ &\ldots\end{align*}

What do we get from that?

From the first equation we get $$\ell_{11}\beta_1=\alpha_1u_{11} = u_1 \implies \begin{cases}\alpha_1 = \frac {u_1}{u_{11}} \\ \beta_1=\frac{u_1}{\ell_{11}}\end{cases}$$ don't we? 🤔
 
  • #15
I have just noticed a vector $v$ in the problem statement:
mathmari said:
Let $A$ a $n\times n$ matrix with known LU decomposition, let $u\in \mathbb{R}^n, v\in \mathbb{R}^{n+1}$.

Show that the number of multiplications and divisions that are needed to get a LU decomposition of the $(n+1)\times (n+1)$ matrix $$\begin{pmatrix}A & u \\ u^T\end{pmatrix}$$ is at most $O(n^2)$.

Is there a typo?
Shouldn't that $v$ be used somewhere? (Worried)
 
  • #16
Klaas van Aarsen said:
I have just noticed a vector $v$ in the problem statement. Is there a typo?
Shouldn't that $v$ be used somewhere? (Worried)

Ahh yes, it is a typo. :oops: It should be:
$$\begin{pmatrix}A & u \\ v^T\end{pmatrix}$$

So from $$
\tilde{L} \cdot \tilde U =\begin{pmatrix}
& & & 0\\ & & & 0\\ L & & & \vdots\\ \alpha_1 & \ldots & \alpha_n& \alpha_{n+1}
\end{pmatrix}\cdot \begin{pmatrix}
&&U& \beta_1 \\ &&& \vdots \\ &&& \beta_n
\\ 0 & \ldots & 0& \beta_{n+1}
\end{pmatrix}=\begin{pmatrix}
&&A& \ell_{11}\beta_1 \\ &&& \vdots \\ &&& \displaystyle{\sum_{i=1}^n\ell_{ni}\beta_i }
\\ \alpha_1u_{11} & \ldots & \displaystyle{\sum_{i=1}^n\alpha_iu_{in}} & \displaystyle{\sum_{i=1}^{n+1}\alpha_i\beta_i}
\end{pmatrix}$$ what do we get? :unsure:
 
  • #17
We can deduce the values of the $\alpha_i$ and $\beta_i$ can't we?
Once we have those, we have the LU-decomposition of $\tilde A$. 🤔
 
  • #18
Klaas van Aarsen said:
We can deduce the values of the $\alpha_i$ and $\beta_i$ can't we?
Once we have those, we have the LU-decomposition of $\tilde A$. 🤔

Let $v=(v_1, \ldots , v_{n+1})$ and $u=(x_1, \ldots, x_n)$.

We have that \begin{align*}&\alpha_1u_{11}=v_1 \Rightarrow \alpha_1=\frac{v_1}{u_{11}} \\ &\ldots \\ &\sum_{i=1}^n\alpha_iu_{in}=v_n \Rightarrow \alpha_n=\frac{v_n-\sum_{i=1}^{n-1}\alpha_iu_{in}}{u_{nn}}\\ &\sum_{i=1}^{n+1}\alpha_i\beta_i=v_{n+1} \Rightarrow \alpha_{n+1}\beta_{n+1}=v_{n+1}-\sum_{i=1}^{n}\alpha_i\beta_i\\ &\ell_{11}\beta_1=x_1 \Rightarrow \beta_1=\frac{x_1}{\ell_{11}} \\ &\ldots \\ &\sum_{i=1}^n\ell_{ni}\beta_i=x_n \Rightarrow \beta_n=\frac{x_n-\sum_{i=1}^{n-1}\beta_i\ell_{ni}}{\ell_{nn}}\end{align*}

Is everything correct? :unsure:
 
Last edited by a moderator:
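As an aside, a minimal code sketch of these recurrences might look as follows (assuming $0$-based indexing, nonzero pivots $u_{kk}$ and $\ell_{kk}$, no pivoting, and the choice $\alpha_{n+1}=1$ so that only $\beta_{n+1}$ remains to be computed; that choice, the function name `bordered_lu_update`, and the operation counters are illustrative and not from the thread):

```python
def bordered_lu_update(L, U, u, v):
    """Sketch: given A = L*U (n x n), a border column u (length n) and a
    border row v (length n+1), compute the last row alpha of the new L and
    the last column beta of the new U for the bordered matrix [[A, u], [v^T]],
    counting multiplications and divisions along the way."""
    n = len(U)
    mults = divs = 0

    # alpha_k from  sum_{i<=k} alpha_i * U[i][k] = v_k   (k-th column of U)
    alpha = [0.0] * (n + 1)
    for k in range(n):
        s = sum(alpha[i] * U[i][k] for i in range(k))   # k multiplications
        mults += k
        alpha[k] = (v[k] - s) / U[k][k]                 # 1 division
        divs += 1
    alpha[n] = 1.0  # unit diagonal entry for the new row of L (a choice)

    # beta_k from  sum_{i<=k} L[k][i] * beta_i = u_k   (k-th row of L)
    beta = [0.0] * (n + 1)
    for k in range(n):
        s = sum(L[k][i] * beta[i] for i in range(k))    # k multiplications
        mults += k
        beta[k] = (u[k] - s) / L[k][k]                  # 1 division
        divs += 1

    # corner: alpha_{n+1} * beta_{n+1} = v_{n+1} - sum_{i<=n} alpha_i * beta_i
    beta[n] = v[n] - sum(alpha[i] * beta[i] for i in range(n))  # n multiplications
    mults += n

    return alpha, beta, mults, divs
```

With this choice the counters come out to $n^2$ multiplications and $2n$ divisions, which is $O(n^2)$.
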
  • #19
I see an $u_n$ that I think should be a $v_n$.
And I also see an $u_{n+1}$ that should be a $v_{n+1}$. (Worried)

Other than that it looks good to me! (Nod)
 
  • #20
Klaas van Aarsen said:
I see an $u_n$ that I think should be a $v_n$.
And I also see an $u_{n+1}$ that should be a $v_{n+1}$. (Worried)

Other than that it looks good to me! (Nod)

I changed it!

So in total we do $\displaystyle{2\cdot \sum_{i=1}^ni=2\cdot \frac{n(n+1)}{2}=n(n+1)}$ multiplications and $\displaystyle{2n}$ divisions, or not? :unsure:
 
  • #21
mathmari said:
I changed it!

So in total we do $\displaystyle{2\cdot \sum_{i=1}^ni=2\cdot \frac{n(n+1)}{2}=n(n+1)}$ multiplications and $\displaystyle{2n}$ divisions, or not?
Yep. (Nod)
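As a quick check of that tally (counting the recurrences from post #18 term by term): each $\alpha_k$ and each $\beta_k$ needs at most $k-1$ multiplications and one division, and the corner entry needs $n$ more multiplications, giving at most

$$2\sum_{k=1}^{n}(k-1)+n=n(n-1)+n=n^2\le n(n+1)$$

multiplications and $2n$ divisions, consistent with the count above and indeed $O(n^2)$.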
 
  • #22
Klaas van Aarsen said:
Yep. (Nod)

Great! Does writing down the matrices and the formulas we get count as a formal proof, or do we have to do something more? :unsure:
 
  • #23
mathmari said:
Great! Does writing down the matrices and the formulas we get count as a formal proof, or do we have to do something more?
We have identified an algorithm of order $O(n^2)$ that solves the problem.
So that qualifies as a proof that the number of multiplications and divisions needed for the desired LU-decomposition is at most $O(n^2)$, doesn't it? 🤔
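
For completeness, a small numerical check of this $O(n^2)$ update is sketched below (assuming NumPy and that the illustrative `bordered_lu_update` sketch given after post #18 is in scope; the factors $L$ and $U$ are built directly as unit lower triangular and upper triangular with nonzero diagonal, so no pivoting questions arise):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build a known LU pair directly, then form A, a border column u and a border row v.
L = np.tril(rng.standard_normal((n, n)), -1) + np.eye(n)        # unit lower triangular
U = np.triu(rng.standard_normal((n, n)), 1) + 2.0 * np.eye(n)   # nonzero diagonal
A = L @ U
u = rng.standard_normal(n)
v = rng.standard_normal(n + 1)

alpha, beta, mults, divs = bordered_lu_update(L, U, u, v)

# Assemble the bordered factors and compare with the bordered matrix.
L_t = np.zeros((n + 1, n + 1)); L_t[:n, :n] = L; L_t[n, :] = alpha
U_t = np.zeros((n + 1, n + 1)); U_t[:n, :n] = U; U_t[:, n] = beta
A_t = np.zeros((n + 1, n + 1)); A_t[:n, :n] = A; A_t[:n, n] = u; A_t[n, :] = v

print(np.allclose(L_t @ U_t, A_t))   # expected: True
print(mults, divs)                   # expected: n*n = 25 multiplications, 2*n = 10 divisions
```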
 
