How can we construct the matrix $S$?

mathmari
Hey! :o

In block deflation, it holds for a non-singular matrix $S$ that \begin{equation*}SAS^{-1}=\begin{pmatrix}C & D \\ O & B\end{pmatrix}\end{equation*} where $O$ is the zero matrix.

It holds that $\sigma (A)=\sigma(B)\cup \sigma (C)$, where $\sigma (M)$ is the set of all eigenvalues of a matrix $M$.
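(This is because similar matrices have the same eigenvalues, and the characteristic polynomial of a block upper triangular matrix factors:
$$\det(SAS^{-1}-\lambda I)=\det(C-\lambda I)\,\det(B-\lambda I).)$$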

Let $v_1, \ldots , v_m$ be linearly independent vectors such that $Av_j\in \text{span}\{v_1, \ldots , v_m\}, j=1,\ldots , m$.

I want to use these vectors to construct a matrix $S$ with which we can apply an $m$-column block deflation of $A$.



Could you give me a hint on how to construct the matrix $S$? (Wondering)
 
Hey mathmari! (Smile)

Can't we apply the algorithm you've outlined at http://mathhelpboards.com/advanced-applied-mathematics-16/approximation-eigenvalue-power-method-23336-post104379.html#post104379?
It even says 'Zur gleichzeitigen Abspaltung mehrere Eigenwerte benutzen wir die Block-Deflation' ('To split off several eigenvalues simultaneously, we use block deflation').
 
I like Serena said:
Can't we apply the algorithm you've outlined at http://mathhelpboards.com/advanced-applied-mathematics-16/approximation-eigenvalue-power-method-23336-post104379.html#post104379?
It even says 'Zur gleichzeitigen Abspaltung mehrere Eigenwerte benutzen wir die Block-Deflation' ('To split off several eigenvalues simultaneously, we use block deflation').

So, the matrix $S$ must be like the identity matrix, except that in the first column there must be the eigenvector with each element divided by the first element of that vector, and with a minus sign. Or not? (Wondering)
 
mathmari said:
So, the matrix $S$ must be like the identity matrix, except that in the first column there must be the eigenvector with each element divided by the first element of that vector, and with a minus sign. Or not? (Wondering)

Actually, I had a different idea.

Suppose $m=2$ and we have 2 eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\mathbf v_1,\mathbf v_2$.
Then:
$$
A (\mathbf v_1\ \mathbf v_2) = (\lambda_1\mathbf v_1\ \lambda_2\mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}\lambda_1&0\\0&\lambda_2 \end{pmatrix}
$$
where $(\mathbf v_1\ \mathbf v_2)$ denotes the matrix formed with the two vectors as columns. Yes? (Wondering)

Now suppose $m=2$ and $Av_j \in \text{span}\{\mathbf v_1,\mathbf v_2\}$ for $j=1,2$.
Then we can write it similarly as:
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)C
$$
can't we? (Wondering)

Now suppose $A$ is a $3\times 3$ matrix. Then we can find a vector $\mathbf w$ such that $\{\mathbf v_1,\mathbf v_2,\mathbf w\}$ is a basis, can't we?
Then we can write:
$$
A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = (\mathbf v_1\ \mathbf v_2\ \mathbf w)\begin{pmatrix}C&D\\0&B\end{pmatrix}
$$
can't we? (Wondering)

Then:
$$(\mathbf v_1\ \mathbf v_2\ \mathbf w)^{-1} A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = \begin{pmatrix}C&D\\0&B\end{pmatrix}$$
(Thinking)
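Here's a quick numerical sanity check of that construction (a minimal sketch with numpy; the blocks and the basis matrix below are arbitrary made-up choices, not from the thread):

Code:
import numpy as np

rng = np.random.default_rng(0)

# Build a 3x3 matrix A that is known to have a 2-dimensional invariant
# subspace: pick the blocks and an invertible T = (v1 v2 w), then set
# A = T M T^{-1}.
C = rng.standard_normal((2, 2))
D = rng.standard_normal((2, 1))
B = rng.standard_normal((1, 1))
M = np.block([[C, D],
              [np.zeros((1, 2)), B]])
T = rng.standard_normal((3, 3))          # columns play the role of v1, v2, w
A = T @ M @ np.linalg.inv(T)

# T^{-1} A T recovers the block form: the lower-left block is (numerically) 0.
print(np.round(np.linalg.inv(T) @ A @ T, 10))

# And sigma(A) = sigma(C) ∪ sigma(B):
print(np.sort_complex(np.linalg.eigvals(A)))
print(np.sort_complex(np.concatenate([np.linalg.eigvals(C),
                                      np.linalg.eigvals(B)])))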
 
I like Serena said:
Suppose $m=2$ and we have 2 eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\mathbf v_1,\mathbf v_2$.
Then:
$$
A (\mathbf v_1\ \mathbf v_2) = (\lambda_1\mathbf v_1\ \lambda_2\mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}\lambda_1&0\\0&\lambda_2 \end{pmatrix}
$$
where $(\mathbf v_1\ \mathbf v_2)$ denotes the matrix formed with the two vectors as columns. Yes? (Wondering)

Do we consider in each case the vectors that are eigenvectors for the eigenvalues in that case?
I like Serena said:
Now suppose $m=2$ and $Av_j \in \text{span}\{\mathbf v_1,\mathbf v_2\}$ for $j=1,2$.
Then we can write it similarly as:
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)C
$$
can't we? (Wondering)

Why do we use the matrix $C$ here? (Wondering)
I like Serena said:
Now suppose $A$ is a $3\times 3$ matrix. Then we can find a vector $\mathbf w$ such that $\{\mathbf v_1,\mathbf v_2,\mathbf w\}$ is a basis, can't we?
Then we can write:
$$
A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = (\mathbf v_1\ \mathbf v_2\ \mathbf w)\begin{pmatrix}C&D\\0&B\end{pmatrix}
$$
can't we? (Wondering)

Then:
$$(\mathbf v_1\ \mathbf v_2\ \mathbf w)^{-1} A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = \begin{pmatrix}C&D\\0&B\end{pmatrix}$$
(Thinking)

Could you also explain this case to me further? (Wondering)
 
mathmari said:
Do we consider in each case the vectors that are eigenvectors for the eigenvalues in that case?

Which cases do you mean? (Wondering)

Basically we're combining $A\mathbf v_1 = \lambda_1 \mathbf v_1$ and $A\mathbf v_2 = \lambda_2 \mathbf v_2$ into one formula by using matrices.
Btw, this is also how to prove that a diagonalizable matrix can be written as $V\Lambda V^{-1}$, where $\Lambda$ is a diagonal matrix with eigenvalues, and $V$ is the matrix of corresponding eigenvectors. (Nerd)
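For instance, a minimal numpy check with an arbitrary (made-up) diagonalizable matrix:

Code:
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # symmetric, hence diagonalizable
lam, V = np.linalg.eig(A)              # eigenvalues and eigenvectors (as columns)
Lambda = np.diag(lam)
print(V @ Lambda @ np.linalg.inv(V))   # reconstructs A = V Λ V^{-1}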
mathmari said:
Why do we use the matrix $C$ here?

In this case the image of each vector $\mathbf v_i$ is a linear combination of $\mathbf v_1$ and $\mathbf v_2$.
The matrix $C$ is the $2\times 2$ matrix whose $j$th column holds the coefficients of that linear combination for $A\mathbf v_j$. (Thinking)
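For instance, with made-up coefficients: if $A\mathbf v_1 = 2\mathbf v_1 + \mathbf v_2$ and $A\mathbf v_2 = \mathbf v_1 - \mathbf v_2$, then
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}2&1\\1&-1\end{pmatrix}.
$$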

mathmari said:
Could you also explain this case to me further? (Wondering)

We're adding a vector $\mathbf w$.
Its image will be a linear combination of all vectors $\mathbf v_1, \mathbf v_2, \mathbf w$.
The submatrices $D$ and $B$ hold the coefficients of that combination; they can be anything.
However, the images of $\mathbf v_1$ and $\mathbf v_2$ can still only be linear combinations of those same vectors.
Hence the block $C$ with a zero submatrix below it.

And since $\{\mathbf v_1, \mathbf v_2, \mathbf w\}$ is a basis, the corresponding matrix is invertible.
When we multiply on the left by its inverse, we get the desired form. (Thinking)
 
So, do we take as the columns of $S$ the vectors $v_1, \ldots , v_n$, and do the matrices $B, D, C$ denote the respective linear combinations of the vectors?

Or haven't I understood the idea correctly? (Wondering)
 
mathmari said:
So, do we take as the columns of $S$ the vectors $v_1, \ldots , v_n$, and do the matrices $B, D, C$ denote the respective linear combinations of the vectors?

Or haven't I understood the idea correctly?

More accurately, we take $S^{-1}$ to be the matrix with $v_1, \ldots , v_n$ as columns.
And assuming $A$ is an $n\times n$ matrix, the as yet unspecified $v_{m+1},\ldots,v_n$ complete the basis. (Nerd)

If that's what you intended, then yes, I think you understood the idea correctly. (Happy)
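In code, the construction could look like this (a sketch with numpy; build_S_inverse is a hypothetical helper name, and completing the basis with standard basis vectors is just one convenient choice):

Code:
import numpy as np

def build_S_inverse(V):
    # V is the n x m matrix with the linearly independent vectors
    # v_1, ..., v_m as columns.  Complete them to a basis of R^n by
    # adjoining standard basis vectors that enlarge the span, and
    # return S^{-1} = (v_1 ... v_n).
    n, m = V.shape
    S_inv = V
    for e in np.eye(n):                    # candidate vectors e_1, ..., e_n
        if S_inv.shape[1] == n:
            break
        trial = np.column_stack([S_inv, e])
        if np.linalg.matrix_rank(trial) == S_inv.shape[1] + 1:
            S_inv = trial                  # e enlarges the span, keep it
    return S_inv

# Then S = np.linalg.inv(S_inv), and S @ A @ S_inv has the block form
# (C D; O B) whenever span{v_1, ..., v_m} is invariant under A.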
 
I like Serena said:
More accurately, we take $S^{-1}$ to be the matrix with $v_1, \ldots , v_n$ as columns.
And assuming $A$ is an $n\times n$ matrix, the as yet unspecified $v_{m+1},\ldots,v_n$ complete the basis. (Nerd)

If that's what you intended, then yes, I think you understood the idea correctly. (Happy)

Ok! Thank you so much! (Mmm)
 
