How can we construct the matrix S?

  • Context: MHB
  • Thread starter: mathmari
  • Tags: Matrix

Discussion Overview

The discussion revolves around the construction of a matrix \( S \) for block deflation in the context of eigenvalue problems. Participants explore the relationships between eigenvalues, eigenvectors, and the structure of the matrix \( S \) in relation to a matrix \( A \). The scope includes theoretical aspects of linear algebra and eigenvalue decomposition.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants propose using linearly independent vectors \( v_1, \ldots, v_m \) to construct the matrix \( S \) for block deflation.
  • Others suggest that the matrix \( S \) could be the identity matrix with modifications based on eigenvectors.
  • One participant discusses the case of two eigenvalues and their corresponding eigenvectors, questioning how to express the action of \( A \) on these vectors in matrix form.
  • Another participant raises the idea of extending the basis to include a third vector \( \mathbf{w} \) and how this affects the structure of the resulting matrix.
  • There are inquiries about the role of the matrix \( C \) in representing linear combinations of eigenvectors and how it fits into the block structure of the matrix \( S \).
  • Some participants clarify that the matrix \( S^{-1} \) should consist of the eigenvectors as columns, with additional unspecified vectors completing the basis.

Areas of Agreement / Disagreement

Participants express various viewpoints on the construction of the matrix \( S \) and its components, indicating that multiple competing views remain. There is no consensus on a single method or structure for \( S \).

Contextual Notes

Participants discuss the implications of using different eigenvectors and how they relate to the eigenvalues of the matrix \( A \). The discussion includes assumptions about the linear independence of vectors and the invertibility of the matrix formed by these vectors.

Who May Find This Useful

This discussion may be useful for students and practitioners in linear algebra, particularly those interested in eigenvalue problems and matrix decompositions.

mathmari
Hey! :o

In block deflation, the following holds for a non-singular matrix $S$: \begin{equation*}SAS^{-1}=\begin{pmatrix}C & D \\ O & B\end{pmatrix}\end{equation*} where $O$ is the zero matrix.

It holds that $\sigma (A)=\sigma(B)\cup \sigma (C)$, where $\sigma (M)$ is the set of all eigenvalues of a matrix $M$.

Let $v_1, \ldots , v_m$ be linearly independent vectors such that $Av_j\in \text{span}\{v_1, \ldots , v_m\}, j=1,\ldots , m$.

I want to use these vectors to construct a matrix $S$ with which we can apply an $m$-column block deflation of $A$.



Could you give me a hint as to how we could construct the matrix $S$? (Wondering)
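
For a concrete feel for the spectrum property (illustrative numbers, not from the thread): if
$$SAS^{-1}=\begin{pmatrix}2 & 1 & 5\\ 0 & 3 & 1\\ 0 & 0 & 4\end{pmatrix},\qquad C=\begin{pmatrix}2 & 1\\ 0 & 3\end{pmatrix},\quad B=\begin{pmatrix}4\end{pmatrix},$$
then $\sigma(C)=\{2,3\}$ and $\sigma(B)=\{4\}$, and since similar matrices have the same eigenvalues, $\sigma(A)=\{2,3,4\}=\sigma(B)\cup\sigma(C)$.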
 
Hey mathmari! (Smile)

Can't we apply the algorithm you've outlined at http://mathhelpboards.com/advanced-applied-mathematics-16/approximation-eigenvalue-power-method-23336-post104379.html#post104379?
It even says 'Zur gleichzeitigen Abspaltung mehrere Eigenwerte benutzen wir die Block-Deflation' ('To split off several eigenvalues simultaneously, we use block deflation').
 
I like Serena said:
Can't we apply the algorithm you've outlined at http://mathhelpboards.com/advanced-applied-mathematics-16/approximation-eigenvalue-power-method-23336-post104379.html#post104379?
It even says 'Zur gleichzeitigen Abspaltung mehrere Eigenwerte benutzen wir die Block-Deflation' ('To split off several eigenvalues simultaneously, we use block deflation').

So the matrix $S$ must be like the identity matrix, except that in the first column there is the eigenvector with each element divided by the first element of that vector, and with a minus sign. Or not? (Wondering)
 
mathmari said:
So the matrix $S$ must be like the identity matrix, except that in the first column there is the eigenvector with each element divided by the first element of that vector, and with a minus sign. Or not? (Wondering)

Actually, I had a different idea.

Suppose $m=2$ and we have 2 eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\mathbf v_1,\mathbf v_2$.
Then:
$$
A (\mathbf v_1\ \mathbf v_2) = (\lambda_1\mathbf v_1\ \lambda_2\mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}\lambda_1&0\\0&\lambda_2 \end{pmatrix}
$$
where for instance $ (\mathbf v_1\ \mathbf v_2)$ is the matrix formed with the 2 vectors as columns. Yes? (Wondering)

Now suppose $m=2$ and $A\mathbf v_j \in \text{span}\{\mathbf v_1,\mathbf v_2\}$ for $j=1,2$.
Then we can write it similarly as:
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)C
$$
can't we? (Wondering)

Now suppose $A$ is a 3x3 matrix. Then we can find a vector $\mathbf w$ such that $\{\mathbf v_1,\mathbf v_2,\mathbf w\}$ is a basis, can't we?
Then we can write:
$$
A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = (\mathbf v_1\ \mathbf v_2\ \mathbf w)\begin{pmatrix}C&D\\0&B\end{pmatrix}
$$
can't we? (Wondering)

Then:
$$(\mathbf v_1\ \mathbf v_2\ \mathbf w)^{-1} A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = \begin{pmatrix}C&D\\0&B\end{pmatrix}$$
(Thinking)
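
A minimal numerical sketch of this construction, assuming numpy (the vectors and the block matrix are illustrative choices, not from the thread):

```python
import numpy as np

# Columns v1, v2 span an A-invariant subspace; w completes the basis of R^3.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
w  = np.array([0.0, 0.0, 1.0])
T = np.column_stack([v1, v2, w])   # (v1 v2 w); this plays the role of S^{-1}

# Build A backwards from the block-triangular form [[C, D], [0, B]],
# so that A v1 and A v2 stay inside span{v1, v2}.
M = np.array([[2.0, 1.0, 0.5],
              [1.0, 3.0, 0.5],
              [0.0, 0.0, 4.0]])
A = T @ M @ np.linalg.inv(T)

# The deflation: T^{-1} A T reproduces the block-triangular form,
# with a zero block below C.
blocked = np.linalg.inv(T) @ A @ T
print(np.round(blocked, 10))       # lower-left 1x2 block is (numerically) zero
```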
 
I like Serena said:
Suppose $m=2$ and we have 2 eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\mathbf v_1,\mathbf v_2$.
Then:
$$
A (\mathbf v_1\ \mathbf v_2) = (\lambda_1\mathbf v_1\ \lambda_2\mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}\lambda_1&0\\0&\lambda_2 \end{pmatrix}
$$
where for instance $ (\mathbf v_1\ \mathbf v_2)$ is the matrix formed with the 2 vectors as columns. Yes? (Wondering)

Do we consider in each case the vectors that are eigenvectors for the respective eigenvalues?
I like Serena said:
Now suppose $m=2$ and $A\mathbf v_j \in \text{span}\{\mathbf v_1,\mathbf v_2\}$ for $j=1,2$.
Then we can write it similarly as:
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)C
$$
can't we? (Wondering)

Why do we use the matrix $C$ here? (Wondering)
I like Serena said:
Now suppose $A$ is a 3x3 matrix. Then we can find a vector $\mathbf w$ such that $\{\mathbf v_1,\mathbf v_2,\mathbf w\}$ is a basis, can't we?
Then we can write:
$$
A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = (\mathbf v_1\ \mathbf v_2\ \mathbf w)\begin{pmatrix}C&D\\0&B\end{pmatrix}
$$
can't we? (Wondering)

Then:
$$(\mathbf v_1\ \mathbf v_2\ \mathbf w)^{-1} A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = \begin{pmatrix}C&D\\0&B\end{pmatrix}$$
(Thinking)

Could you explain to me also this case further? (Wondering)
 
mathmari said:
Do we consider in each case the vectors that are eigenvectors for the respective eigenvalues?

Which cases do you mean? (Wondering)

Basically we're combining $A\mathbf v_1 = \lambda_1 \mathbf v_1$ and $A\mathbf v_2 = \lambda_2 \mathbf v_2$ into one formula by using matrices.
Btw, this is also how to prove that a diagonalizable matrix can be written as $V\Lambda V^{-1}$, where $\Lambda$ is a diagonal matrix with eigenvalues, and $V$ is the matrix of corresponding eigenvectors. (Nerd)
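
As a quick illustrative check of that factorization, assuming numpy (the matrix is an example of my own choosing, not from the thread):

```python
import numpy as np

# Verify A = V @ Lambda @ V^{-1} for a small diagonalizable matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigvals, V = np.linalg.eig(A)      # eigenvalues and eigenvector columns
Lam = np.diag(eigvals)
print(np.allclose(A, V @ Lam @ np.linalg.inv(V)))   # True
```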
mathmari said:
Why do we use the matrix $C$ here?

In this case the image of each vector $\mathbf v_i$ is a linear combination of $\mathbf v_1$ and $\mathbf v_2$.
The matrix $C$ is a 2x2 matrix that denotes these linear combinations. (Thinking)
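
For instance (illustrative numbers, not from the thread): if $A\mathbf v_1 = 2\mathbf v_1 + \mathbf v_2$ and $A\mathbf v_2 = \mathbf v_1 + 3\mathbf v_2$, then
$$A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}2 & 1\\ 1 & 3\end{pmatrix},$$
so $C$ records the coefficients of each image, column by column.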

mathmari said:
Could you explain to me also this case further? (Wondering)

We're adding a vector $\mathbf w$.
Its image will be a linear combination of all vectors $\mathbf v_1, \mathbf v_2, \mathbf w$.
The submatrices $D$ and $B$ encode that this can be any such combination.
However, the images of $\mathbf v_1$ and $\mathbf v_2$ can still only be linear combinations of those same vectors.
Hence the block $C$ with a zero submatrix below it.

And since $\{\mathbf v_1, \mathbf v_2, \mathbf w\}$ is a basis, the corresponding matrix is invertible.
When we multiply on the left by its inverse, we get the desired form. (Thinking)
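
A quick numerical check of $\sigma(A)=\sigma(B)\cup\sigma(C)$ for such a block-triangular form, assuming numpy (the block entries are illustrative):

```python
import numpy as np

# Assemble a block-triangular matrix [[C, D], [0, B]] from sample blocks.
C = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[4.0]])
D = np.array([[0.5],
              [0.5]])
A_blocked = np.block([[C, D],
                      [np.zeros((1, 2)), B]])

# The eigenvalues of the whole matrix are those of C together with those of B.
print(np.sort(np.linalg.eigvals(A_blocked)))
print(np.sort(np.concatenate([np.linalg.eigvals(C), np.linalg.eigvals(B)])))
```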
 
So, do we take as the columns of $S$ the vectors $v_1, \ldots , v_n$, with the matrices $B, D, C$ denoting the respective linear combinations of the vectors?

Or have I not understood the idea correctly? (Wondering)
 
mathmari said:
So, do we take as the columns of $S$ the vectors $v_1, \ldots , v_n$, with the matrices $B, D, C$ denoting the respective linear combinations of the vectors?

Or have I not understood the idea correctly?

More accurately, we take $S^{-1}$ to be the matrix with $v_1, \ldots , v_n$ as columns.
And assuming $A$ is an nxn matrix, the as yet unspecified $v_{m+1},\ldots,v_n$ complete the basis. (Nerd)

If that's what you intended, then yes, I think you understood the idea correctly. (Happy)
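
A sketch of how one might complete the basis in code, assuming numpy (the helper complete_basis and the sample vectors are hypothetical, not from the thread): put $v_1,\ldots,v_m$ in the first columns, then greedily append standard basis vectors that keep the columns linearly independent.

```python
import numpy as np

def complete_basis(V):
    """Extend the linearly independent columns of V to a basis of R^n
    by appending standard basis vectors that increase the rank."""
    n, m = V.shape
    cols = [V[:, j] for j in range(m)]
    for e in np.eye(n):
        if len(cols) == n:
            break
        trial = np.column_stack(cols + [e])
        if np.linalg.matrix_rank(trial) == len(cols) + 1:
            cols.append(e)
    return np.column_stack(cols)

# Illustrative v1, v2 as columns (m = 2, n = 3).
V = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
S_inv = complete_basis(V)      # S^{-1}: v1, v2, then a completing vector
S = np.linalg.inv(S_inv)
# For an A with span{v1, v2} invariant, S A S^{-1} then has the block
# form with the zero block below C.
```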
 
I like Serena said:
More accurately, we take $S^{-1}$ to be the matrix with $v_1, \ldots , v_n$ as columns.
And assuming $A$ is an nxn matrix, the as yet unspecified $v_{m+1},\ldots,v_n$ complete the basis. (Nerd)

If that's what you intended, then yes, I think you understood the idea correctly. (Happy)

Ok! Thank you so much! (Mmm)
 
