How can we construct the matrix S?

In summary, the conversation discusses using a non-singular matrix $S$ for a block deflation of a matrix $A$, where $SAS^{-1}$ has a specific block structure. It is mentioned that the eigenvalues of $A$ are obtained by combining the eigenvalues of the blocks $B$ and $C$ in that structure. The conversation also explores the construction of $S$ from linearly independent vectors and discusses the roles of the matrices $C$, $D$, and $B$ in the block structure. Finally, it is clarified that $S^{-1}$ is the matrix with $v_1, \ldots , v_n$ as columns, and that any additional vectors needed to complete the basis can be chosen freely.
  • #1
mathmari
Hey! :eek:

In block deflation, for a non-singular matrix $S$ it holds that \begin{equation*}SAS^{-1}=\begin{pmatrix}C & D \\ O & B\end{pmatrix}\end{equation*} where $O$ is the zero matrix.

It holds that $\sigma (A)=\sigma(B)\cup \sigma (C)$, where $\sigma (M)$ denotes the set of all eigenvalues of a matrix $M$.
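As a numerical sanity check of the relation $\sigma (A)=\sigma(B)\cup \sigma (C)$, here is a small Python/NumPy sketch; the specific blocks $B$, $C$, $D$ are made-up numbers just for illustration:

```python
import numpy as np

# Made-up blocks for illustration: C has eigenvalues 2, 3 and B has 4, 5.
C = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[5.0, 0.0],
              [1.0, 4.0]])
D = np.array([[1.0, 2.0],
              [3.0, 4.0]])          # the coupling block D can be anything
O = np.zeros((2, 2))

# Assemble the block upper-triangular form that S A S^{-1} takes.
T = np.block([[C, D],
              [O, B]])

# The spectrum of T is the union of the spectra of C and B.
eigs = np.sort(np.linalg.eigvals(T).real)
print(eigs)                         # [2. 3. 4. 5.]
```

Note that the off-diagonal block $D$ has no influence on the spectrum, which is why block deflation splits one eigenvalue problem into the two smaller ones for $C$ and $B$.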

Let $v_1, \ldots , v_m$ be linearly independent vectors such that $Av_j\in \text{span}\{v_1, \ldots , v_m\}, j=1,\ldots , m$.

I want to use these vectors to construct a matrix $S$ with which we can apply an $m$-column block deflation of $A$.



Could you give me a hint how we could construct the matrix $S$ ? (Wondering)
 
  • #2
Hey mathmari! (Smile)

Can't we apply the algorithm you've outlined http://mathhelpboards.com/advanced-applied-mathematics-16/approximation-eigenvalue-power-method-23336-post104379.html#post104379?
It even says 'Zur gleichzeitigen Abspaltung mehrere Eigenwerte benutzen wir die Block-Deflation.' ('To split off several eigenvalues simultaneously, we use block deflation.')
 
  • #3
I like Serena said:
Can't we apply the algorithm you've outlined http://mathhelpboards.com/advanced-applied-mathematics-16/approximation-eigenvalue-power-method-23336-post104379.html#post104379?
It even says 'Zur gleichzeitigen Abspaltung mehrere Eigenwerte benutzen wir die Block-Deflation.' ('To split off several eigenvalues simultaneously, we use block deflation.')

So, the matrix $S$ must look like the identity matrix, except that in the first column there must be the eigenvector with each element divided by the first element of that vector, and with a minus sign. Or not? (Wondering)
 
  • #4
mathmari said:
So, the matrix $S$ must look like the identity matrix, except that in the first column there must be the eigenvector with each element divided by the first element of that vector, and with a minus sign. Or not? (Wondering)

Actually, I had a different idea.

Suppose $m=2$ and we have 2 eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\mathbf v_1,\mathbf v_2$.
Then:
$$
A (\mathbf v_1\ \mathbf v_2) = (\lambda_1\mathbf v_1\ \lambda_2\mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}\lambda_1&0\\0&\lambda_2 \end{pmatrix}
$$
where $(\mathbf v_1\ \mathbf v_2)$ denotes the matrix formed with the 2 vectors as columns. Yes? (Wondering)

Now suppose $m=2$ and $Av_j \in \text{Span}\{\mathbf v_1,\mathbf v_2\}$ for $j=1,2$.
Then we can write it similarly as:
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)C
$$
can't we? (Wondering)

Now suppose $A$ is a $3\times 3$ matrix. Then we can find a vector $\mathbf w$ such that $\{\mathbf v_1,\mathbf v_2,\mathbf w\}$ is a basis, can't we?
Then we can write:
$$
A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = (\mathbf v_1\ \mathbf v_2\ \mathbf w)\begin{pmatrix}C&D\\0&B\end{pmatrix}
$$
can't we? (Wondering)

Then:
$$(\mathbf v_1\ \mathbf v_2\ \mathbf w)^{-1} A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = \begin{pmatrix}C&D\\0&B\end{pmatrix}$$
(Thinking)
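The $3\times 3$ construction above can be checked numerically. A minimal sketch, where the vectors $\mathbf v_1, \mathbf v_2, \mathbf w$ and the matrix $A$ are made-up assumptions chosen so that $\mathbf v_1, \mathbf v_2$ are eigenvectors of $A$:

```python
import numpy as np

# Assumed basis vectors: v1, v2 span an A-invariant subspace, w completes the basis.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
w  = np.array([0.0, 0.0, 1.0])

# Build a concrete A with A v1 = 2 v1 and A v2 = 3 v2, via A = V T V^{-1}.
V = np.column_stack([v1, v2, w])
T = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])
A = V @ T @ np.linalg.inv(V)

Sinv = np.column_stack([v1, v2, w])   # S^{-1} = (v1 v2 w)
M = np.linalg.inv(Sinv) @ A @ Sinv    # = S A S^{-1}

print(np.round(M, 10))
# Block upper triangular: C = diag(2, 3) on top, B = (4) at bottom right,
# and the 1x2 block below C is (numerically) zero.
```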
 
  • #5
I like Serena said:
Suppose $m=2$ and we have 2 eigenvalues $\lambda_1$ and $\lambda_2$ with eigenvectors $\mathbf v_1,\mathbf v_2$.
Then:
$$
A (\mathbf v_1\ \mathbf v_2) = (\lambda_1\mathbf v_1\ \lambda_2\mathbf v_2) = (\mathbf v_1\ \mathbf v_2)\begin{pmatrix}\lambda_1&0\\0&\lambda_2 \end{pmatrix}
$$
where $(\mathbf v_1\ \mathbf v_2)$ denotes the matrix formed with the 2 vectors as columns. Yes? (Wondering)

In each case, do we consider vectors that are eigenvectors for the respective eigenvalues?
I like Serena said:
Now suppose $m=2$ and $Av_j \in \text{Span}\{\mathbf v_1,\mathbf v_2\}$ for $j=1,2$.
Then we can write it similarly as:
$$
A (\mathbf v_1\ \mathbf v_2) = (\mathbf v_1\ \mathbf v_2)C
$$
can't we? (Wondering)

Why do we use here the matrix $C$ ? (Wondering)
I like Serena said:
Now suppose $A$ is a $3\times 3$ matrix. Then we can find a vector $\mathbf w$ such that $\{\mathbf v_1,\mathbf v_2,\mathbf w\}$ is a basis, can't we?
Then we can write:
$$
A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = (\mathbf v_1\ \mathbf v_2\ \mathbf w)\begin{pmatrix}C&D\\0&B\end{pmatrix}
$$
can't we? (Wondering)

Then:
$$(\mathbf v_1\ \mathbf v_2\ \mathbf w)^{-1} A (\mathbf v_1\ \mathbf v_2\ \mathbf w) = \begin{pmatrix}C&D\\0&B\end{pmatrix}$$
(Thinking)

Could you explain to me also this case further? (Wondering)
 
  • #6
mathmari said:
In each case, do we consider vectors that are eigenvectors for the respective eigenvalues?

Which cases do you mean? (Wondering)

Basically we're combining $A\mathbf v_1 = \lambda_1 \mathbf v_1$ and $A\mathbf v_2 = \lambda_2 \mathbf v_2$ into one formula by using matrices.
Btw, this is also how one proves that a diagonalizable matrix can be written as $V\Lambda V^{-1}$, where $\Lambda$ is the diagonal matrix of eigenvalues and $V$ is the matrix of corresponding eigenvectors. (Nerd)
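A quick NumPy check of that $V\Lambda V^{-1}$ factorization; the concrete $2\times 2$ matrix is an arbitrary assumption:

```python
import numpy as np

# Assumed example matrix with eigenvalues 2 and 5 (distinct, so diagonalizable).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, V = np.linalg.eig(A)   # eigenvalues and eigenvectors (columns of V)
Lam = np.diag(lam)

# A is recovered as V Lambda V^{-1}.
print(np.allclose(A, V @ Lam @ np.linalg.inv(V)))   # True
```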
mathmari said:
Why do we use here the matrix $C$ ?

In this case the image of each vector $\mathbf v_i$ is a linear combination of $\mathbf v_1$ and $\mathbf v_2$.
The matrix $C$ is a $2\times 2$ matrix that encodes these linear combinations. (Thinking)

mathmari said:
Could you explain to me also this case further? (Wondering)

We're adding a vector $\mathbf w$.
Its image will be a linear combination of all vectors $\mathbf v_1, \mathbf v_2, \mathbf w$.
The submatrices $D$ and $B$ express that this can be any such combination.
However, the images of $\mathbf v_1$ and $\mathbf v_2$ can still only be linear combinations of those same vectors.
Hence $C$ with a zero submatrix below it.

And since $\{\mathbf v_1, \mathbf v_2, \mathbf w\}$ is a basis, the corresponding matrix is invertible.
When we multiply on the left by its inverse, we get the desired form. (Thinking)
 
  • #7
So, do we take as the columns of $S$ the vectors $v_1, \ldots , v_n$, and the matrices $B, D, C$ encode the respective linear combinations of the vectors?

Or haven't I understood correctly the idea? (Wondering)
 
  • #8
mathmari said:
So, do we take as the columns of $S$ the vectors $v_1, \ldots , v_n$, and the matrices $B, D, C$ encode the respective linear combinations of the vectors?

Or haven't I understood correctly the idea?

More accurately, we take $S^{-1}$ to be the matrix with $v_1, \ldots , v_n$ as columns.
And assuming $A$ is an $n\times n$ matrix, the as yet unspecified $v_{m+1},\ldots,v_n$ complete the basis. (Nerd)

If that's what you intended, then yes, I think you understood the idea correctly. (Happy)
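Putting it together, the whole $m$-column deflation can be sketched as a small function. The name `block_deflation` and the QR-based completion of the basis are my own assumptions, not something fixed by the thread:

```python
import numpy as np

def block_deflation(A, Vm):
    """Given A (n x n) and Vm (n x m) whose column span is A-invariant,
    complete the columns to a basis, form S^{-1} = (v_1 ... v_n),
    and return S A S^{-1}."""
    n, m = Vm.shape
    # One possible completion: QR of (Vm | I) yields orthonormal columns;
    # those beyond the first m span the complement of col(Vm).
    Q, _ = np.linalg.qr(np.hstack([Vm, np.eye(n)]))
    Sinv = np.hstack([Vm, Q[:, m:]])
    return np.linalg.solve(Sinv, A @ Sinv)   # S A S^{-1} without forming S explicitly

# Assumed 3x3 example: the first two columns of V span an A-invariant subspace.
V = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
T = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])
A = V @ T @ np.linalg.inv(V)

M = block_deflation(A, V[:, :2])
print(np.round(M[2, :2], 10))   # the zero block below C
```

The eigenvalues of $A$ can then be read off from the two diagonal blocks of the result, matching $\sigma (A)=\sigma(B)\cup \sigma (C)$.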
 
  • #9
I like Serena said:
More accurately we take $S^{-1}$ as the matrix with $v_1, \ldots , v_n$ as columns.
And assuming $A$ is an nxn matrix, the as yet unspecified $v_{m+1},...,v_n$ complete the basis. (Nerd)

If that's what you intended, then yes, I think you understood the idea correctly. (Happy)

Ok! Thank you so much! (Mmm)
 

