Example Required: Matrix Solution By Dividing into Quadrants

  • Thread starter: zak100
  • Tags: Example, Matrix

Homework Help Overview

The discussion revolves around solving larger matrices by dividing them into quadrants, specifically exploring the applicability of this method to Gauss elimination and matrix multiplication.

Discussion Character

  • Exploratory, Assumption checking, Conceptual clarification

Approaches and Questions Raised

  • Participants discuss the feasibility of using quadrant division for matrix multiplication and Gauss elimination, questioning how to evaluate and merge results from submatrices. There is an exploration of whether smaller matrices can be processed in parallel to speed up computations.

Discussion Status

Some participants have provided insights into the block matrix multiplication approach, while others express confusion regarding the applicability of this method to Gauss elimination. There is an acknowledgment of differing opinions on the effectiveness of the quadrant method for various matrix operations.

Contextual Notes

Participants note constraints related to the size of matrices and the requirement for certain matrices to be invertible for Gauss elimination, which may limit the quadrant approach's effectiveness in that context.

zak100

Homework Statement


Hi,
I am looking for an example of solving a larger matrix by dividing it into quadrants. Is this possible for Gauss elimination or matrix multiplication?

Homework Equations


No equation possible

The Attempt at a Solution


Looking for an example.

Zulfi.
 
You can treat so-called blocks the same way as single entries, except that they do not commute. If ##A,B,C,D,A',B',C',D'## are square matrices, then
$$
\begin{bmatrix}A&B\\C&D\end{bmatrix}\cdot \begin{bmatrix}A'&B'\\C'&D'\end{bmatrix} = \begin{bmatrix}AA'+BC'&AB'+BD'\\CA'+DC'&CB'+DD'\end{bmatrix}
$$
This is especially helpful if ##C=C'=0##.
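As a quick numerical check of this block rule, here is a NumPy sketch (the random matrices and the 2×2 block size are illustrative choices, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 4x4 matrices, each split into four 2x2 blocks (quadrants).
X = rng.standard_normal((4, 4))
Y = rng.standard_normal((4, 4))
A, B = X[:2, :2], X[:2, 2:]
C, D = X[2:, :2], X[2:, 2:]
Ap, Bp = Y[:2, :2], Y[:2, 2:]
Cp, Dp = Y[2:, :2], Y[2:, 2:]

# Blockwise product, term by term as in the formula above.
top = np.hstack([A @ Ap + B @ Cp, A @ Bp + B @ Dp])
bot = np.hstack([C @ Ap + D @ Cp, C @ Bp + D @ Dp])
blockwise = np.vstack([top, bot])

# Agrees with the ordinary entrywise product.
assert np.allclose(blockwise, X @ Y)
```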
 
Hi,
Thanks for your response. Suppose I have a 4×4 matrix. I can form quadrants of 2×2. Then how can I evaluate the matrix multiplication using these quadrants, and how do I merge the results of these submatrices to obtain the product of the entire 4×4 matrix?

Zulfi.
 
zak100 said:
Hi,
Thanks for your response. Suppose I have a 4×4 matrix. I can form quadrants of 2×2. Then how can I evaluate the matrix multiplication using these quadrants, and how do I merge the results of these submatrices to obtain the product of the entire 4×4 matrix?
@fresh_42 gave you the complete method. What is not clear to you?
 
Hi,
To some extent it is correct but not complete. Also, fresh_42 started from something smaller; I thought he would start from a larger matrix, which confused me. Anyway, thanks, guys, good work.

1) If I have a 4×4 matrix, I can create 4 submatrices of 2×2. If I have a 5×5 matrix, I can add zeros to compensate for the missing rows and columns.

2) I have a feeling that this method won't work for Gauss elimination. So is there any way to cut down the matrix and process the smaller matrices in parallel in the case of the Gauss elimination method? Again, suppose the 4×4 matrix I mentioned has 4 submatrices of 2×2. If I process these submatrices individually, i.e., apply the Gauss elimination method to each, will it work? If not, how can I do it?

I want to cut the matrix into smaller submatrices and then process these submatrices in parallel for a fast evaluation of the Gauss elimination method. Is there any way to speed up Gauss elimination?

Zulfi.
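On point 1, zero-padding does work for block multiplication: pad both factors to an evenly divisible size, multiply blockwise, and the true product sits in the top-left corner. A sketch under that assumption (NumPy; the 5×5 size comes from the post, the rest is illustrative). Note that this padding does not carry over to Gauss elimination, since the padded matrix is singular:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 5))
Y = rng.standard_normal((5, 5))

# Pad each 5x5 factor with one zero row and column to get 6x6,
# which splits evenly into 3x3 quadrants.
Xp = np.pad(X, ((0, 1), (0, 1)))
Yp = np.pad(Y, ((0, 1), (0, 1)))

A, B, C, D = Xp[:3, :3], Xp[:3, 3:], Xp[3:, :3], Xp[3:, 3:]
Ap, Bp, Cp, Dp = Yp[:3, :3], Yp[:3, 3:], Yp[3:, :3], Yp[3:, 3:]

prod = np.vstack([
    np.hstack([A @ Ap + B @ Cp, A @ Bp + B @ Dp]),
    np.hstack([C @ Ap + D @ Cp, C @ Bp + D @ Dp]),
])

# The extra zero row/column stays zero; the 5x5 corner is the true product.
assert np.allclose(prod[:5, :5], X @ Y)
```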
 
zak100 said:
Also, fresh_42 started from something smaller.
Not really: just take your ##4\times 4## matrix ##X## and write it ##X=\begin{bmatrix}A&B\\C&D\end{bmatrix}##.

Gauß elimination means transforming a matrix into ##SXS^{-1}##. With ##S=\begin{bmatrix}A'&B'\\C'&D'\end{bmatrix}## you have the desired formula. You can first deal with ##A##, then ##C\, , \,D\, , \,B##. It's done this way anyway, even on ##X##.

The main problem is the pair of blocks on the opposite diagonal, ##B## and ##C##, which prevent a blockwise procedure. So it's probably best to start with ##C## and eliminate it:

  1. Find ##T## such that ##TXT^{-1}=T\begin{bmatrix}A&B\\C&D\end{bmatrix}T^{-1}=\begin{bmatrix}A'&B'\\0&D'\end{bmatrix}##
  2. Find ##S_1## such that ##S_1A'S_1^{-1} = A''##.
  3. Find ##S_2## such that ##S_2D'S_2^{-1} = D''##.
  4. ##\begin{bmatrix}A''&B''\\0&D''\end{bmatrix} = \begin{bmatrix} S_1 & 0 \\ 0 & S_2 \end{bmatrix} T X T^{-1} \begin{bmatrix} S_1^{-1} & 0 \\ 0 & S_2^{-1} \end{bmatrix} = \begin{bmatrix} S_1A' S_1^{-1} & S_1B'S_2^{-1} \\ 0 & S_2D' S_2^{-1} \end{bmatrix}##
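Step 1 can be made concrete with the classic block row operation: multiplying on the left by ##T=\begin{bmatrix}I&0\\-CA^{-1}&I\end{bmatrix}## clears the lower-left block and leaves the Schur complement ##D-CA^{-1}B## in the lower right. This sketch shows the one-sided version (##TX## rather than ##TXT^{-1}##, which is what Gaussian elimination by row operations actually computes), and it assumes ##A## is invertible:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 4))
A, B = X[:2, :2], X[:2, 2:]
C, D = X[2:, :2], X[2:, 2:]

I2 = np.eye(2)
Z2 = np.zeros((2, 2))

# Block row operation T = [[I, 0], [-C A^{-1}, I]] clears the lower-left
# block; it requires A to be invertible.
CAinv = C @ np.linalg.inv(A)
T = np.block([[I2, Z2], [-CAinv, I2]])

TX = T @ X
schur = D - CAinv @ B  # Schur complement of A in X

assert np.allclose(TX[2:, :2], 0)      # lower-left block eliminated
assert np.allclose(TX[2:, 2:], schur)  # lower-right is the Schur complement
```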
 
Keep in mind that the example @fresh_42 gave has matrices for A, B, C, and D. So it can be very large. Each matrix represents an entire quadrant that you first asked about.
 
FactChecker said:
Keep in mind that the example @fresh_42 gave has matrices for A, B, C, and D. So it can be very large. Each matrix represents an entire quadrant that you first asked about.
Hi,
Thanks. You are right. I couldn't understand message 2 earlier, but while writing reply #5 I got some idea once I noticed the term "square matrices". I will try it, even though my professor says that the quadrant approach works for matrix multiplication but not for Gauss elimination.

Thanks for your replies.

Zulfi.
 
zak100 said:
my professor says that the quadrant approach works for matrix multiplication but not for Gauss elimination.
Mimicking the Gaussian elimination method would require that certain matrices be invertible. Matrix multiplication does not require that.
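A concrete instance of that invertibility obstruction (an illustrative matrix, not from the thread): the block-swap permutation below is invertible as a whole, yet its top-left quadrant ##A## is the zero matrix, so the blockwise step involving ##A^{-1}## is impossible without pivoting:

```python
import numpy as np

# X swaps the top and bottom block rows: invertible overall,
# but its top-left 2x2 quadrant A is identically zero.
X = np.array([[0., 0., 1., 0.],
              [0., 0., 0., 1.],
              [1., 0., 0., 0.],
              [0., 1., 0., 0.]])
A = X[:2, :2]

assert np.isclose(np.linalg.det(X), 1.0)  # whole matrix is invertible
assert np.linalg.matrix_rank(A) == 0      # quadrant A is singular (all zeros)
```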
 
