Why Does Matrix B & C Result in Matrix A?

  • Thread starter SmilingDave
In summary, the rows of C are the nonzero rows of the reduced row echelon form of A; the full RREF of A consists of those rows with zero rows underneath.
  • #1
SmilingDave
Cullen, in one of his questions, seems to say this:

You have a matrix A of any size. Make a matrix B consisting of only the linearly independent columns of A, choosing them by going from left to right through the columns of A. Then make a matrix C such that BC = A. [The question was to prove this is always possible, but I'm not asking about that.]

He then says that the rows of C are exactly the rows of A in reduced row echelon form. I tried an example, and it works.

My question is: why is this so? I assume it's not just coincidence, so what is going on?
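The claim is easy to experiment with. Here is a sketch in Python using SymPy, with a made-up rank-2 matrix (not an example from Cullen): `Matrix.rref()` returns both the RREF and the pivot-column indices, and the pivot columns are exactly the leftmost linearly independent columns.

```python
from sympy import Matrix

# Made-up example: a 3x3 matrix of rank 2 (second column = 2 * first column).
A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])

# rref() returns the reduced row echelon form and the pivot column indices.
R, pivots = A.rref()

# B: the linearly independent columns of A, taken left to right.
B = A[:, list(pivots)]

# C: the nonzero rows of the RREF of A (one row per pivot).
C = R[:len(pivots), :]

print(B * C == A)   # True: BC reconstructs A
print(C)            # the nonzero rows of RREF(A)
```

So C, built only to satisfy BC = A, really does consist of the nonzero RREF rows.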
 
  • #2
Because the columns of B are linearly independent.
I.e., ask yourself: what does it mean to reduce to row echelon form?
 
  • #3
To have the rows interact until they produce a basis for the row space of A.

What I don't see is how C came to have exactly those rows. C was introduced into the picture to have its columns act on the columns of B so as to produce the whole of A from B. Its rows were never given a thought and played no part. So how did they magically become the rows of A in RREF?
 
  • #4
Are the rows, then, not related to the columns?
 
  • #5
What I know is that the dimension of the row space is always equal to the dimension of the column space, but I don't know of any other connection between them. The bases can be very different. For example, if A is 3x5, the rows are elements of a five-dimensional space, and the columns of a three-dimensional one.
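A quick numeric check of that equality, with a made-up 3x5 matrix (SymPy sketch):

```python
from sympy import Matrix

# Made-up 3x5 matrix: its rows live in R^5, its columns in R^3,
# yet the row space and the column space have the same dimension.
# (Third row = first row + second row, so the rank is 2.)
A = Matrix([[1, 0, 2, 1, 0],
            [0, 1, 3, 0, 1],
            [1, 1, 5, 1, 1]])

print(A.rank())     # dimension of the column space: 2
print(A.T.rank())   # dimension of the row space: also 2
```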
 
  • #6
SmilingDave said:
Cullen, in one of his questions, seems to say this:

You have a matrix A of any size. Make a matrix B consisting of only the linearly independent columns of A, choosing them by going from left to right through the columns of A. Then make a matrix C such that BC = A. [The question was to prove this is always possible, but I'm not asking about that.]

He then says that the rows of C are exactly the rows of A in reduced row echelon form. I tried an example, and it works.

My question is: why is this so? I assume it's not just coincidence, so what is going on?
Something is wrong here. If A is of type m x n, and rank(A) = k < m, then B is of type m x k, so if C satisfies BC = A, then C is of type k x n, which is not the same as the type of A. But then C is not the reduced row echelon form of A, since A and C have different types.
 
  • #7
Erland,

The RREF of A will have a bunch of zero rows underneath the rows of C [which are all nonzero].
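In SymPy terms, with a made-up rank-2 example, that relationship looks like:

```python
from sympy import Matrix, zeros

# Made-up rank-2 example: the RREF of A equals the nonzero rows C
# with zero rows appended underneath.
A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])

R, pivots = A.rref()
k = len(pivots)          # rank of A
C = R[:k, :]             # the k x n factor with B*C = A

# Stack C on top of (m - k) zero rows and compare with RREF(A).
print(R == C.col_join(zeros(A.rows - k, A.cols)))   # True
```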
 
  • #8
The other way of seeing this is that you take your matrix A, row reduce it (to reduced row echelon form), call this C, then find the matrix B such that BC = A.

Perhaps if you picked a simple case and treated it as a system of simultaneous equations it would help you see what is happening at each stage: what you are doing to the equations.

Note: when you row-reduce A like that, you are finding a particular basis, not just any old basis.
It should come as no surprise that this basis should have a special relationship with A.
 
  • #9
I see some light. Let R be the matrix that represents all the row operations used to row reduce A. Then RA = C.

Thus A = BC, where B is R-inverse.

Let A have rank r.

Since C also has rank r, it has only zero rows beyond row r, and thus the product BC uses only the first r columns of B to build A.

So we can get the same effect, namely produce A, by multiplying the matrix consisting of just the first r columns of B by the first r rows of C.

[If A is mxn, modified B will be mxr, and modified C will be rxn, so the dimensions are OK].

But I'm still a bit in the dark. We have shown that there is a decomposition of A into R-inverse [modified] and C [modified].

Also, there is a decomposition using the first linearly independent columns of A [call it A-modified] and some other matrix, call it D. So A = R-inverse-modified times C-modified, and also A = A-modified times D.

But maybe A has two decompositions, using completely different matrices. How do we know A-modified equals R-inverse modified, and that C-modified equals D?

Still working on it. Will try your suggestion of a concrete example.

Thanks for all the help.
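The construction in this post can be checked mechanically. A SymPy sketch, with a made-up rank-2 matrix: row-reducing the augmented matrix [A | I] yields [RREF(A) | R], where R records the row operations. And the first r columns of R-inverse turn out to be exactly the pivot columns of A: at the i-th pivot column, RREF(A) has the standard basis column e_i, so A's i-th pivot column equals R-inverse times e_i, which is the i-th column of R-inverse. That pins down the decomposition.

```python
from sympy import Matrix, eye

# Made-up rank-2 example.
A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [3, 6, 10]])
m = A.rows

# Row-reduce [A | I]; the result is [RREF(A) | R] with R*A = RREF(A).
aug = A.row_join(eye(m)).rref()[0]
Cfull = aug[:, :A.cols]   # RREF(A)
R = aug[:, A.cols:]       # product of the elementary row operations

r = A.rank()
B = R.inv()[:, :r]        # first r columns of R-inverse
C = Cfull[:r, :]          # first r rows of RREF(A)

print(R * A == Cfull)     # True: R really performs the row reduction
print(B * C == A)         # True: the truncated factors rebuild A
print(B == A[:, [0, 2]])  # True: B equals the pivot columns of A
```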
 
  • #10
Is there more than one decomposition that fits all the conditions?
Have you covered LU and LDU decomposition yet?
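For reference, a minimal SymPy sketch of LU decomposition, with a made-up 2x2 matrix. It is a different factorization from the BC one discussed in this thread, but it rests on the same idea of recording row operations in a factor:

```python
from sympy import Matrix

# Made-up 2x2 example: factor A into a lower triangular L
# and an upper triangular U.
A = Matrix([[4, 3],
            [6, 3]])

# LUdecomposition returns L, U, and a list of row swaps
# (empty here, since no pivoting is needed).
L, U, perm = A.LUdecomposition()

print(L * U)   # reproduces A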
 

1. How are matrices B and C related to matrix A?

Matrices B and C result in matrix A through the process of matrix multiplication. This means that each element in matrix A is a combination of elements from matrices B and C, according to a specific set of rules and operations.

2. Can any two matrices result in the same matrix A?

For the product BC to be defined, the number of columns in matrix B must equal the number of rows in matrix C, and the product must have the same dimensions as matrix A. Within those constraints, however, a given matrix A can have many different factorizations A = BC; the one discussed above is pinned down by requiring B to consist of the leftmost linearly independent columns of A.

3. What is the significance of matrix A in relation to matrices B and C?

Matrix A represents the end result of multiplying matrices B and C. It is a unique matrix that is obtained by following the rules of matrix multiplication. Matrix A can tell us important information about the relationship between matrices B and C, such as whether they are inverses of each other or if they have a special property.

4. Can the order of multiplication between matrices B and C be switched?

No, matrix multiplication is not commutative, meaning that the order of multiplication matters. In general, switching the order of matrices B and C will not give the same result as multiplying them in the original order, and if their shapes are not compatible both ways, the reversed product may not even be defined.
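A two-line SymPy illustration with made-up 2x2 matrices:

```python
from sympy import Matrix

# Made-up matrices: swapping the order of multiplication changes the product.
B = Matrix([[1, 1],
            [0, 1]])
C = Matrix([[1, 0],
            [1, 1]])

print(B * C)   # Matrix([[2, 1], [1, 1]])
print(C * B)   # Matrix([[1, 1], [1, 2]])
```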

5. How is matrix A affected by the individual elements in matrices B and C?

The individual elements in matrices B and C determine the values of the elements in matrix A. Each element in matrix A is calculated based on the corresponding elements in matrices B and C, using specific mathematical operations. Therefore, any changes to the elements in matrices B and C will also affect the elements in matrix A.
