# Transformations to both sides of a matrix equation

In summary, the book says we don't need to apply the same row operation to the whole matrix AB on the RHS, but only to A.

#### vcsharp2003

Homework Statement
I am unable to see why elementary row operations or elementary column operations can be done to both sides of a matrix equation in the manner explained in my textbook. It doesn't make sense to me. In my textbook it says the following.

"Let X, A and B be matrices of the same order such that X = AB. In order to apply a sequence of elementary row operations on the matrix equation X = AB, we will apply these row operations simultaneously on X and on the first matrix A of the product AB on the RHS.

Similarly, in order to apply a sequence of elementary column operations on the matrix equation X = AB, we will apply these operations simultaneously on X and on the second matrix B of the product AB on the RHS."
Relevant Equations
None
I feel that if we have the matrix equation X = AB, where X, A and B are matrices of the same order, then if we apply an elementary row operation to X on the LHS, we must apply the same elementary row operation to the matrix C = AB on the RHS, and this makes sense to me. But the book says that we don't need to apply the same row operation to the whole matrix AB on the RHS, but only to A. I am at a complete loss to explain this.

An example given in the book is as below. My problem is why the mentioned row operation is applied only to the first matrix on the RHS and not to the whole matrix resulting from the multiplication of the two matrices on the RHS. According to simple logic, the same row operation used on the LHS should have been used on the single matrix that is the result of IA, i.e. on matrix C = IA, where I is the identity matrix.

Matrix multiplication is associative: $Z(AB) = (ZA)B$ and $(AB)Z = A(BZ)$.

pasmith said:
Matrix multiplication is associative: $Z(AB) = (ZA)B$ and $(AB)Z = A(BZ)$.
I get that, but still, how would that explain applying the row operation to only A rather than to the product AB?

We start with X = AB and apply a row operation to both sides. Clearly, the matrix resulting from the product AB should have the row operation applied to it.

vcsharp2003 said:
I get that, but still, how would that explain applying the row operation to only A rather than to the product AB?

We start with X = AB and apply a row operation to both sides. Clearly, the matrix resulting from the product AB should have the row operation applied to it.

Applying a row operation is the equivalent of left multiplication by a matrix. Matrix multiplication is associative. Therefore, applying a row operation to the product $AB$ is the same as applying it to $A$ and then right multiplying the result by $B$.

pasmith said:
Applying a row operation is the equivalent of left multiplication by a matrix.
I see. I wasn't aware of this and also we've not been taught this. Do you have some link explaining this concept?

vcsharp2003 said:
I see. I wasn't aware of this and also we've not been taught this. Do you have some link explaining this concept?

The image in your initial post shows an example of applying a row operation by left multiplication: $R_2 \to R_2 - 2R_1$ is the same as left multiplication by $\begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix}$.
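This equivalence is easy to verify numerically. Below is a minimal numpy sketch (the matrix ##A## here is made up for illustration) checking that the row operation ##R_2 \to R_2 - 2R_1## matches left multiplication by the stated elementary matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Apply R2 -> R2 - 2*R1 directly.
A_rowop = A.copy()
A_rowop[1] = A_rowop[1] - 2 * A_rowop[0]

# The same operation via left multiplication by the elementary matrix.
E = np.array([[1.0, 0.0],
              [-2.0, 1.0]])

assert np.array_equal(E @ A, A_rowop)
```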

pasmith said:
Therefore, applying a row operation to the product AB is the same as applying it to A and then right multiplying the result by B.
Is there a theorem in Matrices that states the above? We've not been taught anything related to this, yet we are supposed to know how a row operation is applied to both sides of a matrix equation X = AB.

vcsharp2003 said:
Is there a theorem in Matrices that states the above?
Yes: associativity. To spell it out: if ## X = AB ## then ## RX = R(AB) = (RA)B ##. It is only left to show that each of the row operations is described by a matrix ## R ##; have you tried this?
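The chain ## RX = R(AB) = (RA)B ## can be spot-checked numerically; here is a sketch with made-up integer matrices (integer entries keep the equality exact, with no floating-point concerns):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, (3, 3))
B = rng.integers(-5, 5, (3, 3))

# R encodes the row operation R3 -> R3 + 2*R2.
R = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 2, 1]])

X = A @ B
# Applying R to X is the same as applying it to A first.
assert np.array_equal(R @ X, (R @ A) @ B)
```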

pbuk said:
Yes: associativity. To spell it out: if ## X = AB ## then ## RX = R(AB) = (RA)B ##. It is only left to show that each of the row operations is described by a matrix ## R ##; have you tried this?
No, I haven't tried it.

This is all new to me even though we're supposed to know the facts stated in original post. First, I need to get my head around what's been mentioned by you and @pasmith, and then try to connect the dots.

pasmith said:
Applying a row operation is the equivalent of left multiplication by a matrix.
This is what I'm understanding from above statement. If we have a matrix A to which we apply a row operation, then we end up with another matrix B.

Now, because of the quoted statement, we can say that B = CA, so that left multiplying the original matrix A by some appropriate matrix C will yield the transformed matrix B.

I am not sure if this is what you meant.

vcsharp2003 said:
This is all new to me even though we're supposed to know the facts stated in original post.
Are you sure? In my experience this subject is often taught by first introducing row operations and showing what you can do with them (solve simultaneous equations) and, once the student is familiar with this, moving on to demonstrate why it works (perhaps anticipating that some students will benefit from having worked it out for themselves on the way).

Unfortunately this doesn't seem to be working for you because your first reaction was "what the book is telling me to do can't possibly work because I know W, X and Y". Stand back a little, and let the book teach you Z.

pbuk said:
Yes: associativity. To spell it out: if ## X = AB ## then ## RX = R(AB) = (RA)B ##. It is only left to show that each of the row operations is described by a matrix ## R ##; have you tried this?
Do you know of a link at which this exact thing is proved or explained in detail?

vcsharp2003 said:
Do you know of a link at which this exact thing is proved or explained in detail?
What, associativity of matrix multiplication? I would expect this to be covered in any Linear Algebra textbook (although the details of the proof may be omitted as they are a rather tedious jumble of subscripts) - what are you using?

If you mean working out the matrix for each row operation then you should be able to do this yourself. I suggest you start with scalar multiplication.

pbuk said:
What, associativity of matrix multiplication?
No, I'm very clear about what the associative property means for matrices and of course, we've been taught this. What I was asking about was a theorem that there is a matrix R for every row operation, i.e. "It is only left to show that each of the row operations is described by a matrix".

As you suggested, I'll try to prove that. Maybe I'll start with expressing the row transformation as a product of two matrices using a specific example.

At the risk of repeating some of what has already been said, can I add this…

Elementary row operations are:
- swapping rows;
- multiplying a row by a (non-zero) scalar;
- adding multiples of one row to another.

For example, swapping the rows of a 2xn matrix can be achieved by left multiplying by ##\begin{bmatrix}
0&1\\
1&0\\
\end{bmatrix}##.

Every row operation on a matrix can be achieved by left-multiplying the matrix by some other matrix (##L##). So if ##X = AB## then:
##LX = L(AB) = (LA)B## (associativity)
The same row operation is performed on ##X## and on only ##A##.

Similarly, we can have elementary column operations on a matrix; each corresponds to right-multiplying the matrix by some other matrix (##R##).
##XR = (AB)R = A(BR)##
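Both identities can be checked numerically; here is a quick sketch with made-up 2x2 matrices, using a row swap for ##L## and a column swap for ##R##:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
X = A @ B

# Row operation (swap rows) via LEFT multiplication: LX = (LA)B.
L = np.array([[0, 1], [1, 0]])
assert np.array_equal(L @ X, (L @ A) @ B)

# Column operation (swap columns) via RIGHT multiplication: XR = A(BR).
R = np.array([[0, 1], [1, 0]])
assert np.array_equal(X @ R, A @ (B @ R))
```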

Another way to get an insight into ‘what’s really happening’ is to do it explicitly for some simple cases, e.g.

##A = \begin{bmatrix}
a&b\\
c&d\\
\end{bmatrix}
~~~B = \begin{bmatrix}
p&q\\
r&s\\
\end{bmatrix}
~~~X = AB =
\begin{bmatrix}
ap+br&aq+bs\\
cp+dr&cq+ds\\
\end{bmatrix}##

Look at what happens if we swap ##A##'s rows:

##A_{swapped} =
\begin{bmatrix}
c&d\\
a&b\\
\end{bmatrix}##

##A_{swapped} B =
\begin{bmatrix}
cp+dr&cq+ds\\
ap+br&aq+bs\\
\end{bmatrix}## which is the same as ##X_{swapped}##.

We’ve simply swapped symbols: ##a↔c## and ##b↔d##. ##B## did not change.

It’s worth (IMO) doing other examples for yourself for the other types of row operations (and for column operations if you want).


Steve4Physics said:
At the risk of repeating some of what has already been said, can I add this…
Thank you. It makes a lot of sense now.

I will try to figure out the matrix L for the example I gave in my original post.

IMO, your book has almost certainly gone over the basic properties of matrix multiplication by the time it talks about order and right versus left multiplication. Go back to the section where the book says that matrix multiplication is not commutative and see if it says that matrix multiplication is associative.
For small 2x2 dimensions, it is easy to show by hand that (AB)C = A(BC).

FactChecker said:
IMO, your book has almost certainly gone over the basic properties of matrix multiplication by the time it talks about order and right versus left multiplication. Go back to the section where the book says that matrix multiplication is not commutative and see if it says that matrix multiplication is associative.
For small 2x2 dimensions, it is easy to show by hand that (AB)C = A(BC).
The associative property is given in the textbook and I do understand it. What was not given in our book, and what I'm still trying to understand, is why every row operation is equivalent to left multiplication by a certain matrix. Honestly, I have no clue why that part is true, and it's not mentioned in our book. That's why I came here; otherwise I would read it in my textbook.

Consider X = AB and go through the multiplication: the elements across the first row of A multiply the elements down the rows of B. You see that you are forming the first row of X as a linear combination of the rows of B, with coefficients taken from the first row of A. So A is telling you how to form the rows of X by taking linear combinations of the rows of B.
Simple example: ##A = \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ \end{bmatrix}##:
The first row of X will be 0 times the first row of B plus 1 times the second row of B.
The second row of X will be 1 times the first row of B plus 0 times the second row of B.
So X will have the first and second rows of B swapped.
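The interpretation above is easy to confirm in code; a minimal sketch (the matrix ##B## is made up for illustration) showing that each row of X is a linear combination of B's rows, with coefficients from the corresponding row of A:

```python
import numpy as np

A = np.array([[0, 1], [1, 0]])   # the swap example from the post
B = np.array([[5, 6], [7, 8]])
X = A @ B

# Row i of X = A[i, 0] * (row 0 of B) + A[i, 1] * (row 1 of B).
for i in range(2):
    combo = A[i, 0] * B[0] + A[i, 1] * B[1]
    assert np.array_equal(X[i], combo)

# With this particular A, X is just B with its rows swapped.
assert np.array_equal(X, B[::-1])
```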

vcsharp2003 said:
The associative property is given in the textbook and I do understand it. What was not given in our book, and what I'm still trying to understand, is why every row operation is equivalent to left multiplication by a certain matrix. Honestly, I have no clue why that part is true, and it's not mentioned in our book. That's why I came here; otherwise I would read it in my textbook.
Someone already stated why, i.e., row operations on a matrix can be characterized by left multiplication by elementary matrices.

Column operations are defined by right multiplication by elementary matrices.

Do you know what a general elementary matrix is?

What book are you using? I have never seen an LA book that doesn't mention this, and I have plenty. I am curious.

vcsharp2003 said:
What was not given in our book and which I'm still trying to understand is why every row operation is equivalent to left multiplication by a certain matrix.
A matrix which performs an elementary row (or, less commonly, column) operation is called an ‘elementary matrix’. You can easily look this up.

It is always possible (and easy) to construct a required elementary matrix.

Suppose ##A## is an m x n matrix and you want to perform a single row operation on it.

1. Take the m x m identity matrix.

2. Apply the required row operation to the identity matrix.

You now have the required elementary matrix (##E##). Magic! The product ##EA## produces the required row operation on ##A##.

If you need to perform another row operation, you must repeat steps 1 and 2.

For example, suppose ##A## is a 3xn matrix and you want to perform the row operation ##7R_1+R_3 →R_3##.

Start with the 3x3 identity matrix: ##I = \begin{bmatrix}
1&0&0\\
0&1&0\\
0&0&1\\
\end{bmatrix}##

Perform the required row operation on this identity matrix. This gives ##E = \begin{bmatrix}
1&0&0\\
0&1&0\\
7&0&1\\
\end{bmatrix}##

If you can’t see why this works, just make up a simple example and work through the matrix multiplication in detail.

Once you understand, it should be clear "why every row operation is equivalent to left multiplication by a certain matrix".
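The two-step recipe above can be sketched in numpy; the 3x4 matrix ##A## below is made up for illustration, and ##E## is built exactly as described, by applying ##7R_1+R_3 \to R_3## to the 3x3 identity:

```python
import numpy as np

# Step 1: take the 3x3 identity matrix.
E = np.eye(3)
# Step 2: apply the row operation 7*R1 + R3 -> R3 to it.
E[2] = E[2] + 7 * E[0]

A = np.arange(12.0).reshape(3, 4)   # any 3xn matrix (n = 4 here)

# Applying the row operation directly to A...
A_rowop = A.copy()
A_rowop[2] = A_rowop[2] + 7 * A_rowop[0]

# ...matches left multiplication by the elementary matrix E.
assert np.array_equal(E @ A, A_rowop)
```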

MidgetDwarf said:
Do you know what a general elementary matrix is?
No, I don't know it. And there is no mention of what an elementary matrix is. The book has a section on types of matrices in the Matrices chapter, but nothing like an elementary matrix is discussed.

Matrices is just one chapter in our textbook for grade 12, in which there are multiple other chapters like Relations and Functions, Inverse Trigonometric Functions, Determinants, Continuity and Differentiability, Application of Derivatives.

MidgetDwarf said:
A person already stated why. Ie., row operations on a matrix can be characterized by left multiplication by elementary matrices.
I meant: is there a theorem stating that for every row operation we can use a matrix and left multiply to get the transformed matrix (which would then have a general proof)? Or maybe there is no theorem stating this, but it's always true due to some other facts.

Steve4Physics said:
A matrix which performs an elementary row (or, less commonly, column) operation is called an ‘elementary matrix’. You can easily look this up.
Thanks once again. It's becoming clearer now. There was no concept of an elementary matrix discussed in the chapter on Matrices, though the identity matrix, diagonal matrix and scalar matrix were discussed. But the way you have explained it makes it amply clear.

Probably, there is no theorem stating that an elementary matrix exists for every row or column operation that we can left or right multiply with the untransformed matrix to get the transformed matrix. It's just true by observation.

vcsharp2003 said:
No, I don't know it. And there is no mention of what an elementary matrix is. It has a section on types of matrices under the Matrices chapter, but there is nothing like elementary matrix discussed.

Matrices is just one chapter in our textbook for grade 12, in which there are multiple other chapters like Relations and Functions, Inverse Trigonometric Functions, Determinants, Continuity and Differentiability, Application of Derivatives.
Ahh, the reason why is that the book you are reading is not a linear algebra book; I am assuming a pre-calculus book. So at this point, you are just supposed to "shut up and calculate." The reason you are learning about row operations in your class is to learn RREF and use it as an alternative way of finding points of intersection of lines, instead of using the substitution or plus/minus (forget the name) method, or of answering word problems involving 2 or more variables. Since this is not an acceptable answer to you, the only alternative is to actually purchase and work through a linear algebra book.

The simplest linear algebra book is the one by Paul Shields.
A standard book, and probably one of the easier reads, is the one by Anton; any edition of Anton will do.

vcsharp2003 said:
Matrices is just one chapter in our textbook for grade 12, in which there are multiple other chapters like Relations and Functions, Inverse Trigonometric Functions, Determinants, Continuity and Differentiability, Application of Derivatives.