How do row operations on an augmented matrix result in the inverse of a matrix?

Summary
Row operations on an augmented matrix can be used to find the inverse of a matrix by transforming the original matrix into the identity matrix while simultaneously applying the same operations to the identity matrix. This process effectively reveals the inverse matrix as the result of the operations applied to the identity. Each row operation corresponds to an elementary matrix, and the sequence of operations that reduces the original matrix to the identity also yields the inverse. Understanding this relationship clarifies how the inverse works in both directions, as multiplying a matrix by its inverse results in the identity matrix. The discussion emphasizes the importance of grasping the underlying algebraic principles behind these operations.
rajeshmarndi
I just couldn't understand how the augmented matrix gives the inverse of a matrix. I mean, what is it in the row operations that produces the inverse? I don't just want to learn the steps but to understand why it works.

Thank you.
 
I am not certain what you mean by "augmented matrix". I usually think of an augmented matrix in the context of solving a system: to solve, say, the equations ##ax+ by= c##, ##dx+ ey= f##, which we could write as the matrix equation ##\begin{bmatrix}a & b \\ d & e \end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}= \begin{bmatrix}c \\ f\end{bmatrix}##, we "augment" the matrix ##\begin{bmatrix}a & b \\ d & e \end{bmatrix}## by the single column ##\begin{bmatrix}c \\ f\end{bmatrix}## to get ##\begin{bmatrix}a & b & c \\ d & e & f\end{bmatrix}##.

But that has nothing, directly, to do with finding the inverse. I think you are talking about a related problem: "augment" the matrix ##\begin{bmatrix} a & b \\ c & d \end{bmatrix}## with the identity matrix to get ##\begin{bmatrix}a & b & 1 & 0 \\ c & d & 0 & 1\end{bmatrix}##. Then do whatever "row operations" are necessary to reduce the first two columns to the identity while also applying them to the last two columns to get ##\begin{bmatrix}1 & 0 & p & q \\ 0 & 1 & r & s\end{bmatrix}##; the new matrix ##\begin{bmatrix}p & q \\ r & s\end{bmatrix}## is the inverse of the first matrix.

That works because every "row operation" corresponds to an "elementary" matrix: the matrix we get by applying that particular row operation to the identity matrix. One row operation is "add a multiple of one row to another". For example, "add -2 times the first row to the second row". If we do that to the identity matrix ##\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}## we get ##\begin{bmatrix} 1 & 0 \\ -2 & 1 \end{bmatrix}##. And look what happens when we multiply a matrix by that (on the left): ##\begin{bmatrix}1 & 0 \\ -2 & 1\end{bmatrix}\begin{bmatrix}a & b \\ c & d \end{bmatrix}= \begin{bmatrix}a & b \\ c- 2a & d- 2b\end{bmatrix}##. That is precisely the result of applying the row operation to that matrix!

Now, suppose we have a matrix, ##A##, and there is a sequence of row operations, ##R_1, R_2, \cdots, R_n##, in that order, that reduce ##A## to the identity. Those row operations correspond to a sequence of elementary matrices, ##E_1, E_2, \cdots, E_n##, such that ##E_nE_{n-1}\cdots E_2E_1A= (E_nE_{n-1}\cdots E_2E_1)A= I##, which means that ##E_nE_{n-1}\cdots E_2E_1## is the inverse matrix of ##A##. But, of course, ##E_nE_{n-1}\cdots E_2E_1## times ##I## is the same as ##E_nE_{n-1}\cdots E_2E_1## and, again, those multiplications are the same thing as applying the row operations to the identity matrix, so:
Applying the same series of row operations that reduce ##A## to the identity matrix, to the identity matrix itself, results in the inverse matrix of ##A##.

Writing the matrix ##A## and the identity matrix side by side and then row-reducing both is a convenient way of applying the same row operations to ##A## and the identity matrix, reducing ##A## to the identity matrix and changing the identity matrix to ##A^{-1}##.
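To make the mechanics concrete, here is a minimal numerical sketch of the procedure described above (my addition, not part of the thread). It row-reduces the augmented block ##[A \mid I]## in Python with NumPy; the function name gauss_jordan_inverse and the sample matrix are illustrative choices.

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert a square matrix by row-reducing the augmented block [A | I].

    A minimal sketch: it uses a simple partial-pivot row swap and assumes
    A is actually invertible.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    aug = np.hstack([A, np.eye(n)])            # the augmented matrix [A | I]

    for col in range(n):
        # Swap in the row with the largest pivot in this column.
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        aug[[col, pivot]] = aug[[pivot, col]]

        # Scale the pivot row so the pivot entry becomes 1.
        aug[col] /= aug[col, col]

        # Add multiples of the pivot row to clear the rest of the column.
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]

    # The left block is now I, so the right block holds the inverse.
    return aug[:, n:]

A = np.array([[2.0, 1.0], [5.0, 3.0]])
A_inv = gauss_jordan_inverse(A)
print(A_inv)        # [[ 3., -1.], [-5.,  2.]]
print(A @ A_inv)    # the identity matrix, up to rounding
```

Every step inside the loop is one of the usual row operations (swap two rows, scale a row, add a multiple of one row to another), applied to the whole augmented block at once, which is exactly the "side by side" bookkeeping described above.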
 
HallsofIvy said:
Writing the matrix ##A## and the identity matrix side by side and then row-reducing both is a convenient way of applying the same row operations to ##A## and the identity matrix, reducing ##A## to the identity matrix and changing the identity matrix to ##A^{-1}##.
Thanks, I now understand how the row operations work, i.e. ##A^{-1}A = I## (the identity matrix). But I still can't understand how the other direction works, i.e. ##AA^{-1} = I##. That seems really difficult; the former was easy to comprehend.
 
rajeshmarndi said:
Thanks, I now understand how the row operations work, i.e. ##A^{-1}A = I## (the identity matrix). But I still can't understand how the other direction works, i.e. ##AA^{-1} = I##. That seems really difficult; the former was easy to comprehend.

There are two theorems to help.

The first one is that if you have a square matrix ##A## and it is left invertible (i.e. it has a left inverse ##B## with ##BA=I##), then it is invertible.
And the second one is that if a matrix ##A## is invertible, then its left inverse ##B## is unique and is also a right inverse, i.e. if ##BA=I## then ##AB=I##; the same of course goes for the right inverse.
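To spell the second theorem out with a short worked derivation (my addition, sketched under the assumption that ##A## is already known to have a two-sided inverse ##A^{-1}##): if ##BA = I##, then
$$B = BI = B(AA^{-1}) = (BA)A^{-1} = IA^{-1} = A^{-1},$$
so ##AB = AA^{-1} = I##; the left inverse is automatically a right inverse as well.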
 
I didn't understand the relation between the coefficients of the equations and the row operations (the inverse) that produce the identity matrix.
 
rajeshmarndi said:
I didn't understand the relation between the coefficients of the equations and the row operations (the inverse) that produce the identity matrix.
I'm not certain what your question is, but perhaps this will help. If we have the two equations ##ax+ by= e##, ##cx+ dy= f##, we can write that as the matrix equation
$$\begin{bmatrix}a & b \\ c & d \end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}= \begin{bmatrix}e \\ f\end{bmatrix}$$
Do you understand that part? That is essentially saying "Ax= b", where ##A##, ##x##, and ##b## are those three matrices. One way to solve that equation is to find the multiplicative inverse of ##A##, namely ##A^{-1}##, and multiply both sides of the equation by that inverse: ##A^{-1}Ax= x= A^{-1}b##.

The "augmented matrix" is just a short way of writing that matrix equation.
$$\begin{bmatrix}a & b & e \\ c & d & f\end{bmatrix}$$

Now, "row-operations" are one way to find the inverse of a matrix. The key is that every row operation corresponds to multiplying by a matrix- and we can get that matrix by applying the row operation to the identity matrix. In terms of 3 by 3 matrices, the row operation "add three times the second row to the first row" is
$$\begin{bmatrix} 1 & 3 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}$$
See what happens when we multiply that matrix by a general matrix:
$$\begin{bmatrix}1 & 3 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}a & b & c \\ d & e & f \\ g & h & i\end{bmatrix}= \begin{bmatrix}a+ 3d & b+ 3e & c+ 3f \\ d & e & f \\ g & h & i\end{bmatrix}$$
3 times the second row has been added to the first row!

So when you apply a series of row operations that reduce matrix ##A## to the identity matrix, that is the same as multiplying by a series of matrices whose product is ##A^{-1}##. When you apply those same row operations to the last column of the augmented matrix, the "b" in ##Ax= b##, you are multiplying ##b## by ##A^{-1}##.
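The same point can be checked numerically. Below is a small sketch (my addition; the matrix, the right-hand side, and the four row operations are illustrative choices, not taken from the thread) that writes each row operation as an explicit elementary matrix, multiplies them together, and confirms that the product is ##A^{-1}## and that applying it to ##b## solves ##Ax = b##.

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])
b = np.array([[4.0], [11.0]])                # right-hand side of Ax = b

# The row operations that reduce A to the identity, written as
# elementary matrices (each is the identity with that operation applied).
E1 = np.array([[0.5, 0.0], [0.0, 1.0]])      # scale row 1 by 1/2
E2 = np.array([[1.0, 0.0], [-5.0, 1.0]])     # add -5 times row 1 to row 2
E3 = np.array([[1.0, 0.0], [0.0, 2.0]])      # scale row 2 by 2
E4 = np.array([[1.0, -0.5], [0.0, 1.0]])     # add -1/2 times row 2 to row 1

P = E4 @ E3 @ E2 @ E1                        # later operations multiply on the left
print(P @ A)                                 # the identity, so P is A^{-1}
print(P)                                     # [[ 3., -1.], [-5.,  2.]]
print(P @ b)                                 # A^{-1} b = [[1.], [2.]], the solution x
```

Because each later row operation multiplies on the left, the product comes out as ##E_4E_3E_2E_1##, matching the order ##E_nE_{n-1}\cdots E_2E_1## used above.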
 
HallsofIvy said:
I'm not certain what your question is, but perhaps this will help.
I do understand the meaning of ##A^{-1}A## = identity matrix, i.e. the row operations just applied to ##A##.

But my question is: why and how do we also get the identity matrix the other way around, i.e. ##AA^{-1}## = identity matrix?
 
rajeshmarndi said:
I do understand the meaning of ##A^{-1}A## = identity matrix, i.e. the row operations just applied to ##A##.

But my question is: why and how do we also get the identity matrix the other way around, i.e. ##AA^{-1}## = identity matrix?
It is not so much a matter of LINEAR ALGEBRA; more important is the simple algebra concept of an INVERSE. Start with an entity and do an operation on it using another, different entity, and you get a resulting entity. How do you take this resulting entity and operate on it with an entity that gives you back the entity you originally used?

This can be done using real numbers, and this can be done with matrices.
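To make the analogy concrete (my example, not from the thread): for the real number ##5## the inverse is ##\tfrac{1}{5}##, and ##5 \cdot \tfrac{1}{5} = \tfrac{1}{5} \cdot 5 = 1##. For an invertible matrix the same two-sided relation ##AA^{-1} = A^{-1}A = I## holds, although the two products have to be checked separately because matrix multiplication is not commutative.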
 
