- Thread starter Raghav Gupta
- #1

Raghav Gupta

I think the kernel and image get disturbed in a matrix, though I don't know what they actually are.

Why is this not a problem in the determinant case?

- #2

Svein


> Why can we not apply row and column operations simultaneously on a matrix when finding its inverse by elementary transformations?

You can - but only when you do it on the right-hand side simultaneously. Assume you want to find ##B## such that ##A \cdot B = I##. You do the same row operations on ##A## and ##I##. After a while (excluding singularities etc.) you end up with ##U \cdot B = R##, where ##U## is upper triangular. Now substitute back until you end up with ##I \cdot B = H## and you are done.
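A minimal sketch of this procedure in Python/NumPy (the function name and the example matrix are my own, not from this thread): it augments ##A## with ##I## and applies the same row operations to both halves until the left half becomes ##I##.

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by doing the same row operations on A and on I.

    Assumes A is square and invertible.
    """
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])  # the augmented matrix [A | I]
    for col in range(n):
        # Partial pivoting: bring the largest remaining entry into the pivot spot
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        aug[[col, pivot]] = aug[[pivot, col]]
        aug[col] /= aug[col, col]                  # scale the pivot row so the pivot is 1
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]  # clear the rest of the column
    return aug[:, n:]                              # the right half now holds A^{-1}

A = np.array([[2.0, 1.0], [5.0, 3.0]])
print(gauss_jordan_inverse(A))   # [[ 3. -1.], [-5.  2.]]
print(np.linalg.inv(A))          # the same matrix
```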

- #3

Raghav Gupta

But doesn't the kernel or image get disturbed?

- #4

Svein

> But doesn't the kernel or image get disturbed?

I am not an expert on matrices, so I cannot answer that. What I do know is how to find the inverse (as detailed above). Useful facts:

- The matrix can be inverted if its determinant is different from 0.
- The determinant of the inverse matrix is the inverse of the determinant of the original matrix.
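A quick NumPy check of both facts (the example matrix is my own choice):

```python
import numpy as np

A = np.array([[4.0, 7.0], [2.0, 6.0]])
print(np.linalg.det(A))                  # 10.0 -- nonzero, so A is invertible
print(np.linalg.det(np.linalg.inv(A)))   # 0.1  == 1/det(A)
```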

- #5

HawkEye 18

> I think the kernel and image get disturbed in a matrix, though I don't know what they actually are.
>
> Why is this not a problem in the determinant case?

Row operations are equivalent to left multiplication by the corresponding elementary matrices; column operations are equivalent to right multiplication. So, when you perform row operations on a (square) matrix ##A##, you get a matrix ##E## (the product of the corresponding elementary matrices) such that ##EA=I##. For square matrices that means ##E=A^{-1}##; performing the same row operations on the right-hand side ##I##, you get ##EI=E=A^{-1}##.
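A small NumPy illustration of this, using my own example matrix and row operations (not anything from the post): each row operation is a left multiplication by an elementary matrix, and the product of those elementary matrices is ##A^{-1}##.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 7.0]])

# R2 -> R2 - 3*R1, expressed as left multiplication by an elementary matrix
E1 = np.array([[1.0, 0.0], [-3.0, 1.0]])
# R1 -> R1 - 2*R2, likewise
E2 = np.array([[1.0, -2.0], [0.0, 1.0]])

E = E2 @ E1
print(E @ A)              # the identity: these row operations reduce A to I
print(E)                  # [[ 7. -2.], [-3.  1.]]
print(np.linalg.inv(A))   # the same matrix: E = A^{-1}
```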


Similarly, you can find the inverse by performing only column operations: you get a matrix ##E## (the product of the corresponding elementary matrices) such that ##AE=I##, which again means that ##E=A^{-1}##. Performing the same column operations on the right-hand side ##I##, you get ##IE=E=A^{-1}##.

If you perform both row and column operations you get two matrices ##E_1## and ##E_2## such that ##E_1AE_2=I##, so on the right-hand side you get ##E_1IE_2=E_1E_2##. For ##E_1E_2## to be the inverse of ##A## you need ##AE_1E_2=I## (or equivalently ##E_1E_2A=I##), but what you have is ##E_1AE_2=I##. Matrix multiplication is not commutative, so there is no reason for the latter identity to imply the former.

"No reason" is not a formal proof that the method does not work, for the formal proof you need a counterexample. You can find it just by playing with applying row and column operations to ##2\times 2## matrices. There are also more scientific method of constructing a counterexample.

The reason that applying both row and column operations still works in the determinant case is that while ##E_1AE_2=I## does not imply that ##E_1E_2=A^{-1}##, it does imply that ##\operatorname{det}(E_1E_2)=\operatorname{det} A^{-1}##. Namely, $$1=\operatorname{det}(E_1AE_2) = \operatorname{det} E_1 \operatorname{det} A \operatorname{det}E_2,$$ so $$\operatorname{det}(E_1E_2) = \operatorname{det}E_1\operatorname{det}E_2= (\operatorname{det}A)^{-1} =\operatorname{det}A^{-1}.$$

Finally, for invertible matrices the image is always all of ##\mathbb R^n## and the kernel is always trivial (i.e. ##\{\mathbf 0\}##); they cannot be "disturbed" by row/column operations and will remain the same.

- #6

Mark44


> But doesn't the kernel or image get disturbed?

If you're talking about a matrix you get by applying a row operation to another matrix, the answer is no. If you start with a matrix ##A## and apply one of the three row operations to it to get ##A_1##, the matrices ##A## and ##A_1## are equivalent. They have exactly the same kernel and image.

Edit: The image can change. See my later post in this thread.

If you're talking about applying column operations, I don't know -- I have never needed to apply column operations to reduce a matrix. However, if you swap the columns of a matrix, you are swapping the roles of the variables these columns represent.
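A small NumPy sketch of that last point (the system below is my own example): swapping the columns of ##A## permutes the entries of the solution vector, i.e. it swaps the roles of the unknowns.

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])
b = np.array([4.0, 11.0])

x = np.linalg.solve(A, b)
print(x)                      # [1. 2.]

# Swapping the columns of A swaps the roles of the unknowns:
A_swapped = A[:, [1, 0]]
y = np.linalg.solve(A_swapped, b)
print(y)                      # [2. 1.] -- the same solution, variables reordered
```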


- #7

HawkEye 18

> If you're talking about a matrix you get by applying a row operation to another matrix, the answer is no. If you start with a matrix ##A## and apply one of the three row operations to it to get ##A_1##, the matrices ##A## and ##A_1## are equivalent. They have exactly the same kernel and image.

That is not true. Row operations preserve the kernel; column operations preserve the image (column space).

In the case of invertible matrices, however, the image is always all of ##\mathbb R^n## and the kernel is always ##\{\mathbf 0\}##, so we can say that in this case the image and kernel are "preserved" under row and column operations.
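A quick NumPy check of both claims, using a singular matrix of my own choosing so that the kernel is nontrivial:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # singular: kernel is span{(2, -1)}
v = np.array([2.0, -1.0])

# Row operation R2 -> R2 - 2*R1 (left multiplication): the kernel is preserved
R = np.array([[1.0, 0.0], [-2.0, 1.0]])
print((R @ A) @ v)        # [0. 0.] -- v is still in the kernel

# Column operation C2 -> C2 - 2*C1 (right multiplication): the kernel changes
C = np.array([[1.0, -2.0], [0.0, 1.0]])
print((A @ C) @ v)        # [2. 4.] -- v is no longer in the kernel
# ...but the column space of A @ C is still span{(1, 2)}, the same as A's
print(A @ C)              # [[1. 0.], [2. 0.]]
```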


- #9

Mark44


> That is not true. Row operations preserve the kernel; column operations preserve the image (column space).

I misspoke: row operations don't necessarily preserve the range, contrary to what I said. A simple example shows this:

$$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1\end{bmatrix}$$

Using row reduction, we get an equivalent matrix.

$$B = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0\end{bmatrix}$$

Although the dimensions of the column spaces of A and B are equal (2), they span different subspaces of ##\mathbb R^3##.

The columns of A span a plane in ##\mathbb R^3## that is different from the plane spanned by the columns of B (the columns of B span the ##xy##-plane, which does not contain the column ##(1, 0, 1)^T## of A).
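A quick NumPy verification that the two column spaces really differ (the least-squares test is my own way of checking membership):

```python
import numpy as np

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])

# Is the first column of A in the column space of B? Least squares says no:
target = A[:, 0]                                 # (1, 0, 1)
coeffs, residual, *_ = np.linalg.lstsq(B, target, rcond=None)
print(residual)   # [1.] -- nonzero residual: (1, 0, 1) is not in col(B)
```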

- #10

Raghav Gupta

> I misspoke: row operations don't necessarily preserve the range, contrary to what I said. A simple example shows this:
>
> $$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1\end{bmatrix}$$
>
> Using row reduction, we get an equivalent matrix.
>
> $$B = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0\end{bmatrix}$$

I see you have applied ##R_3 \rightarrow R_3 - R_1##.

But shouldn't there be a 1 in row 3, column 2; or, more simply, shouldn't ##B_{32}## be equal to 1?

- #11

Mark44


> I see you have applied ##R_3 \rightarrow R_3 - R_1##.
>
> But shouldn't there be a 1 in row 3, column 2; or, more simply, shouldn't ##B_{32}## be equal to 1?

No. Here's what I did: ##-R_1 + R_3 \rightarrow R_3## and then ##-R_2 + R_3 \rightarrow R_3##.

IOW, add ##-R_1## to ##R_3##, and then add ##-R_2## to ##R_3##.

- #12

Raghav Gupta

Got it. Thanks to all of you - HawkEye 18, Mark44 and Svein.