Row operations performed on two matrices

If you perform row operations on a matrix A to convert it to the identity matrix, and then apply the same row operations, in the same order, to another matrix B, why is it that the end result B^r does not depend on B's actual sequence?
 
What do you mean by B's actual sequence?
 
And what do you mean by "B^r"? Every row operation is equivalent to an "elementary matrix": the result of applying that row operation to the identity matrix. Applying a given row operation to a matrix is the same as multiplying on the left by the corresponding elementary matrix. So applying row operations to A that reduce it to the identity matrix means that the product of the corresponding elementary matrices is A^{-1}. Applying those same row operations to B gives A^{-1}B.
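To make that concrete, here is a small NumPy sketch (the 2x2 matrix A, the matrix B, and the particular row operations are just a made-up example, not anything from the thread): each elementary matrix is the identity with one row operation applied, and their product is A^{-1}.

```python
# Sketch of the elementary-matrix picture: a row operation equals
# left-multiplication by the matrix obtained by doing that same
# operation to the identity. (Example matrices are arbitrary.)
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])

E1 = np.array([[1.0,  0.0], [-2.0, 1.0]])   # R2 -> R2 - 2*R1
E2 = np.array([[1.0, -1.0], [ 0.0, 1.0]])   # R1 -> R1 - R2
E3 = np.array([[0.5,  0.0], [ 0.0, 1.0]])   # R1 -> (1/2)*R1

# These three operations reduce A to the identity ...
print(E3 @ E2 @ E1 @ A)                                    # -> identity
# ... so their product is A^{-1}
print(np.allclose(E3 @ E2 @ E1, np.linalg.inv(A)))         # -> True

# Applying the same operations to any B therefore gives A^{-1} B
B = np.array([[5.0, 6.0], [7.0, 8.0]])
print(np.allclose(E3 @ E2 @ E1 @ B, np.linalg.inv(A) @ B)) # -> True
```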

That means, in particular, that if you have the matrix equation Ax = B and apply the row operations that reduce A to the identity matrix to B, you get x = A^{-1}B, the solution to the equation.
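For instance, here is a rough sketch (again with made-up numbers) of solving Ax = b by row-reducing the augmented matrix [A | b], which applies the same row operations to b, so the right-hand column ends up as x = A^{-1}b:

```python
# Gauss-Jordan on the augmented matrix [A | b]: the row operations that
# turn A into I turn b into A^{-1} b, the solution of A x = b.
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
b = np.array([[ 5.0],
              [11.0]])

aug = np.hstack([A, b])        # [A | b]

aug[1] -= 2 * aug[0]           # R2 -> R2 - 2*R1
aug[0] -= aug[1]               # R1 -> R1 - R2
aug[0] /= 2                    # R1 -> (1/2)*R1

x = aug[:, [2]]                # left block is now I, right block is x
print(np.allclose(x, np.linalg.solve(A, b)))   # -> True
```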
 
When I say B's actual sequence, I mean the numbers that compose the matrix, such as a 3x3 matrix with the numbers 654, 896, 327. And when I say B^r, I mean performing the exact same row operations that you did on A, applied to B in the same order. I want to know why it doesn't matter what the actual sequence of B is, as long as you perform the same row operations on it as you did on the other matrix, A.
 
I guess the short answer is that the result you get does depend on the entries of B in exactly the way that HallsofIvy explained.

What doesn't matter, I guess, is the exact sequence of steps you took to row reduce A. As long as you perform row operations that eventually reduce A to the identity, the combined effect of all those operations is the same, namely multiplication by A^{-1}. When you apply that operation to B, you'll always get the matrix A^{-1}B.
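A quick numerical check of that claim (same made-up 2x2 example as above): two different sequences of row operations that both reduce A to the identity must multiply out to the same matrix, A^{-1}, so either sequence applied to B gives the same result.

```python
# Two different row-operation sequences that each reduce A to I give the
# same combined operation (A^{-1}), hence the same result on B.
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
B = np.array([[6.0, 5.0],
              [8.0, 9.0]])

# Sequence 1: R2 -= 2*R1,  R1 -= R2,      R1 /= 2
seq1 = [np.array([[1.0,  0.0], [-2.0, 1.0]]),
        np.array([[1.0, -1.0], [ 0.0, 1.0]]),
        np.array([[0.5,  0.0], [ 0.0, 1.0]])]

# Sequence 2: R1 /= 2,     R2 -= 4*R1,    R1 -= 0.5*R2
seq2 = [np.array([[0.5,  0.0], [ 0.0, 1.0]]),
        np.array([[1.0,  0.0], [-4.0, 1.0]]),
        np.array([[1.0, -0.5], [ 0.0, 1.0]])]

def apply_ops(ops, M):
    """Left-multiply M by each elementary matrix, in order."""
    for E in ops:
        M = E @ M
    return M

print(np.allclose(apply_ops(seq1, A), np.eye(2)))           # True: reduces A to I
print(np.allclose(apply_ops(seq2, A), np.eye(2)))           # True: so does this one
print(np.allclose(apply_ops(seq1, B), apply_ops(seq2, B)))  # True: same effect on B
```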
 