What mathematical algebra explains a sequence of circular shifts on the rows and columns of a matrix?

Consider a simple 3x3 matrix with entries: [1 2 3; 4 5 6; 7 8 9]. A circular shift can be performed on any row or any column: row-(1/2/3)-(right/left) and column-(1/2/3)-(up/dn). Examples: R1-right transforms [1 2 3] into [3 1 2]; R3-left transforms [7 8 9] into [8 9 7]; C1-up converts [1 4 7] into [4 7 1].

These moves can be applied repeatedly to the initial matrix. Take, for instance, the sequence: R1-r, C1-up, R3-left, C2-dn, R2-r, C3-up. It converts the initial matrix into the following (you can work through the moves to confirm): [4 9 1; 6 7 3; 8 5 2].

Now suppose you know NONE of these moves. Given only the two matrices, the initial and the final, what method would find the moves that lead from the initial to the final, or from the final back to the initial? The latter is simply the former sequence reversed, with each move replaced by its inverse. What algebra applies here? Please point me to any and every relevant resource: group theory, number theory, permutation theory, sequential circuits, etc.

This matrix 'jumbling', if you will, inspired in part by the Rubik's Cube, finds some interesting applications in cryptography and data transformation. I look forward to your comments.
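To make the question concrete, here is a minimal Python sketch (names like `apply_move` and `solve` are my own, not from the question). Each shift is a 3-cycle on the 9 cells, so the move group is a subgroup of the alternating group A9 (at most 9!/2 = 181440 elements), small enough that a plain breadth-first search over states can recover a shortest move sequence between the two matrices:

```python
from collections import deque

N = 3

def apply_move(state, move):
    """Apply one circular shift to a flattened N*N state tuple."""
    grid = [list(state[i * N:(i + 1) * N]) for i in range(N)]
    kind, idx, d = move          # e.g. ('R', 0, 'right') for R1-right
    if kind == 'R':
        row = grid[idx]
        grid[idx] = row[-1:] + row[:-1] if d == 'right' else row[1:] + row[:1]
    else:
        col = [grid[i][idx] for i in range(N)]
        col = col[1:] + col[:1] if d == 'up' else col[-1:] + col[:-1]
        for i in range(N):
            grid[i][idx] = col[i]
    return tuple(v for row in grid for v in row)

# All 12 generators of the move group (each is a 3-cycle, hence even).
MOVES = [('R', i, d) for i in range(N) for d in ('left', 'right')] + \
        [('C', j, d) for j in range(N) for d in ('up', 'dn')]

def solve(start, goal):
    """Breadth-first search for a shortest move sequence start -> goal."""
    paths = {start: []}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        if s == goal:
            return paths[s]
        for m in MOVES:
            t = apply_move(s, m)
            if t not in paths:
                paths[t] = paths[s] + [m]
                queue.append(t)
    return None  # goal not reachable from start

if __name__ == '__main__':
    start = tuple(range(1, 10))
    seq = [('R', 0, 'right'), ('C', 0, 'up'), ('R', 2, 'left'),
           ('C', 1, 'dn'), ('R', 1, 'right'), ('C', 2, 'up')]
    state = start
    for m in seq:
        state = apply_move(state, m)
    print(state)                # -> (4, 9, 1, 6, 7, 3, 8, 5, 2)
    print(solve(start, state))  # a shortest recovered move sequence
```

Note that BFS may return a sequence shorter than, and different from, the one actually used; the moves are only determined up to relations in the group. The inverse direction comes for free: reversing the found sequence and swapping right/left and up/dn on each move maps the final matrix back to the initial one.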