The answer is: you CAN, but only in certain dimensions, under certain limited circumstances.
First of all, for "division" to even make sense, you need some kind of multiplication, first. And this multiplication has to be of the form:
vector times vector = same kind of vector.
It turns out that this is only possible in certain dimensions: 1, 2, and 4 (and, if you allow certain "strangenesses" such as non-associativity, 8). This is a very "deep" theorem, due to Frobenius in the associative case, and requires a bit of high-powered algebra to prove.
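Just to make the 2-dimensional case concrete (this is only an illustration I'm adding, not part of the theorem), here is a small Python sketch treating 2-vectors $(a,b)$ as complex numbers $a + bi$: vector times vector gives another 2-vector, and you can divide by any nonzero vector.

```python
# 2-vectors (a, b) stand for complex numbers a + bi.
def mul(u, v):
    a, b = u
    c, d = v
    return (a * c - b * d, a * d + b * c)   # (a+bi)(c+di)

def inv(v):
    c, d = v
    n = c * c + d * d        # nonzero whenever v != (0, 0)
    return (c / n, -d / n)   # 1/(c+di)

u, v = (1.0, 2.0), (3.0, -1.0)
w = mul(u, v)                # "u times v"
print(mul(w, inv(v)))        # (1.0, 2.0) -- dividing by v recovers u
```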
Now matrices only have such a multiplication when they are $n \times n$ (otherwise we get:
matrix times matrix = matrix of a different size, which turns out to matter).
However, it turns out we can have "bad matrices", like so:
$AB = 0$ where neither $A$ nor $B$ are the 0-matrix. For example:
$A = \begin{bmatrix}1&0\\0&0 \end{bmatrix}$
$B = \begin{bmatrix}0&0\\0&1 \end{bmatrix}$
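(If you like, you can check this product with a quick NumPy computation; this is just my own sanity check, assuming NumPy is available:)

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, 1]])

print(A @ B)   # [[0 0]
               #  [0 0]]   the zero matrix, even though A != 0 and B != 0
```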
Now suppose, just for the sake of argument, we had a matrix we could call:
$\dfrac{1}{A}$.
Such a matrix should satisfy:
$\dfrac{1}{A}A = I$, the identity matrix.
Then:
$B = IB = \left(\dfrac{1}{A}A\right)B = \dfrac{1}{A}(AB) = \dfrac{1}{A}0 = 0$
which is a contradiction, since $B \neq 0$.
In other words, "dividing by such a matrix" is rather like dividing by zero: it leads to nonsense.
It turns out that the condition:
$AB = 0, A,B \neq 0$
is equivalent to:
$Av = 0$ for some vector $v \neq 0$.
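For our matrix $A$ above, such a vector is $v = (0,1)$; here is that fact as a quick NumPy check (illustration only):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])
v = np.array([0, 1])    # a nonzero vector...

print(A @ v)            # [0 0] -- ...which A sends to the zero vector
```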
Let's see why this is important by comparing matrix multiplication with scalar multiplication:
If $rA = rB$, we have:
$\dfrac{1}{r}(rA) = \left(\dfrac{1}{r}r\right)A = 1A = A$
and also:
$\dfrac{1}{r}(rA) = \dfrac{1}{r}(rB) = \left(\dfrac{1}{r}r\right)B = 1B = B$
provided $r \neq 0$ (which is almost every scalar).
This allows us to conclude $A = B$; in other words, the assignment:
$A \to rA$ is one-to-one.
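Here is that cancellation carried out numerically; the particular $r$ and $A$ are just made-up examples:

```python
import numpy as np

r = 2.5
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

rA = r * A
print((1 / r) * rA)     # prints A again: multiplying by 1/r undoes multiplying by r
```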
However, if we take matrices:
$RA = RB$ does NOT imply $A = B$. For example, let
$R = \begin{bmatrix} 1&0\\0&0 \end{bmatrix}$
$A = \begin{bmatrix} 0&0\\0&1 \end{bmatrix}$
$B = \begin{bmatrix} 0&0\\0&2 \end{bmatrix}$
Then we see that $RA = RB = 0$, but clearly $A$ and $B$ are different matrices.
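(Another quick NumPy check, just to verify the claim:)

```python
import numpy as np

R = np.array([[1, 0],
              [0, 0]])
A = np.array([[0, 0],
              [0, 1]])
B = np.array([[0, 0],
              [0, 2]])

print(np.array_equal(R @ A, R @ B))   # True  -- both products are the zero matrix
print(np.array_equal(A, B))           # False -- yet A and B are different
```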
So "left-multiplication by a matrix" is no longer 1-1, we means we can't uniquely "undo" it (which is what, at its heart, "division" is: the "un-doing" of multiplication).
I hope this made sense to you.