Why Is There No Division for Vectors?

find_the_fun
I was reading my computer graphics textbook, and one of the frequently asked questions was "Why is there no vector division?" The answer given was that "it turns out there is no 'nice' way to divide vectors". That's not a very good explanation. Why is it that vectors (or matrices) can't be divided?
 
You can sort of divide square matrices. Suppose you have $Ax=b$, where $A$ is a matrix and $x,b$ are vectors. Then you left multiply both sides by $A^{-1}$ (assuming it exists, which it will so long as $\det(A)\not=0$). Then you get $A^{-1}Ax=A^{-1}b$, and since $A^{-1}A=I$, then you have $Ix=A^{-1}b$, or $x=A^{-1}b$. It is sometimes possible (though not the most efficient method) to solve a linear system this way.
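
To make that concrete, here's a minimal NumPy sketch (my own illustration, not from the thread) comparing the explicit-inverse route with the usual solver call:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # square, det(A) = 5 != 0, so A is invertible
b = np.array([3.0, 5.0])

# "Divide" by A: left-multiply both sides of Ax = b by the inverse.
x_via_inverse = np.linalg.inv(A) @ b

# The preferred method in practice: solve the system directly.
x_via_solve = np.linalg.solve(A, b)

print(x_via_inverse)                            # [0.8 1.4]
print(np.allclose(x_via_inverse, x_via_solve))  # True
```

Forming the inverse explicitly is slower and less numerically stable than solving directly, which is why it is "not the most efficient method".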

However, this sort of inverse only works with square matrices, because you need $A^{-1}A$ to be the right size. Since a vector (aside from $1 \times 1$ matrices, also known as numbers) is not a square matrix, you cannot do this kind of inversion. The key here is that you're trying to achieve some sort of multiplicative identity, $I$ in this case. You can't do that with non-square matrices like vectors.
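
To see the size obstruction in action, here's a small NumPy experiment of my own, using the Moore-Penrose pseudoinverse (not mentioned above; it's just the closest thing to an inverse a non-square matrix has):

```python
import numpy as np

v = np.array([[1.0],
              [2.0],
              [3.0]])          # a 3x1 matrix, i.e. a column vector

w = np.linalg.pinv(v)          # 1x3 Moore-Penrose pseudoinverse

print(w @ v)                   # [[1.]] -- a 1x1 identity on this side
print(v @ w)                   # 3x3 of rank 1, NOT the 3x3 identity
```

You can get an identity on one side, but never the same identity on both sides, so there is no genuine inverse.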
 
The answer is: you CAN, but only in certain dimensions, under certain limited circumstances.

First of all, for "division" to even make sense, you need some kind of multiplication first. And this multiplication has to be of the form:

vector times vector = same kind of vector.

It turns out that this is only possible in certain dimensions: 1, 2, and 4 (and, if you allow certain "strangenesses" like non-associativity, 8; by dimension 16 the sedenions acquire zero divisors and division breaks down entirely). This is a very "deep" theorem, due to Frobenius for the associative case (and Hurwitz more generally), and it requires a bit of high-powered algebra to prove.
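
For a taste of dimension 4, here's a hedged sketch of quaternion arithmetic in plain NumPy (my own illustration; the component formulas are the standard Hamilton product):

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qinv(q):
    """Multiplicative inverse: conjugate divided by the squared norm."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z]) / np.dot(q, q)

p = np.array([1.0, 2.0, 3.0, 4.0])
q = np.array([0.5, -1.0, 2.0, 0.0])

r = qmul(p, q)            # vector times vector = same kind of vector
print(qmul(r, qinv(q)))   # dividing by q on the right recovers p
```

Every nonzero quaternion has an inverse, which is exactly what makes dimension 4 special.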

Now matrices only have such a multiplication when they are $n \times n$ (otherwise we get:

matrix times matrix = matrix of a different size, which turns out to matter).

However, it turns out we can have "bad matrices", like so:

$AB = 0$ where neither $A$ nor $B$ are the 0-matrix. For example:

$A = \begin{bmatrix}1&0\\0&0 \end{bmatrix}$

$B = \begin{bmatrix}0&0\\0&1 \end{bmatrix}$
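
A quick NumPy check of this pair (my own verification):

```python
import numpy as np

A = np.array([[1, 0],
              [0, 0]])
B = np.array([[0, 0],
              [0, 1]])

print(A @ B)   # [[0 0]
               #  [0 0]] -- the zero matrix, though neither factor is zero
```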

Now suppose, just for the sake of argument, we had a matrix we could call:

$\dfrac{1}{A}$.

Such a matrix should satisfy:

$\dfrac{1}{A}A = I$, the identity matrix.

Then:

$B = IB = \left(\dfrac{1}{A}A\right)B = \dfrac{1}{A}(AB) = \dfrac{1}{A}0 = 0$

which is a contradiction, since $B \neq 0$.

In other words, "dividing by such a matrix" is rather like dividing by zero: it leads to nonsense.

It turns out that the condition:

$AB = 0$, with $A, B \neq 0$,

is equivalent to:

$Av = 0$ for some vector $v \neq 0$

(just take $v$ to be any nonzero column of $B$); in other words, $A$ is singular.
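
As a concrete check (again a sketch of my own), NumPy's SVD exposes such a null vector for the matrix $A$ above:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Columns of Vt.T whose singular values vanish span the null space of A.
_, s, Vt = np.linalg.svd(A)
v = Vt.T[:, s < 1e-12].ravel()

print(v)        # [0. 1.] (up to sign): a nonzero v with Av = 0
print(A @ v)    # [0. 0.]
```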

Let's see why this is important by comparing matrix multiplication with scalar multiplication:

If $rA = rB$, we have:

$\dfrac{1}{r}(rA) = \left(\dfrac{1}{r}r\right)A = 1A = A$

and also:

$\dfrac{1}{r}(rA) = \dfrac{1}{r}(rB) = \left(\dfrac{1}{r}r\right)B = 1B = B$

provided $r \neq 0$ (which is almost every scalar).

This allows us to conclude $A = B$; in other words, the assignment:

$A \mapsto rA$ is one-to-one.

However, if we take matrices:

$RA = RB$ does NOT imply $A = B$, for example let

$R = \begin{bmatrix} 1&0\\0&0 \end{bmatrix}$

$A = \begin{bmatrix} 0&0\\0&1 \end{bmatrix}$

$B = \begin{bmatrix} 0&0\\0&2 \end{bmatrix}$

Then we see that $RA = RB = 0$, but clearly $A$ and $B$ are different matrices.
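
Again, this is easy to verify in NumPy (my own check):

```python
import numpy as np

R = np.array([[1, 0],
              [0, 0]])
A = np.array([[0, 0],
              [0, 1]])
B = np.array([[0, 0],
              [0, 2]])

print(np.array_equal(R @ A, R @ B))   # True:  RA == RB (both are zero)
print(np.array_equal(A, B))           # False: yet A != B
```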

So "left-multiplication by a matrix" is no longer 1-1, we means we can't uniquely "undo" it (which is what, at its heart, "division" is: the "un-doing" of multiplication).

I hope this made sense to you.
 
Deveno said:
The answer is: you CAN, but only in certain dimensions, under certain limited circumstances.

...

vector times vector = same kind of vector.

It turns out that this is only possible in certain dimensions: 1, 2, and 4 (and, if you allow certain "strangenesses" like non-associativity, 8).

...

In other words, "dividing by such a matrix" is rather like dividing by zero, it leads to nonsense.

...

So "left-multiplication by a matrix" is no longer 1-1, we means we can't uniquely "undo" it (which is what, at its heart, "division" is: the "un-doing" of multiplication).
And despite all this, one can divide by almost all square matrices of any dimension, not just in the special dimensions 1, 2, 4, or 8: the only square matrices you cannot divide by are the singular ones, those with $\det = 0$.
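
To put a number on "almost all" (a sketch of my own): $\det = 0$ is a single polynomial equation in the entries, so singular matrices form a measure-zero set, and a randomly drawn matrix is invertible with probability 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw 10,000 random 5x5 matrices and count the singular ones.
singular = sum(
    abs(np.linalg.det(rng.standard_normal((5, 5)))) < 1e-12
    for _ in range(10_000)
)
print(singular)   # 0 -- in practice, none of them are singular
```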
 
There's a programming language called APL, which was designed as an executable mathematical notation.

Regular division is denoted a÷b.
The same operator is used for the reciprocal: ÷b means 1/b.

Typically operations for matrices are denoted with a square block around the operator.
In particular the matrix inverse is ⌹B.
And matrix division is: A⌹B. This solves the system Bx = A (in the least-squares sense); for an invertible B it amounts to the inverse of B multiplied by A.
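
Since most readers won't have APL to hand, here's a rough NumPy analogue (my own mapping, assuming the usual least-squares semantics of dyadic ⌹):

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A = np.array([3.0, 5.0])

B_inv = np.linalg.inv(B)                  # monadic  ⌹B : matrix inverse

# Dyadic  A⌹B : least-squares solution x of B x = A.
x, *_ = np.linalg.lstsq(B, A, rcond=None)

print(np.allclose(x, B_inv @ A))          # True when B is invertible
```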