fresh_42 said:
It is not sufficient and not really obvious. If you quote Schur's Lemma, you must find a formulation that fits here and explain why it is applicable. E.g. some books assume ##\operatorname{char}\mathbb{F}=0##, algebraic closure, or irreducibility of the action, so it has to be checked whether the quoted version is applicable.
I think, and from what I've read you do as well, that we have to read ##g.\alpha = \alpha\, , \,g.\beta=\beta\, , \,etc.## as statements about one-dimensional subspaces, i.e. not elementwise but subspacewise. I'm afraid you will have to solve ##\begin{bmatrix}a&b\\c&d\end{bmatrix}\cdot \begin{bmatrix}1\\0 \end{bmatrix} =\lambda \begin{bmatrix}1\\0\end{bmatrix}## and so on for every direction.
Suppose that ##A\in\ker (\varphi)## and write ##A = \begin{bmatrix}a&b\\c&d\end{bmatrix}##. Then, applying ##A## to each of the four subspaces and insisting that each subspace be fixed as a set, we find that
$$
\begin{align*}
\left\{\begin{bmatrix}a\\c \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-a\\-c \end{bmatrix}\right\} &= \left\{\begin{bmatrix}1\\0 \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-1\\0 \end{bmatrix}\right\}\\
\left\{\begin{bmatrix}b\\d \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-b\\-d \end{bmatrix}\right\} &= \left\{\begin{bmatrix}0\\1 \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}0\\-1 \end{bmatrix}\right\}\\
\left\{\begin{bmatrix}a+b\\c+d \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-a-b\\-c-d \end{bmatrix}\right\} &= \left\{\begin{bmatrix}1\\1 \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-1\\-1 \end{bmatrix}\right\}\\
\left\{\begin{bmatrix}a-b\\c-d \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-a+b\\-c+d \end{bmatrix}\right\} &= \left\{\begin{bmatrix}1\\-1 \end{bmatrix}, \begin{bmatrix}0\\0 \end{bmatrix}, \begin{bmatrix}-1\\1 \end{bmatrix}\right\}
\end{align*}
$$
The first condition forces ##\begin{bmatrix}a\\c \end{bmatrix}=\pm\begin{bmatrix}1\\0 \end{bmatrix}##, i.e. ##c=0## and ##a=\pm 1##. The second condition likewise forces ##b=0## and ##d=\pm 1##. With ##b=c=0##, the third condition reads ##\begin{bmatrix}a\\d \end{bmatrix}=\pm\begin{bmatrix}1\\1 \end{bmatrix}##, so either ##a=1## and ##d=1## or ##a=-1## and ##d=-1##; the fourth condition gives the same conclusion.
So the only matrices satisfying all four conditions are those with ##a=d=1,\,b=c=0## or ##a=d=-1,\,b=c=0##, i.e. the identity matrix and its additive inverse ##-I##.
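As a sanity check (and assuming the ground field is ##\mathbb{F}_3##, which the three-element subspaces above suggest, though the original problem may state it differently), here is a brute-force sketch confirming that the only ##2\times 2## matrices fixing all four lines subspacewise are ##\pm I##; the helper names `span_of`, `apply_matrix`, and `fixes_all_lines` are just for this illustration:

```python
from itertools import product

P = 3  # assumed ground field F_3 (each one-dimensional subspace above has three elements)

# Representatives of the four one-dimensional subspaces (lines) of F_3^2;
# (1, 2) is the same line as (1, -1) mod 3.
LINES = [(1, 0), (0, 1), (1, 1), (1, 2)]

def span_of(v):
    """All scalar multiples of v mod 3, i.e. the line through v."""
    return {((t * v[0]) % P, (t * v[1]) % P) for t in range(P)}

def apply_matrix(A, v):
    """Apply A = [[a, b], [c, d]], stored as the tuple (a, b, c, d), to the column vector v mod 3."""
    a, b, c, d = A
    return ((a * v[0] + b * v[1]) % P, (c * v[0] + d * v[1]) % P)

def fixes_all_lines(A):
    """Check that A maps each line onto itself as a set (subspacewise, not elementwise)."""
    return all({apply_matrix(A, w) for w in span_of(v)} == span_of(v) for v in LINES)

matches = [A for A in product(range(P), repeat=4) if fixes_all_lines(A)]
print(matches)  # prints [(1, 0, 0, 1), (2, 0, 0, 2)], i.e. the identity and its negative
```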