# Finding the inverse matrix responsible for base change in the Z3 Group

## Homework Statement

Hey guys,
So I have the following permutations, which form a subgroup of S3:
$\sigma_{1}=(1)(2)(3),\quad \sigma_{5}=(1,2,3),\quad \sigma_{6}=(1,3,2)$
This subgroup is isomorphic to Z3, which can be written as $\{1,\omega,\omega^{2}\}$.

Next, we have the basis for the subgroup of S3:
$\{e_{1},e_{2},e_{3}\}$

And we also have the basis for the group Z3, whose vectors are linear combinations of the basis vectors of S3:
$E_{1}=e_{1}+e_{2}+e_{3},\quad E_{2}=e_{1}+\omega e_{2}+\omega^{2}e_{3},\quad E_{3}=e_{1}+\omega^{2}e_{2}+\omega e_{3}$

I have to find the matrix (and its inverse) which is responsible for the following base change:
$E_{k}=S_{jk}e_{j}$

## Homework Equations

I don't think there are any.

## The Attempt at a Solution

So I think I've found the matrix $S$; the problem is how to find $S^{-1}$. I get the following for $S$, even though I'm not sure it's right:
$$S = \begin{pmatrix}1 & 1 & 1 \\ 1 & \omega & \omega^{2} \\ 1 & \omega^{2} & \omega\end{pmatrix}$$
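With the convention $E_{k}=S_{jk}e_{j}$, the coefficients of $E_{k}$ sit in column $k$ of $S$. Here is a quick numerical sanity check of that matrix and its determinant (a sketch assuming $\omega=e^{2\pi i/3}$; numpy is used only for verification):

```python
import numpy as np

# Primitive cube root of unity, so w**3 == 1
w = np.exp(2j * np.pi / 3)

# Column k of S holds the coefficients of E_k in the e_j basis,
# matching E_k = S_jk e_j (summed over j):
#   E_1 = e1 +     e2 +     e3
#   E_2 = e1 +   w*e2 + w^2*e3
#   E_3 = e1 + w^2*e2 +   w*e3
S = np.array([[1, 1,    1   ],
              [1, w,    w**2],
              [1, w**2, w   ]])

# Picking out column 2 should reproduce the coefficients of E_2
E2 = S @ np.array([0, 1, 0])
print(E2)                  # expect (1, w, w^2)

# The determinant should come out to 3*w*(w - 1)
print(np.linalg.det(S))
```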

I tried using the regular rule for finding the inverse (transpose of the matrix of cofactors divided by the determinant), but it doesn't seem to work.

Can you guys help me out?


HallsofIvy
Homework Helper
You realize that, by definition of $\omega$, $\omega^3=1$, right? The determinant of the matrix is $3\omega(\omega- 1)$ and the "transpose of the matrix of cofactors" is
$$\begin{pmatrix}\omega(\omega- 1) & \omega(\omega- 1) & \omega(\omega- 1) \\ \omega(\omega- 1) & \omega- 1 & (\omega- 1)(\omega+ 1) \\ \omega(\omega- 1) & (\omega- 1)(\omega+ 1) & \omega- 1\end{pmatrix}$$

For example, calculating the "1,1" cofactor, we compute
$$\left|\begin{array}{cc}\omega & \omega^2 \\ \omega^2 & \omega \end{array}\right|= \omega^2- \omega^4= \omega^2- (\omega^3)\omega= \omega^2- \omega= \omega(\omega- 1)$$
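The same cofactor can be checked numerically (again taking $\omega=e^{2\pi i/3}$ as the primitive root):

```python
import numpy as np

w = np.exp(2j * np.pi / 3)

# Minor of the (1,1) entry: delete row 1 and column 1 of S
M11 = np.array([[w,    w**2],
                [w**2, w   ]])

# The (1,1) cofactor carries sign (-1)^(1+1) = +1
c11 = np.linalg.det(M11)
print(c11, w * (w - 1))   # the two values should agree
```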

Hi HallsofIvy,
I get all the elements of the matrix the same as you, apart from the [2,3] and [3,2] entries ([row, column]). You have $\omega^{2}-1$, which you factor as a difference of two squares. However, isn't the sign of the cofactor at that position negative? That would mean it should be $-(\omega^{2}-1)=(1+\omega)(1-\omega)$. If that's correct, then I'm stuck again, because it no longer factors into the determinant!
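One quick way to settle the sign question numerically is to compute the full adjugate as $\det(S)\,S^{-1}$ and read off that entry (a sketch assuming $\omega=e^{2\pi i/3}$):

```python
import numpy as np

w = np.exp(2j * np.pi / 3)
S = np.array([[1, 1,    1   ],
              [1, w,    w**2],
              [1, w**2, w   ]])

# adj(S) = det(S) * inv(S) = transpose of the matrix of cofactors
adj = np.linalg.det(S) * np.linalg.inv(S)

# The entry in row 2, column 3 (0-indexed [1, 2])
entry = adj[1, 2]
print(entry)
print(-(w**2 - 1))   # candidate with the extra minus sign
print(w**2 - 1)      # candidate without it
```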