Integral Bases and Discriminants

  • Thread starter: Firepanda
  • Tags: Bases, Integral
Firepanda
[Attached image: statement of the problem, parts i) and ii)]


Here is my solution. I think I have part i) done OK, but I'm not sure how to proceed with part ii).

I suppose I need to show that the determinants of both base change matrices ##C_{ij}## and ##D_{ij}## are equal to ##\pm 1##?

Thanks
 
You pretty much have it... Just combine your two equations into
##\Delta(\alpha_1,\ldots,\alpha_n) = (\det D_{ij})^2 (\det C_{ij})^2 \,\Delta(\alpha_1,\ldots,\alpha_n).##
What can you conclude from this?
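
For reference, a sketch of where that combined equation comes from, assuming (as part ii) presumably sets up) that ##\{\alpha_i\}## and ##\{\beta_i\}## are both integral bases, so that the base change matrices ##C_{ij}## and ##D_{ij}## have integer entries:

\begin{align*}
\alpha_i &= \sum_{j} C_{ij}\,\beta_j, &
\beta_i &= \sum_{j} D_{ij}\,\alpha_j, \\
\Delta(\alpha_1,\ldots,\alpha_n)
  &= (\det C_{ij})^2\,\Delta(\beta_1,\ldots,\beta_n) \\
  &= (\det C_{ij})^2 (\det D_{ij})^2\,\Delta(\alpha_1,\ldots,\alpha_n).
\end{align*}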
 
morphism said:
You pretty much have it... Just combine your two equations into
##\Delta(\alpha_1,\ldots,\alpha_n) = (\det D_{ij})^2 (\det C_{ij})^2 \,\Delta(\alpha_1,\ldots,\alpha_n).##
What can you conclude from this?

That ##(\det D_{ij})^2 (\det C_{ij})^2 = 1##?

So the determinants are each ##\pm 1##, and we can conclude the statement?
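
Spelling out the remaining step (this relies on two standard facts from the setup: the discriminant of a basis is nonzero, which justifies cancelling it, and the base change matrices have integer entries):

\begin{align*}
&(\det C_{ij})^2 (\det D_{ij})^2 = 1
  \quad\text{and}\quad \det C_{ij},\, \det D_{ij} \in \mathbb{Z} \\
&\quad\Longrightarrow\quad \det C_{ij} = \pm 1 \ \text{ and } \ \det D_{ij} = \pm 1 \\
&\quad\Longrightarrow\quad \Delta(\alpha_1,\ldots,\alpha_n)
  = (\det C_{ij})^2\,\Delta(\beta_1,\ldots,\beta_n)
  = \Delta(\beta_1,\ldots,\beta_n).
\end{align*}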
 
Yes, because one discriminant is ##(\det)^2## times the other!
 