Math Amateur
I am reading Nicholson: Introduction to Abstract Algebra, Section 6.3 Splitting Fields.
Example 1 reads as follows: (see attachment)
--------------------------------------------------------------------------------------------------
Example 1. Find an extension [tex]E \supseteq \mathbb{Z}_2[/tex] in which [tex]f(x) = x^3 + x + 1[/tex] factors completely into linear factors.
--------------------------------------------------------------------------------------------------
The solution reads as follows:
-------------------------------------------------------------------------------------------------
Solution. The polynomial f(x) is irreducible over [tex]\mathbb{Z}_2[/tex] (it has no root in [tex]\mathbb{Z}_2[/tex], since [tex]f(0) = f(1) = 1[/tex]), so
[tex]E = \{ a_0 + a_1 t + a_2 t^2 \ | \ a_i \in \mathbb{Z}_2 , f(t) = 0 \}[/tex]
is a field containing a root t of f(x).
Hence [tex]x + t = x - t[/tex] (the characteristic is 2) is a factor of f(x).
The division algorithm gives [tex]f(x) = (x+t) g(x)[/tex] where [tex]g(x) = x^2 + tx + (1 + t^2)[/tex], so it suffices to show that g(x) also factors completely in E.
Trial and error gives [tex]g(t^2) = 0[/tex], so [tex]g(x) = (x + t^2)(x + v)[/tex] for some [tex]v \in E[/tex].
... ... etc (see attachment)
-------------------------------------------------------------------------------------------------------------
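As a sanity check on the quoted division step, here is a short Python sketch (the 3-bit-mask encoding of [tex]GF(8) = \mathbb{Z}_2[t]/(t^3+t+1)[/tex] is my own convention, not Nicholson's) that multiplies [tex](x+t) g(x)[/tex] back out and compares it with f(x):

```python
# GF(8) = Z_2[t]/(t^3 + t + 1); elements are 3-bit ints, bit i = coefficient of t^i.
def gf8_mul(a, b):
    r = 0
    for i in range(3):          # schoolbook multiply over Z_2 (XOR = addition)
        if (b >> i) & 1:
            r ^= a << i
    for i in (4, 3):            # reduce using t^3 = t + 1, t^4 = t^2 + t
        if (r >> i) & 1:
            r ^= (1 << i) ^ (0b011 << (i - 3))
    return r

def poly_mul(p, q):             # polynomials as coefficient lists over GF(8)
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= gf8_mul(a, b)
    return r

t, t2 = 0b010, 0b100
f = [1, 1, 0, 1]                # x^3 + x + 1
lin = [t, 1]                    # x + t
g = [1 ^ t2, t, 1]              # (1 + t^2) + t x + x^2
print(poly_mul(lin, g) == f)    # True
```

So the quotient [tex]g(x) = x^2 + tx + (1 + t^2)[/tex] checks out coefficient by coefficient.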
My problem is that I cannot show how [tex]g(t^2) = 0[/tex] implies that [tex]g(x) = (x + t^2)(x + v)[/tex] for some [tex]v \in E[/tex].
I would appreciate some help.
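For what it is worth, the claim [tex]g(t^2) = 0[/tex] can be verified by direct computation, and comparing coefficients in [tex](x + t^2)(x + v) = x^2 + tx + (1 + t^2)[/tex] suggests the candidate [tex]v = t + t^2[/tex] (my own calculation, using the same bitmask encoding of GF(8) as a convention):

```python
# GF(8) = Z_2[t]/(t^3 + t + 1); elements are 3-bit ints, bit i = coefficient of t^i.
def gf8_mul(a, b):
    r = 0
    for i in range(3):          # schoolbook multiply over Z_2 (XOR = addition)
        if (b >> i) & 1:
            r ^= a << i
    for i in (4, 3):            # reduce using t^3 = t + 1, t^4 = t^2 + t
        if (r >> i) & 1:
            r ^= (1 << i) ^ (0b011 << (i - 3))
    return r

t = 0b010
t2 = gf8_mul(t, t)              # t^2

def g(x):                       # g(x) = x^2 + t x + (1 + t^2)
    return gf8_mul(x, x) ^ gf8_mul(t, x) ^ 0b001 ^ t2

print(g(t2))                    # 0: t^2 is a root of g
v = t ^ t2                      # candidate second root from comparing x-coefficients
print(g(v))                     # 0: v = t + t^2 is also a root
```

Both values come out to 0, consistent with the book's claim that g splits over E.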
Peter
[Note: This has also been posted on MHF.]