Bashyboy

## Homework Statement

Let ##f(x) = (x-a_1) \cdots (x-a_n) \in k[x]##, where ##k## is a field. Show that ##f(x)## has no repeated roots (i.e., all the ##a_i## are distinct elements of ##k##) if and only if ##\gcd(f,f') = 1##, where ##f'(x)## is the derivative of ##f##.

## Homework Equations

##(x-a)^2 |f(x)## implies ##(x-a)|f'(x)##

##(x-a)|f(x)## and ##(x-a)|f'(x)## implies ##(x-a)^2|f(x)##
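Both relations follow quickly from the product rule; here is a sketch, writing ##f(x) = (x-a)^2 g(x)## for the first and ##f(x) = (x-a)h(x)## for the second (##g## and ##h## are just placeholder names for the remaining factors):

```latex
% First relation: if (x-a)^2 divides f, write f = (x-a)^2 g. Then
f'(x) = 2(x-a)g(x) + (x-a)^2 g'(x) = (x-a)\bigl[\,2g(x) + (x-a)g'(x)\,\bigr],
% so (x-a) divides f'.

% Second relation: if (x-a) divides both f and f', write f = (x-a)h. Then
f'(x) = h(x) + (x-a)h'(x),
% so (x-a) \mid f' forces (x-a) \mid h, and hence (x-a)^2 divides f.
```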

## The Attempt at a Solution

First note that ##f(x) = (x-a_1) \cdots (x-a_n)## has a repeated root if and only if ##(x-a_k)^p## divides ##f(x)## for some ##k \in \{1,\dots,n\}## and some ##p \ge 2##.

Suppose that ##f(x)## has no repeated roots. Then ##f(x) = (x-a_1) \cdots (x-a_n)## is a factorization of ##f(x)## into distinct monic primes of ##k[x]##. Now if it were the case that ##\gcd(f,f') \neq 1##, then ##f## and ##f'## would have a common prime factor. Since every prime factor of ##f(x)## is of the form ##x-a_k## (up to a unit), there would be a ##k## such that ##(x-a_k)## divides ##f'##. But ##(x-a_k)## also divides ##f##, so the second relation cited above implies that ##(x-a_k)^2## divides ##f(x)##, and therefore ##f## has a repeated root, a contradiction. Hence ##\gcd(f,f')## must be ##1##.

Now suppose that ##\gcd(f,f') = 1##. If ##f(x)## had a repeated root, then ##(x-a_k)^2## would divide it for some ##k##. But the first relation cited above would then imply ##(x-a_k) \mid f'(x)##; since ##(x-a_k)## also divides ##f(x)##, this contradicts the fact that ##\gcd(f,f') = 1##.
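As a quick sanity check (not part of the proof), the statement can be tested computationally with SymPy over ##\mathbb{Q}##, using its polynomial `gcd` and `diff`:

```python
from sympy import symbols, expand, diff, gcd

x = symbols('x')

# Distinct roots 1 and 2: gcd(f, f') should be 1.
f = expand((x - 1)*(x - 2))
print(gcd(f, diff(f, x)))   # → 1

# Repeated root at 1: gcd(g, g') should be a nontrivial factor.
g = expand((x - 1)**2*(x - 2))
print(gcd(g, diff(g, x)))   # → x - 1
```

Note that SymPy returns the gcd as a monic polynomial, matching the normalization in the problem statement.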

How does this sound?
