Proving f(x) Divides g(x) iff g(x) in <f(x)>

tinynerdi

Homework Statement


Let F be a field and f(x), g(x) in F[x]. Show that f(x) divides g(x) if and only if g(x) is in <f(x)>.


Homework Equations


Let E be the field F[x]/<f(x)>.


The Attempt at a Solution


(=>) If f(x) divides g(x), then g(x) is in <f(x)>.
Proof: Suppose f(x) divides g(x)q(x). Then g(x)q(x) is in <f(x)>, which is maximal, so <f(x)> is a prime ideal. Hence g(x)q(x) in <f(x)> implies that either g(x) is in <f(x)>, giving f(x) divides g(x), or q(x) is in <f(x)>, giving f(x) divides q(x). But we want g(x) in <f(x)>, giving f(x) divides g(x).

Can this proof go both ways, if it is right?
 
I think this is even simpler than you think. The condition for f(x) to divide g(x) is that there is q(x) in F[x] such that g(x) = q(x)f(x), and this is exactly the condition for g(x) to belong to the ideal generated by f(x).
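For reference, here is a minimal sketch of both directions written out, using only the definition of the principal ideal <f(x)> (no maximality or primality of the ideal is needed):

```latex
% Both directions, assuming only the definition of the principal ideal
% <f(x)> = { f(x)h(x) : h(x) in F[x] } in the commutative ring F[x].
%
% (=>) If f(x) divides g(x), then g(x) = f(x)q(x) for some q(x) in F[x],
%      so g(x) has the form f(x)h(x) and therefore lies in <f(x)>.
% (<=) If g(x) lies in <f(x)>, then g(x) = f(x)h(x) for some h(x) in F[x],
%      which is exactly the statement that f(x) divides g(x).
f(x) \mid g(x)
  \iff \exists\, q(x) \in F[x] :\; g(x) = f(x)\,q(x)
  \iff g(x) \in \{\, f(x)\,h(x) : h(x) \in F[x] \,\} = \langle f(x) \rangle .
```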
 