MHB: If R is a commutative ring, then R[x] is never a field

  • Thread starter: Math Amateur
  • Tags: Field, Ring
Math Amateur
I am reading Joseph Rotman's book Advanced Modern Algebra.

I need help with Problem 2.20 on page 94.

Problem 2.20 reads as follows:

2.20. Prove that if R is a commutative ring then R[x] is never a field.

Could someone please help me get started on this problem?

Peter

***EDIT*** Presumably one shows that there are polynomials that do not have inverses?
 
I have been reflecting on my own post ... and two points come to mind regarding Rotman's Problem 2.20.

Firstly, since $R$ is only assumed to be a commutative ring, we cannot guarantee that the nonzero constant polynomials have inverses ... so on this ground alone $R[x]$ need not be a field ... unless of course $R$ is itself a field, in which case the nonzero constant polynomials do have inverses ... but then ...
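To make this first point concrete (taking $R = \mathbb{Z}$ as one example): the constant polynomial $2$ has no inverse in $\mathbb{Z}[x]$, since $$ 2(a_0 + a_1x + \cdots + a_nx^n) = 1 $$ would force $2a_0 = 1$ in $\mathbb{Z}$, which is impossible.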

Secondly, polynomials such as $$ p(x) = x \text{ or } x^2 \text{ or } x^3 \text{ etc. } $$ would require 'polynomials' like $$ x^{-1} \text{ or } x^{-2} \text{ or } x^{-3} \text{ etc. } $$ and these do not come under the definition of polynomials ...
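One way to make this second point precise, at least under the extra assumption that $R$ is an integral domain (so that degrees of nonzero polynomials add), is a degree count: $$ \deg\big(x \, f(x)\big) = 1 + \deg f(x) \geq 1 > 0 = \deg 1 $$ for every nonzero $f(x) \in R[x]$, so no polynomial can serve as an inverse of $x$. The general case, without that extra assumption, is handled by the coefficient argument in the next post.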

Can someone please confirm that the above analysis presents an answer to Rotman's Problem 2.20? I would appreciate any critique of the "proof".

Peter
 
Claim: $x$ has no inverse in $R[x]$, if $R \neq \{0\}$.

Proof (by contradiction):

Suppose it did, so we have some polynomial:

$a_0 + a_1x + \cdots + a_nx^n \in R[x]$ with:

$x(a_0 + a_1x + \cdots + a_nx^n) = 1$, that is:

$0 + a_0x + a_1x^2 + \cdots + a_nx^{n+1} = 1 + 0x + 0x^2 + \cdots + 0x^{n+1}$.

Equating coefficients (which we can do since $R$ is commutative, and thus we can view $R[x]$ as an $R$-module with basis $\{1, x, x^2, \dots\}$; recall that commutative rings have the invariant basis property), we have:

$a_0 = a_1 = \dots = a_n = 0$,
$0 = 1$,

which means that $R[x]$, and hence $R$, is the trivial ring, contradicting $R \neq \{0\}$.

(and this means, yes, you're right!).
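To spell out how the claim answers the original problem (this last step is only implicit above): if $R[x]$ were a field, then $1 \neq 0$ in $R[x]$, so $R \neq \{0\}$, and $x \neq 0$, so $x$ would have to be a unit, contradicting the claim. And if $R = \{0\}$, then $R[x] = \{0\}$, which is not a field either, since a field must have $1 \neq 0$. Either way, $R[x]$ is never a field.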
 

Thanks, Deveno ... I definitely needed help with the formal proof ... I appreciate the help.

Peter
 
