MHB Demonstrating that a mapping is injective

  • Thread starter: Math Amateur
  • Tags: Injective, Mapping
Summary
The discussion revolves around demonstrating the isomorphism $$ Hom_R(R, X) \cong X $$ as presented in Dummit and Foote's Proposition 28. Participants focus on proving that the mapping $$ \theta(f) = f(1_R) $$ is both injective and surjective. It is established that $$ \theta $$ is injective, since its kernel consists solely of the zero map. Proving surjectivity is harder; the suggestion is to consider, for each element of X, the submodule it generates, and to exhibit a specific homomorphism from R to X whose value at $$ 1_R $$ is that element.
Math Amateur
I am reading Dummit and Foote, Section 10.5 : Exact Sequences - Projective, Injective and Flat Modules.

I am studying Proposition 28 (D&F pages 387 - 388)

In the latter stages of the proof of Proposition 28 we find the following statement (top of page 388):

"In general, $$ Hom_R (R, X) \cong X $$, the isomorphism being given by mapping a homomorphism to its value on the element $$1 \in R $$"

I am having some trouble in demonstrating the isomorphism involved in the relationship $$ Hom_R (R, X) \cong X $$.

To demonstrate the isomorphism I proceeded as follows:

Let $$f, g \in Hom_R (R,X) $$ so $$ f,g : \ R \to X $$

Consider $$ \theta \ : \ Hom_R (R,X) \to X $$

where $$ \theta (f) = f(1_R) $$

To show $$ \theta $$ is a homomorphism we proceed as follows:

$$ \theta (f + g) = (f + g)(1_R) = f(1_R) + g(1_R) $$

$$ = \theta (f) + \theta (g) $$

and

$$ \theta (rf) = (rf)(1_R) = r f(1_R) = r \theta (f) $$ where $$ r \in R $$
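As a side note on the scalar-multiplication step (an assumption on my part, since the thread does not spell out the action): if $$ R $$ is commutative, the usual pointwise action $$ (rf)(s) = r \cdot f(s) $$ makes $$ Hom_R (R,X) $$ an $$ R $$-module, with $$ rf $$ again $$ R $$-linear, and the computation reads

$$ (rf)(1_R) = r \cdot f(1_R) = r \, \theta (f) $$

For noncommutative $$ R $$, $$ \theta $$ is still at least an isomorphism of abelian groups.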

Then I need to show $$ \theta $$ is injective and surjective.

BUT ... I am having problems demonstrating that $$ \theta $$ is injective ... can someone help me with this task?

Note that I suspect one would proceed as follows:

Suppose we have $$ f, g \in Hom_R (R,X) $$ such that:

$$ \theta (f) = \theta (g) $$

Now, by the definition of $$ \theta $$, we have $$ \theta (f) = f(1_R) $$ and $$ \theta (g) = g(1_R) $$.

So $$ f(1_R) = g(1_R) $$ ... but how do we proceed from here to show that f = g?

Hope someone can help.

Peter
 
Suppose $f \in \text{ker }\theta$.

By definition, this means:

$f(1_R) = 0_X$.

Hence, for any $r \in R$, we have:

$f(r) = f(r\cdot1_R) = r\cdot f(1_R) = r\cdot 0_X= 0_X$

which shows that the kernel consists solely of the zero map, and hence that $\theta$ is injective.

It seems to me surjectivity is more of a problem: we need to prove that for every $x \in X$ there IS an $f \in \text{Hom}_R(R,X)$ with:

$f(1_R) = x$. I suggest looking at the submodule generated by $x$.
 
Deveno said:
Suppose $f \in \text{ker }\theta$.

By definition, this means:

$f(1_R) = 0_X$.

Hence, for any $r \in R$, we have:

$f(r) = f(r\cdot1_R) = r\cdot f(1_R) = r\cdot 0_X= 0_X$

which shows that the kernel consists solely of the zero map, and hence that $\theta$ is injective.

It seems to me surjectivity is more of a problem: we need to prove that for every $x \in X$ there IS an $f \in \text{Hom}_R(R,X)$ with:

$f(1_R) = x$. I suggest looking at the submodule generated by $x$.

Thanks Deveno ...

You are certainly right that proving $$ \theta $$ is surjective is the harder part ... I am having trouble proving it, even given your suggestion ...

Can you help?

Peter
 
There is a reason that I suggested the submodule generated by $x$.

Note that if $Y \subseteq X$ is a submodule, then $\text{Hom}_R(R,Y) \subseteq \text{Hom}_R(R,X)$.

The obvious candidate is: $g(r) = r\cdot x$, which maps $R \to \langle x\rangle$.
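Following that hint, here is a sketch of how the candidate finishes the surjectivity argument (this verification is not spelled out in the thread): fix $x \in X$ and define $g(r) = r\cdot x$. Then

$g(r + s) = (r+s)\cdot x = r\cdot x + s\cdot x = g(r) + g(s)$

$g(rs) = (rs)\cdot x = r\cdot(s\cdot x) = r\cdot g(s)$

so $g \in \text{Hom}_R(R,X)$, and

$\theta(g) = g(1_R) = 1_R\cdot x = x$.

Since $x \in X$ was arbitrary, $\theta$ is surjective.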
 
Deveno said:
There is a reason that I suggested the submodule generated by $x$.

Note that if $Y \subseteq X$ is a submodule, then $\text{Hom}_R(R,Y) \subseteq \text{Hom}_R(R,X)$.

The obvious candidate is: $g(r) = r\cdot x$, which maps $R \to \langle x\rangle$.

Thanks Deveno ... reflecting on your post now ...

Peter
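To make the isomorphism concrete, here is a small computational sanity check (a toy example of my own, not from the thread) with $$ R = \mathbb{Z} $$ and $$ X = \mathbb{Z}/6\mathbb{Z} $$: every $$ \mathbb{Z} $$-module homomorphism $$ \mathbb{Z} \to \mathbb{Z}/6\mathbb{Z} $$ is determined by its value at 1, and $$ \theta $$ is a bijection onto $$ X $$.

```python
# Sanity check of Hom_R(R, X) ≅ X in the toy case R = Z, X = Z/6Z.
N = 6  # X = Z/6Z, represented as {0, 1, ..., 5}

def make_hom(x):
    """The Z-module homomorphism Z -> Z/6Z determined by 1 |-> x."""
    return lambda r: (r * x) % N

def theta(f):
    """theta maps a homomorphism to its value at 1 ∈ Z."""
    return f(1)

# Every homomorphism Z -> Z/6Z arises this way, one per element of X.
homs = [make_hom(x) for x in range(N)]

# Each candidate really is additive and Z-linear (checked on a sample).
for f in homs:
    for r in range(-5, 6):
        for s in range(-5, 6):
            assert f(r + s) == (f(r) + f(s)) % N   # additivity
            assert f(r * s) == (r * f(s)) % N      # Z-linearity

# theta is a bijection Hom_Z(Z, Z/6Z) -> Z/6Z.
assert sorted(theta(f) for f in homs) == list(range(N))
print("theta is a bijection onto Z/%d" % N)
```

The check mirrors the thread's argument: injectivity because $$ f(r) = r \cdot f(1) $$ forces $$ f $$ to be determined by $$ f(1) $$, surjectivity because $$ g(r) = r \cdot x $$ hits any given $$ x $$.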
 