MHB Are Hom_R(R, M) and M Isomorphic?

Hey! :o

Let $R$ be a commutative ring and $M$ an $R$-module.
I want to show that $\text{Hom}_R(R,M)\cong M$.

I have done the following:

We consider the mapping $\Phi : \text{Hom}_R(R,M)\rightarrow M$ with $f\mapsto f(1_R)$.

Let $f,g\in \text{Hom}_R(R,M)$ and $a\in R$.
We have that
$$\Phi (f+g)=(f+g)(1_R)=f(1_R)+g(1_R)=\Phi (f)+\Phi (g) \\ \Phi (af)=(af)(1_R)=af(1_R)=a\Phi (f)$$
So, $\Phi$ is an $R$-module homomorphism.

Let $\Phi (f)=\Phi (g)$. Then $f(1_R)=g(1_R)$.
So, $f(r)=f(r\cdot 1_R)=rf(1_R)=rg(1_R)=g(r\cdot 1_R)=g(r), \forall r\in R$, i.e., $f=g$.
Therefore, $\Phi$ is 1-1.

For each $y\in M$ we define $f: R\rightarrow M$ by $f(r)=ry$.
This $f$ is $R$-linear, since $f(r+s)=(r+s)y=ry+sy=f(r)+f(s)$ and $f(ar)=(ar)y=a(ry)=af(r)$, so $f\in \text{Hom}_R(R,M)$.
Then $\Phi (f)=f(1_R)=1_R\cdot y=y$.
Therefore, $\Phi$ is onto.

Is everything correct? (Wondering)
 
It's all correct! (Nod)
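
If you ever want a quick computational sanity check of this, here is a small brute-force Python sketch. The choice $R = M = \mathbb{Z}/6\mathbb{Z}$ (with the module action being multiplication mod $6$) is just my own toy example, not part of your proof: it enumerates every $R$-linear map $R\to M$ and confirms that evaluation at $1_R$ is a bijection onto $M$.

[CODE=python]
# Brute-force check of Phi : Hom_R(R, M) -> M, f |-> f(1_R),
# for the toy example R = M = Z/6Z (module action = multiplication mod 6).
from itertools import product

n = 6  # work modulo 6

def is_R_linear(f):
    """f is a tuple with f[a] = image of a in M; check additivity and R-homogeneity."""
    additive = all(f[(a + b) % n] == (f[a] + f[b]) % n for a in range(n) for b in range(n))
    homogeneous = all(f[(r * a) % n] == (r * f[a]) % n for r in range(n) for a in range(n))
    return additive and homogeneous

# enumerate all set maps R -> M and keep only the R-linear ones
homs = [f for f in product(range(n), repeat=n) if is_R_linear(f)]

images_of_one = [f[1] for f in homs]             # Phi(f) = f(1_R)
assert len(homs) == n                            # |Hom_R(R, M)| = |M|
assert sorted(images_of_one) == list(range(n))   # Phi is a bijection onto M
print("Phi(f) = f(1) gives a bijection Hom_R(R, M) -> M for R = M = Z/6Z")
[/CODE]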
 
Great! (Smile)

I also have another question... Does it hold that $$\text{Hom}_R(R^n,M)\cong \prod_{I=0}^n \text{Hom}_R(R,M)$$ ? (Wondering)
 
Almost. You need to change the lower limit $I = 0$ to $i = 1$, i.e., the product should be $\prod_{i = 1}^n \operatorname{Hom}_R(R,M)$.

I will use the symbol $\operatorname{Hom}_R(R,M)^n$ to represent $\prod_{i = 1}^n \operatorname{Hom}_R(R,M)$. For $1 \le j \le n$, let $e_j\in R^n$ be the element with $1$ in the $j$th coordinate and zeros elsewhere; let $i_j : R \to R^n$ be the $j$th inclusion mapping $r \mapsto re_j$. Since the inclusion mappings are $R$-linear, there is an $R$-linear mapping

$$\Phi : \operatorname{Hom}_R(R^n,M) \to \operatorname{Hom}_R(R,M)^n$$

given by $\Phi(f) = (f\circ i_1,\ldots, f\circ i_n)$. I claim that the mapping

$$\Psi : \operatorname{Hom}_R(R,M)^n \to \operatorname{Hom}_R(R^n,M)$$

defined by the equation $\Psi(g_1,\ldots, g_n) = g_1\circ \pi_1 + \cdots +g_n\circ \pi_n$ is the inverse of $\Phi$. Here, $\pi_i : R^n \to R$ ($1 \le i \le n$) is the projection mapping $\pi_i(r_1,\ldots, r_n) = r_i$. It's important to note that $\pi_i \circ i_j$ is zero for $i \neq j$ and $\operatorname{id}_R$ for $i = j$. Indeed, for all $r\in R$, $(\pi_i \circ i_j)(r) = \pi_i(re_j) = r\delta_{ij}$, which is equal to $0$ if $i \neq j$ and $r$ when $i = j$. Now

$$\Phi(\Psi(g_1,\ldots, g_n)) = \Phi(g_1\circ \pi_1 + \cdots + g_n\circ \pi_n) = \left(\sum_{k = 1}^n g_k \circ \pi_k \circ i_1, \ldots, \sum_{k = 1}^n g_k \circ \pi_k \circ i_n\right) = \left(\sum_{k = 1}^n g_k\delta_{1k},\ldots, \sum_{k = 1}^n g_k \delta_{nk}\right) = (g_1,\ldots, g_n)$$

and

$$\Psi(\Phi(f)) = \Psi(f\circ i_1,\ldots, f\circ i_n) = f\circ i_1\circ \pi_1 + \cdots + f\circ i_n \circ \pi_n = f\circ (i_1 \circ \pi_1 + \cdots + i_n \circ \pi_n) = f$$

where the last identity follows from the fact that $i_1 \circ \pi_1 + \cdots + i_n \circ \pi_n$ is the identity on $R^n$:

$$(i_1 \circ \pi_1 + \cdots + i_n \circ \pi_n)(r_1,\ldots, r_n) = i_1(r_1) + \cdots + i_n(r_n) = r_1 e_1 + \cdots + r_n e_n = (r_1,\ldots, r_n)$$

Therefore $\Psi$ is the inverse of $\Phi$ and $\Phi$ is a bijection. Since $\Phi$ is also $R$-linear, it is an $R$-module isomorphism.
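
If it helps to see the bookkeeping with the $i_j$ and $\pi_i$ concretely, here is a brute-force Python sketch for the toy case $R = M = \mathbb{Z}/3\mathbb{Z}$ and $n = 2$ (my own choice of a small example, nothing more): it enumerates $\operatorname{Hom}_R(R,M)$ and $\operatorname{Hom}_R(R^2,M)$ directly and checks that $\Phi$ and $\Psi$, defined exactly as above, are mutually inverse.

[CODE=python]
# Brute-force check that Phi(f) = (f o i_1, f o i_2) and
# Psi(g1, g2) = g1 o pi_1 + g2 o pi_2 are mutually inverse,
# for the toy case R = M = Z/3Z and n = 2.
from itertools import product

p = 3
R = list(range(p))
R2 = list(product(R, R))                         # elements of R^2 as pairs

def linear_R_to_M(f):                            # f : dict R -> M
    return all(f[(a + b) % p] == (f[a] + f[b]) % p for a in R for b in R) and \
           all(f[(r * a) % p] == (r * f[a]) % p for r in R for a in R)

def linear_R2_to_M(f):                           # f : dict R^2 -> M
    add = lambda x, y: ((x[0] + y[0]) % p, (x[1] + y[1]) % p)
    smul = lambda r, x: ((r * x[0]) % p, (r * x[1]) % p)
    return all(f[add(x, y)] == (f[x] + f[y]) % p for x in R2 for y in R2) and \
           all(f[smul(r, x)] == (r * f[x]) % p for r in R for x in R2)

# enumerate all set maps and keep only the R-linear ones
Hom_R_M  = [f for f in (dict(zip(R, v))  for v in product(R, repeat=len(R)))  if linear_R_to_M(f)]
Hom_R2_M = [f for f in (dict(zip(R2, v)) for v in product(R, repeat=len(R2))) if linear_R2_to_M(f)]

def Phi(f):                                      # f |-> (f o i_1, f o i_2), where i_j(r) = r*e_j
    return ({r: f[(r, 0)] for r in R}, {r: f[(0, r)] for r in R})

def Psi(g1, g2):                                 # (g1, g2) |-> g1 o pi_1 + g2 o pi_2
    return {(a, b): (g1[a] + g2[b]) % p for (a, b) in R2}

assert all(Psi(*Phi(f)) == f for f in Hom_R2_M)                               # Psi o Phi = id
assert all(Phi(Psi(g1, g2)) == (g1, g2) for g1 in Hom_R_M for g2 in Hom_R_M)  # Phi o Psi = id
print("Phi and Psi are mutually inverse for R = M = Z/3Z, n = 2")
[/CODE]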
 