mattmns
I have this question, just curious if what I did is correct.
----
Let A and R be rings, and let f:A->R be a function satisfying:
f(a+b) = f(a)+f(b) and f(ab) = f(a)f(b).
Prove that if f is surjective, then f(1) = 1.
-----
Note: I will use * for multiplication where factors would otherwise just be written next to each other.
Also, note that A and R are commutative rings, both of which have a multiplicative identity 1.
First, R is a ring, so 1 is in R. Because f is surjective, there must be some a in A with f(a) = 1. Then 1 = f(a) = f(a*1) = f(a)f(1) = 1*f(1) = f(1), so f(1) = 1.
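Written out as one chain (the same steps, just set in LaTeX, with the reason for each equality on the right; nothing new here):

\begin{aligned}
1 &= f(a)          && \text{choice of } a \text{ (surjectivity)} \\
  &= f(a \cdot 1)  && a = a \cdot 1 \text{ in } A \\
  &= f(a)\,f(1)    && f(ab) = f(a)f(b) \\
  &= 1 \cdot f(1)  && f(a) = 1 \\
  &= f(1),
\end{aligned}

so f(1) = 1.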
That looks sufficient to me, but I don't like it when I don't use something I could have (i.e., the additive part of the hypothesis). Does this look correct? Thanks.