Prove infinitely many left inverses

Thread starter: vintwc

Homework Statement



Let V be a vector space over K. Let L(V) be the set of all linear maps V->V. Prove that L(V) is a ring under the operations:
f+g:x -> f(x)+g(x) and fg:x -> f(g(x))

Now, let V=U+W be the direct sum of two vector spaces over K such that the dimensions of U and W are both countable. Then V has countable dimension. Choosing a linear bijection between V and U gives us an element f:V->U of L(V). Prove that there are infinitely many x \in R = L(V) such that xf=1_R. Prove that there is no y \in R such that fy=1_R.

Homework Equations


Direct sum of two vector spaces U and W is the set U+W of pairs of vectors (u,w) in U and W with operations:
a(u,w)+b(u',w')=(au+bu',aw+bw')

The Attempt at a Solution


For the first bit, I managed to show that L(V) is indeed a ring. For the second part, I'm not sure how to approach the problem. Should I define a function x such that xf=1_R? Also, does "linear bijection" essentially mean an isomorphism?
 
OK, so a linear bijection is an isomorphism. I define f(v_1,v_2)=(u(v_1,v_2),0), but I'm still not sure how to proceed from there.
 
Pick a basis {u_i} for U and a basis {w_i} for W. A basis {v_i} for V is then the union of the two, and the two are disjoint. That's really what direct sum means in this case; you don't have to worry about the ordered-pair definition business. Let's also pick the bases so that f(v_i)=u_i is your bijection. Do you see it now? In the case xf=1_R, f maps everything 1-1 onto U. To undo that, you just have to make sure everything in U goes back to the corresponding vector in V. What you define x to be on the elements of W doesn't matter (hence the infinite number). Can you see why fy can't be 1_R?
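To spell out that hint a little (a sketch only, using the bases {u_i}, {w_i}, {v_i} and the bijection f(v_i)=u_i chosen above; the sequence of vectors c_i is an arbitrary choice introduced here for illustration):

```latex
% For each choice of vectors c_1, c_2, \dots \in V, define x_c on the basis of V by
x_c(u_i) = v_i, \qquad x_c(w_i) = c_i,
% and extend linearly. Then for every basis vector v_i of V,
(x_c f)(v_i) = x_c(u_i) = v_i,
% so x_c f = 1_R. Distinct choices of the c_i give distinct maps x_c,
% hence infinitely many left inverses of f.
% For the other direction: for any y \in R, the image of fy satisfies
\operatorname{im}(fy) \subseteq \operatorname{im}(f) = U \neq V,
% while 1_R is surjective onto V, so fy = 1_R is impossible.
```

The key asymmetry is that f is injective but not surjective, which is only possible because V is infinite-dimensional.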
 