Dimension of Hom_K(U,V) and Basis of the Vector Space

mathmathmad

Homework Statement


Let U and V be vector spaces of dimensions n and m over K, and let Hom_K(U,V) be the vector space over K of all linear maps from U to V. Find the dimension and describe a basis of Hom_K(U,V). (You may find it helpful to use the correspondence with m×n matrices over K.)


Homework Equations





The Attempt at a Solution


Is Hom_K(U,V) a matrix that maps U to V?
I don't get what Hom_K(U,V) is...
 
Hom_K(U,V) isn't 'a' matrix mapping U to V. It's the set of ALL linear maps from U to V (equivalently, once you fix bases, all m×n matrices over K). The subscript K indicates the field the scalars come from (real, complex, etc.).
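For concreteness, here is a sketch of the correspondence with matrices that the problem statement hints at (the bases below are an arbitrary choice, not part of the problem's data). Fix bases $u_1, \ldots, u_n$ of $U$ and $v_1, \ldots, v_m$ of $V$. A linear map $T \colon U \to V$ is completely determined by its values on the basis of $U$:

T(u_j) = \sum_{i=1}^{m} a_{ij} v_i, \qquad a_{ij} \in K, \quad j = 1, \ldots, n.

The array $(a_{ij})$ is the m×n matrix of $T$ with respect to these bases, and conversely every m×n matrix over K defines exactly one such linear map.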
 
How does one describe the basis in this case?
 
vintwc said:
How does one describe the basis in this case?

Pick a basis {u_1, ..., u_n} for U and a basis {v_1, ..., v_m} for V and think about how to make a simple set of independent linear transformations whose span is all linear transformations.
 
is the dimension of Hom_K(U,V) = m×n?
 
mathmathmad said:
is the dimension of Hom_K(U,V) = m×n?

Yes. Can you show that by producing a basis?
 
^ no... can you please show me how?
 
mathmathmad said:
^ no... can you please show me how?

Describe SOME linear transformation from U->V in terms of the basis. ANY one.
 
I am trying to prove this without using matrix form. I am using the transformation:
$T_{ij}$ ($i = 1, \ldots, n$ and $j = 1, \ldots, m$) as the linear transformation that does the following: $u_i \mapsto v_j$ and $u_k \mapsto 0$ for $k \neq i$.
However I do not know where to start in proving these transformations are linearly independent. I am used to dealing with vectors and doing this kind of thing with a vector space of linear transformations is throwing me off.
 
Ok, so linear independence means that the sum over all i and j of a_ij*T_ij = 0 implies ALL a_ij = 0, right? Suppose a_KL is not zero. Put u_K into the transformation sum(a_ij*T_ij). That turns it into sum(a_ij*T_ij(u_K)). What does that look like if you simplify it to a single sum over j?
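Spelling that simplification out (a sketch in the notation above, with $u_K$ the basis vector singled out in the hint):

\Big( \sum_{i=1}^{n} \sum_{j=1}^{m} a_{ij} T_{ij} \Big)(u_K) \;=\; \sum_{j=1}^{m} a_{Kj}\, T_{Kj}(u_K) \;=\; \sum_{j=1}^{m} a_{Kj}\, v_j \;=\; 0,

because $T_{ij}(u_K) = 0$ whenever $i \neq K$. Since $v_1, \ldots, v_m$ are linearly independent, $a_{K1} = \cdots = a_{Km} = 0$, and as $K$ ranges over $1, \ldots, n$ this forces all $a_{ij} = 0$.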
 
Excellent, I understand much better how to work with linear combinations of these transformations. I get it now. Thanks
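For the spanning half of the basis argument, a sketch in the same notation: given any linear map $S \colon U \to V$, write $S(u_i) = \sum_{j=1}^{m} b_{ij} v_j$ with $b_{ij} \in K$. Then

S \;=\; \sum_{i=1}^{n} \sum_{j=1}^{m} b_{ij} T_{ij},

since both sides agree on every basis vector $u_i$ of $U$. So the $nm$ transformations $T_{ij}$ span Hom_K(U,V); together with the independence argument above they form a basis, giving $\dim_K \operatorname{Hom}_K(U,V) = mn$.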
 
Oh, one last question: if I wanted to use Einstein summation notation here, could I just leave off both sum symbols? I.e., is
\alpha_{ij} T_{ij} = 0
the same as
\sum_{i=1}^{n} \sum_{j=1}^{m} \alpha_{ij} T_{ij} = 0
using the Einstein summation convention?
 
LogicalTime said:
Oh, one last question: if I wanted to use Einstein summation notation here, could I just leave off both sum symbols? I.e., is
\alpha_{ij} T_{ij} = 0
the same as
\sum_{i=1}^{n} \sum_{j=1}^{m} \alpha_{ij} T_{ij} = 0
using the Einstein summation convention?

Right. That's actually what I was writing. I put the 'sum' in to make sure the sums were understood. Be sure you say you are using the Einstein summation convention, though. That's not automatically understood.
 
cool, thanks again!
 