Dimension of Hom_K(U,V) and a Basis of the Vector Space

SUMMARY

The dimension of the vector space Hom_K(U,V), the space of all linear maps from a vector space U of dimension n to a vector space V of dimension m over a field K, is mn. This follows from the correspondence between Hom_K(U,V) and the m×n matrices over K. A basis can be constructed from the linear transformations T_ij, each of which sends one chosen basis vector of U to one chosen basis vector of V and every other basis vector of U to zero. Their linear independence is established by showing that any linear combination of them equal to the zero transformation must have all coefficients equal to zero.

PREREQUISITES
  • Understanding of vector spaces and their dimensions
  • Familiarity with linear transformations and their properties
  • Knowledge of matrix representation of linear maps
  • Concept of linear independence in vector spaces
NEXT STEPS
  • Study the properties of linear transformations in vector spaces
  • Learn about the Einstein summation convention and its applications
  • Explore the relationship between linear maps and matrices in detail
  • Investigate examples of bases for Hom_K(U,V) with specific vector spaces
USEFUL FOR

Mathematics students, particularly those studying linear algebra, educators teaching vector spaces, and researchers interested in linear transformations and their applications.

mathmathmad

Homework Statement


Let U and V be vector spaces of dimensions n and m over K, and let Hom_K(U,V) be the vector space over K of all linear maps from U to V. Find the dimension and describe a basis of Hom_K(U,V). (You may find it helpful to use the correspondence with m×n matrices over K.)


Homework Equations





The Attempt at a Solution


is Hom_K(U,V) a matrix that maps U to V?
I don't get what Hom_K(U,V) is...
 
Hom_K(U,V) isn't 'a' matrix mapping U to V. It's the set of ALL linear maps from U to V, each of which corresponds to a matrix once you choose bases. K indicates the field the entries in the matrices come from (i.e. real, complex, etc.).
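A sketch of that correspondence, assuming chosen bases $u_1, \ldots, u_n$ of U and $v_1, \ldots, v_m$ of V (the problem doesn't single out particular ones): a linear map $T : U \to V$ is determined by its values on the basis of U,
$$T(u_j) = \sum_{i=1}^{m} c_{ij}\, v_i, \qquad j = 1, \ldots, n,$$
and the array $C = (c_{ij})$ is an $m \times n$ matrix over K. Conversely, every such matrix defines a linear map, and sums and scalar multiples of maps match sums and scalar multiples of matrices.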
 
How does one describe the basis in this case?
 
vintwc said:
How does one describe the basis in this case?

Pick a basis {u1,...,un} for U and a basis {v1,...,vm} for V and think about how to make a simple set of independent linear transformations whose span is all linear transformations.
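For a concrete low-dimensional sketch (taking n = m = 2 purely as an illustration, with bases $u_1, u_2$ and $v_1, v_2$): the four linear maps determined by
$$u_1 \mapsto v_1,\ u_2 \mapsto 0; \qquad u_1 \mapsto v_2,\ u_2 \mapsto 0; \qquad u_2 \mapsto v_1,\ u_1 \mapsto 0; \qquad u_2 \mapsto v_2,\ u_1 \mapsto 0$$
correspond, in these bases, to the four $2 \times 2$ matrices that have a single entry 1 and all other entries 0, and every $2 \times 2$ matrix over K is a unique linear combination of those four.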
 
is the dimension of Hom_K(U,V) = mxn?
 
mathmathmad said:
is the dimension of Hom_K(U,V) = mxn?

Yes. Can you show that by producing a basis?
 
^ no... can you please show me how?
 
mathmathmad said:
^ no... can you please show me how?

Describe SOME linear transformation from U->V in terms of the basis. ANY one.
 
I am trying to prove this without using matrix form. I am using the transformation:
$T_{ij}$ (for $i = 1, \ldots, n$ and $j = 1, \ldots, m$), the linear transformation that does the following: $u_i \mapsto v_j$ and $u_k \mapsto 0$ for $k \neq i$.
However I do not know where to start in proving these transformations are linearly independent. I am used to dealing with vectors and doing this kind of thing with a vector space of linear transformations is throwing me off.
 
Ok, so linear independence means that the sum over all i and j of a_ij*T_ij = 0 implies ALL a_ij = 0, right? Suppose a_KL is not zero. Put u_K into the transformation sum(a_ij*T_ij). That turns it into sum(a_ij*T_ij(u_K)). What does that look like if you simplify it to a single sum over j?
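Carrying that hint one step further (a sketch, using the fact that $T_{ij}(u_K) = 0$ whenever $i \neq K$ and $T_{Kj}(u_K) = v_j$):
$$\sum_{i=1}^{n} \sum_{j=1}^{m} a_{ij}\, T_{ij}(u_K) = \sum_{j=1}^{m} a_{Kj}\, v_j = 0,$$
and since $v_1, \ldots, v_m$ are linearly independent, every $a_{Kj}$ must be zero. Running K over $1, \ldots, n$ forces all coefficients to vanish, so the $T_{ij}$ are linearly independent.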
 
Excellent, I understand much better how to work with linear combinations of these transformations. I get it now. Thanks
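For completeness, a sketch of the spanning step, which is the other half of the basis claim: any linear map $T : U \to V$ is determined by its values on the basis of U, say
$$T(u_i) = \sum_{j=1}^{m} a_{ij}\, v_j, \qquad i = 1, \ldots, n,$$
and then $T = \sum_{i=1}^{n} \sum_{j=1}^{m} a_{ij} T_{ij}$, since both sides agree on every basis vector $u_i$. Together with linear independence this makes $\{T_{ij}\}$ a basis, so $\dim \operatorname{Hom}_K(U,V) = mn$.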
 
Oh, one last question: if I wanted to use Einstein summation notation here, could I just leave off both sum symbols?
i.e. is
$\alpha_{ij} T_{ij} = 0$
the same as
$\sum_{i=1}^{n} \sum_{j=1}^{m} \alpha_{ij} T_{ij} = 0$
using Einstein summation notation?
 
LogicalTime said:
Oh, one last question: if I wanted to use Einstein summation notation here, could I just leave off both sum symbols?
i.e. is
$\alpha_{ij} T_{ij} = 0$
the same as
$\sum_{i=1}^{n} \sum_{j=1}^{m} \alpha_{ij} T_{ij} = 0$
using Einstein summation notation?

Right. That's actually what I was writing. I put the 'sum' in to make sure the sums were understood. Be sure you say you are using the Einstein summation convention, though. That's not automatically understood.
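Spelled out as a notational sketch: with the summation convention in force, an index that appears twice in a single term is summed over its range, so $\alpha_{ij} T_{ij}$ abbreviates $\sum_{i=1}^{n} \sum_{j=1}^{m} \alpha_{ij} T_{ij}$. The ranges $i = 1, \ldots, n$ and $j = 1, \ldots, m$ still have to be stated somewhere, since the shorthand itself does not carry them.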
 
cool, thanks again!
 
