jimmypoopins
Homework Statement
Suppose that V and W are finite dimensional and that U is a subspace of V. Prove that there exists T \in L(V,W) such that null T = U if and only if dim U \geq dim V - dim W.
Homework Equations
thm: If T \in L(V,W), then range T is a subspace of W.
thm: If V is a finite-dimensional vector space and T \in L(V,W), then range T is a finite-dimensional subspace of W and dim V = dim null T + dim range T.
The Attempt at a Solution
forward direction: by the first thm, range T is a subspace of W, which implies that
dim range T \leq dim W.
by thm, dim V = dim null T + dim range T
dim V = dim U + dim range T (since U = null T)
dim V - dim range T = dim U
dim V - dim W \leq dim U, since dim range T \leq dim W.
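To check that I have the inequality pointing the right way, here is the whole argument written as one chain:

\[
\dim U = \dim \operatorname{null} T = \dim V - \dim \operatorname{range} T \geq \dim V - \dim W.
\]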
I think the forward direction is good. Comments?
backward direction:
we have dim V - dim W \leq dim U. Let (u_{1},...,u_{n}) be a basis for U and extend it to a basis (u_{1},...,u_{n},u_{n+1},...,u_{m}) for V. Then dim U = n and dim V = m, and any v \in V can be written as a_{1}u_{1}+...+a_{m}u_{m}.
I think I'm headed in the right direction but I'm confused about what to do next. Since dim V - dim W \leq dim U, I want to say that dim W is greater than or equal to m - n, but I don't know how to define T so that null T = U. If I make T(u_{i}) = 0 for each u_{i} in the basis of U, then U \subseteq null T, but how does that relate to the condition dim V - dim W \leq dim U?
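Here is the tentative definition I have in mind, assuming the hypothesis really does give dim W \geq m - n, so that I can pick linearly independent vectors w_{1},...,w_{m-n} in W:

\[
T\!\left(\sum_{i=1}^{m} a_{i} u_{i}\right) = \sum_{i=n+1}^{m} a_{i} w_{i-n},
\]

i.e. T(u_{i}) = 0 for i \leq n and T(u_{i}) = w_{i-n} for i > n, extended linearly to all of V. I think null T = U would then follow because the w_{i-n} are linearly independent, but I'm not sure this is the intended way to use dim V - dim W \leq dim U.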
thanks.