# A subspace question

Let V be a vector space over a field F, and M a subspace of V, where M is not {0}. I need to show there exists a basis for V such that none of its elements belong to M.

Since M is a subspace of V, M must be a subset of V. If M = V, then no such basis exists, so M must be a proper subset of V. Hence there exists at least one element b1 of V which is not in M; since 0 is in M, b1 is nonzero, so the set {b1} is independent.

Now consider the span [{b1}], which is a subspace of V. We have two options. If [{b1}] = V, then {b1} is a basis, and we proved what we had to. If not, then [{b1}] is a proper subspace of V, and (here's the tricky part) there exists (?) at least one element b2 from V \ (M U [{b1}]). If I could prove the existence, I'd know how to carry on. I tried to assume the opposite: there is no element in V \ (M U [{b1}]), which would mean V = M U [{b1}], i.e. every vector of V lies in M or in [{b1}]. I then tried to argue that {b1} would have to be a basis for V, which it is not, but I'm not sure the contradiction actually goes through (?).
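One way to get the existence of b2 directly, using only that M and [{b1}] are proper subspaces (a sketch; this argument is not what the replies below end up using):

```latex
\text{Case 1: } M \subseteq [\{b_1\}]. \text{ Then } M \cup [\{b_1\}] = [\{b_1\}] \neq V, \\
\text{so any } b_2 \in V \setminus [\{b_1\}] \text{ works.} \\[4pt]
\text{Case 2: there is } m \in M \setminus [\{b_1\}]. \text{ Put } b_2 = m + b_1. \\
b_2 \in M \Rightarrow b_1 = b_2 - m \in M, \text{ contradicting } b_1 \notin M; \\
b_2 \in [\{b_1\}] \Rightarrow m = b_2 - b_1 \in [\{b_1\}], \text{ contradicting the choice of } m. \\
\text{Hence } b_2 \in V \setminus (M \cup [\{b_1\}]).
```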

Directions would be appreciated, thanks in advance.

Interesting that your initial statement doesn't include M=/=V, since that's obviously a requirement.

Even if you find a b2, you have to show it's linearly independent of b1. I wouldn't go the contradiction route, since it seems like far too much work.

You know that if M does not equal V, then dimM < dimV, right (strictly less than)? So you can construct a basis of M, and extend it to be a basis of V. The vectors in the new basis that were just added are not elements of M, which is important. So say the basis of M is {$$m_1, m_2, \ldots, m_k$$} and the basis of V is {$$m_1, m_2, \ldots, m_k, v_1, \ldots, v_n$$}. Given this, can you find a way of replacing the m's with vectors that aren't in M (which has to be closed under addition, a big hint) such that the set is still a basis?

Yeah, with the hint above, I have an idea:

Suppose dimM < dimV and M has a basis {m1, m2, ..., mk}. Extend it by vectors v1, v2, ..., vn so that {m1, m2, ..., mk, v1, v2, ..., vn} becomes a basis of V. Of course, v1, v2, ..., vn are not in M: if some vj were in M, it would be a linear combination of m1, ..., mk, contradicting the independence of the basis.

Define u1 = m1 + v1, u2 = m2 + v1, ..., uk = mk + v1

By using the definition, we can prove the vector system {u1, u2, ..., uk, v1, v2, ..., vn} is still independent in V; since it has k + n = dimV elements, it is a basis of V.
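Spelling out that independence check (same notation as the posts above):

```latex
\text{Suppose } a_1 u_1 + \dots + a_k u_k + b_1 v_1 + \dots + b_n v_n = 0. \\
\text{Substituting } u_i = m_i + v_1: \\
a_1 m_1 + \dots + a_k m_k + (a_1 + \dots + a_k + b_1)\, v_1 + b_2 v_2 + \dots + b_n v_n = 0. \\
\text{Since } \{m_1, \dots, m_k, v_1, \dots, v_n\} \text{ is a basis, all coefficients vanish:} \\
a_1 = \dots = a_k = 0, \quad \text{hence } b_1 = 0, \quad \text{and } b_2 = \dots = b_n = 0.
```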

Moreover, no element of this new basis is in M: each ui = mi + v1, and if ui were in M, then v1 = ui - mi would also be in M (M is a subspace, hence closed under addition and subtraction), a contradiction; and the vj are not in M as noted above. The proof is complete. :)
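A concrete sanity check of this construction (my own toy example in R^3, not from the thread), using NumPy:

```python
import numpy as np

# Take V = R^3 and M = span{m1} with m1 = e1.
# Extend {m1} to a basis {m1, v1, v2} of R^3 with v1 = e2, v2 = e3.
m1 = np.array([1.0, 0.0, 0.0])
v1 = np.array([0.0, 1.0, 0.0])
v2 = np.array([0.0, 0.0, 1.0])

# Replace m1 by u1 = m1 + v1, as in the post above.
u1 = m1 + v1

# {u1, v1, v2} is a basis of R^3 iff the matrix with these columns
# has full rank.
new_basis = np.column_stack([u1, v1, v2])
print(np.linalg.matrix_rank(new_basis))  # 3

# u1 is not in M = span{e1}: its second coordinate is nonzero.
print(u1)  # [1. 1. 0.]
```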

Office_Shredder said:
Interesting that your initial statement doesn't include M=/=V, since that's obviously a requirement.

It's not interesting, since it isn't my statement.

Office_Shredder said:
Even if you find a b2, you have to show it's linearly independent of b1.

Well, if b2 is not in [{b1}], then it is independent of b1, since there doesn't exist some a from F such that a*b1 = b2.
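For completeness, the full check needs one small extra ingredient, namely b1 ≠ 0 (which holds because 0 is in M while b1 is not):

```latex
\text{Suppose } a\, b_1 + c\, b_2 = 0 \text{ with } a, c \in F. \\
\text{If } c \neq 0: \quad b_2 = -(a/c)\, b_1 \in [\{b_1\}], \text{ contradicting } b_2 \notin [\{b_1\}]. \\
\text{If } c = 0: \quad a\, b_1 = 0 \text{ and } b_1 \neq 0 \text{ imply } a = 0. \\
\text{Hence } \{b_1, b_2\} \text{ is linearly independent.}
```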

Thank you both for the other hints, I'll think about it.

Well, if b2 is not in [{b1}], then it is independent of b1, since there doesn't exist some a from F such that a*b1 = b2.

That's true now that I think about it :rofl:

And by "your statement", I meant the statement that you had, i.e. the one someone gave you.