1. The problem statement, all variables and given/known data

Prove the following: Let V be a vector space and assume there is an integer n such that if (v_1, . . . , v_k) is a linearly independent sequence from V, then k ≤ n. Prove that if (v_1, . . . , v_k) is a maximal linearly independent sequence from V, then (v_1, . . . , v_k) spans V and is therefore a basis.

2. Relevant equations

3. The attempt at a solution

If (v_1, . . . , v_k) spans V, then every vector in V is generated by some linear combination of v_1, . . . , v_k, so it suffices to show that every vector of V is such a combination. (In particular, each v_i itself is clearly in the span: set its coefficient to 1 and the others to 0.) Now, since the sequence is maximal linearly independent, appending any further vector w from V provokes a dependency. That dependency did not exist before w was added, so w must appear in it with a nonzero coefficient, and solving for w writes it as a linear combination of v_1, . . . , v_k. Since this holds for every vector w in V, every vector can be written as a linear combination of the given sequence, and therefore it spans V. The sequence is linearly independent and it spans V, so it's a basis.

Is that correct?
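The step "the dependency didn't exist before the new vector was added, so we can solve for it" is where the argument really lives, so it may be worth writing out explicitly. A sketch in LaTeX (the scalar names a_1, . . . , a_k, b are my own):

```latex
Since $(v_1,\dots,v_k,w)$ is linearly dependent, there exist scalars
$a_1,\dots,a_k,b$, not all zero, with
\[
  a_1 v_1 + \cdots + a_k v_k + b\,w = 0 .
\]
If $b = 0$, this would be a nontrivial dependency among $v_1,\dots,v_k$
alone, contradicting their linear independence; hence $b \neq 0$ and
\[
  w = -\frac{a_1}{b}\,v_1 - \cdots - \frac{a_k}{b}\,v_k ,
\]
so $w$ lies in the span of $(v_1,\dots,v_k)$.
```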