I need someone to check my proof of the proposition below, which we got for homework. Thanks in advance!
Proposition. Let V be a vector space over a field F, and let S = \left\{a_{1}, \cdots, a_{k}\right\}\subset V with k\geq 2. If the set S is linearly dependent, a_{1} \neq 0, and we take the elements of S in the listed order, then there exists at least one element of S that can be written as a linear combination of its predecessors.
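Just to convince myself the statement is plausible, here's a toy example of my own (not from the homework): in V=\mathbb{R}^{2} over F=\mathbb{R}, take a_{1}=(1,0), a_{2}=(0,1), a_{3}=(1,1). This set is linearly dependent, a_{1}\neq 0, and indeed a_{3}=a_{1}+a_{2}, so a_{3} is a linear combination of its predecessors.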
Proof [?]. If S is dependent, then there exists an element of S that can be written as a linear combination of the remaining vectors of S, say
a_{j+1}=\alpha_{1}a_{1}+\cdots+\alpha_{j}a_{j}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}. (*)
Further, let's assume that the vector a_{j} can be written as
a_{j}=\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1}.
Plugging a_{j} into equation (*), we get
a_{j+1}=\alpha_{1}a_{1}+\cdots+\alpha_{j}(\beta_{1}a_{1}+\cdots+\beta_{j-1}a_{j-1})+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k},
which implies
a_{j+1}=\gamma_{1}a_{1}+\cdots+\gamma_{j-1}a_{j-1}+\alpha_{j+2}a_{j+2}+\cdots+\alpha_{k}a_{k}.
We assumed that a_{j+1} is a combination of all the other vectors in S, so, since this last linear combination no longer explicitly contains the vector a_{j}, we conclude that a_{j} must be a linear combination of the vectors \left\{a_{1}, \cdots, a_{j-1}\right\} with the coefficients \gamma_{i}.
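In case it helps to see where I'm starting from, here is the dependence relation behind the first step (me spelling out my own assumption, not something quoted from the textbook): since S is linearly dependent, there exist scalars \lambda_{1},\cdots,\lambda_{k}\in F, not all zero, with \lambda_{1}a_{1}+\cdots+\lambda_{k}a_{k}=0. If \lambda_{j+1}\neq 0, then dividing by \lambda_{j+1} and moving the other terms to the right-hand side gives a_{j+1}=-\frac{\lambda_{1}}{\lambda_{j+1}}a_{1}-\cdots-\frac{\lambda_{k}}{\lambda_{j+1}}a_{k} (with the a_{j+1} term omitted), which is how I got (*).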
Gee, I have the feeling I missed something big here.
P.S. The thread should be called 'Linear dependence proof' or something like that, but never mind.