Re: Intersection of all subspaces of V is the empty set
I like Serena said:
Either way, linear dependence has to be the opposite of linear independence.
So I guess you're proposing to switch the definitions around?
That could work for me, especially since the current wiki definition doesn't have a reference.
Still, we do need to get to a definition that is consistent and complete.
I like Serena said:
As I see it, either we accept $\{0\}$ as a basis and accept it as being a linearly independent set, which also keeps span consistent.
Or we modify basis to mean that it only has non-zero vectors, modify linear span to always include the zero vector, modify linear independence to be as you suggested, and make linear dependence its opposite.
Both would fix the inconsistencies, wouldn't they? (Wondering)
I don't think there are currently any inconsistencies in the standard literature definitions of "linear independence", "linear dependence", "span", or "basis", nor do I think any modifications are required. Given a vector space $V$ over $\mathbb{K} = \mathbb{R}$ or $\mathbb{K} = \mathbb{C}$,
1. The vectors $\mathbf{x}_1,\ldots,\mathbf{x}_n$ in $V$ are defined to be linearly independent if the equation
\[
c_1\mathbf{x}_1 + \cdots + c_n\mathbf{x}_n = \mathbf{0} \qquad (\ast)
\]
only has the trivial solution $c_1 = \cdots = c_n = 0$. (This is what I wrote in post #4.)
2. The above vectors are defined to be linearly dependent if at least one nontrivial solution $(c_1,\ldots,c_n) \in \mathbb{K}^n$ of $(\ast)$ exists.
So, the above standard definitions are indeed complementary, as the worked example below illustrates.
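For a concrete check (a standard textbook example, not specific to this thread): in $\mathbb{R}^2$, the vectors $(1,0)^T$ and $(0,1)^T$ are linearly independent, since
\[
c_1(1,0)^T + c_2(0,1)^T = (c_1, c_2)^T = (0,0)^T
\]
forces $c_1 = c_2 = 0$. By contrast, $(1,0)^T$ and $(2,0)^T$ are linearly dependent: $2\cdot(1,0)^T + (-1)\cdot(2,0)^T = (0,0)^T$ is a nontrivial solution of $(\ast)$.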
If the above vectors $\mathbf{x}_1,\ldots,\mathbf{x}_n$ are linearly independent and they span $V$, then by definition they form a basis for $V$.
I think that, as far as the definitions are concerned, there is nothing more to it. In particular, I do not know of any reference that regards $\mathbf{0}$ by itself as a linearly independent vector. It would contradict the above definitions. (Moreover, though less importantly, it would mess up a lot of results. For example, "A square matrix is invertible if and only if its columns are linearly independent" would no longer be true: the $1 \times 1$ zero matrix would be a counterexample.)
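To make the contradiction explicit: taking $n = 1$ and $\mathbf{x}_1 = \mathbf{0}$ in $(\ast)$ gives
\[
c_1\mathbf{0} = \mathbf{0},
\]
which holds for every $c_1 \in \mathbb{K}$. In particular, $c_1 = 1$ is a nontrivial solution, so by definition 2 the set $\{\mathbf{0}\}$ is linearly dependent.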
Ackbach said:
Wouldn't any linearly independent set be a basis for $\{0\}$? Suppose your linearly independent set is $\{x_1,x_2,\dots,x_n\}$. Then the linear combination $0\cdot x_1+0\cdot x_2+\cdots+0\cdot x_n=0$ holds, and thus the set spans the space.
No, the linearly independent set has to be a subset of the vector space for which it is going to be a basis. (Otherwise, one gets strange things: the vectors $(1,0)^T$ and $(0,1)^T$ in $\mathbb{R}^2$ would form a basis for the trivial subspace $\{(0,0,0)^T\}$ of $\mathbb{R}^3$.)
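Note also that exhibiting one linear combination equal to $\mathbf{0}$ only shows that $\mathbf{0}$ lies in the span. Being a spanning set for $\{\mathbf{0}\}$ requires
\[
\operatorname{span}\{\mathbf{x}_1,\ldots,\mathbf{x}_n\} = \{\mathbf{0}\},
\]
which fails as soon as some $\mathbf{x}_i \neq \mathbf{0}$, since the span then contains that nonzero vector.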