Number of independent vectors ##\leq## cardinality of spanning set

SUMMARY

The discussion centers on the relationship between the number of linearly independent vectors and the cardinality of a spanning set in finite-dimensional vector spaces. It is established that if a set of vectors is linearly independent, its length cannot exceed that of any spanning set. The proof involves assuming a contradiction where the number of linearly independent vectors exceeds that of the spanning set, leading to logical inconsistencies. The conclusion drawn is that the number of linearly independent vectors must be less than or equal to the number of vectors in the spanning set.

PREREQUISITES
  • Understanding of linear independence and spanning sets in vector spaces
  • Familiarity with finite-dimensional vector space concepts
  • Knowledge of linear equations and matrix representation
  • Proficiency in proof techniques, particularly proof by contradiction
NEXT STEPS
  • Study the properties of vector spaces and linear transformations
  • Learn about the Rank-Nullity Theorem in linear algebra
  • Explore matrix representations of linear transformations
  • Investigate the concept of bases and dimension in vector spaces
USEFUL FOR

Students of linear algebra, mathematicians, and educators looking to deepen their understanding of vector space theory and its implications in higher mathematics.

Terrell

Homework Statement


In a finite-dimensional vector space, the length of every linearly independent list of vectors is less than or equal to the length of every spanning list.
It's quite long, but I hope you guys read through it. Thanks!
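As a quick numerical sanity check of the statement (not part of the proof itself): any list of three vectors in ##\Bbb{R}^2##, a space spanned by two vectors, must be linearly dependent. A minimal sketch using numpy:

```python
import numpy as np

# By the theorem, no linearly independent list in R^2 (which is spanned
# by the two standard basis vectors) can have length greater than 2.
vectors = np.array([[1.0, 2.0],
                    [3.0, 4.0],
                    [5.0, 6.0]])  # three vectors in R^2, stacked as rows

# The rank equals the length of the longest linearly independent sublist.
rank = np.linalg.matrix_rank(vectors)
print(rank)  # prints 2, so the three vectors are linearly dependent
```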

Homework Equations


N/A

The Attempt at a Solution


Let ##\{v_1, \cdots, v_m\}## be linearly independent and let ##\{u_1, \cdots, u_m\}## span ##V##. Note that ##\{v_1, \cdots, v_m\} \subseteq \operatorname{span}(u_1, \cdots, u_n)##, which implies that for all ##i=1,\cdots,m##, ##v_i=\sum_{j=1}^{n}a_{j}u_{j}##. Suppose that ##n \lt m## and let ##v_1=\sum_{i=1}^{m}a_{i}u_{i}##, ##\quad## ##v_2=\sum_{i=1}^{m}b_{i}u_{i}##, ##\quad## ##\cdots##, ##\quad## ##v_n=\sum_{i=1}^{m}\eta_{i}u_{i}##. Observe that
\begin{align}
\delta_{1}v_1+\delta_{2}v_2+\cdots +\delta_{n}v_{n}=v_k; \quad k \in \{n+1,\cdots ,m\}\\
\Longleftrightarrow \delta_{1}(\sum_{i=1}^{m}a_{i}u_{i})+\delta_{2}(\sum_{i=1}^{m}b_{i}u_{i})+\cdots +\delta_n(\sum_{i=1}^{m}\eta_{i}u_{i}) &=v_k\\
\Longleftrightarrow \delta_1(a_{1}u_{1}+\cdots +a_{m}u_{m})+\delta_2(b_{1}u_{1}+\cdots +b_{m}u_{m})+\cdots +\delta_n(\eta_{1}u_{1}+\cdots +\eta_{m}u_{m}) &=v_k\\
\Longleftrightarrow (\delta_{1}a_{1}+\cdots +\delta_{n}\eta_{1})u_{1}+\cdots +(\delta_{1}a_{m}+\cdots +\delta_{n}\eta_{m})u_{m} &=v_{k}
\end{align}
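The regrouping in the display above (collecting the coefficient of each ##u_j##) is just associativity of matrix products, and can be spot-checked numerically. A sketch with made-up random data, writing the expansion coefficients of the ##v_i## as rows of a matrix ##C##:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, dim = 3, 5, 5                 # n deltas, m spanning vectors in R^dim
U = rng.standard_normal((m, dim))   # rows are u_1, ..., u_m
C = rng.standard_normal((n, m))     # row i holds the coefficients of v_i
V = C @ U                           # v_i = sum_j C[i, j] * u_j

delta = rng.standard_normal(n)
lhs = delta @ V                     # delta_1 v_1 + ... + delta_n v_n
rhs = (delta @ C) @ U               # coefficients of each u_j collected first
assert np.allclose(lhs, rhs)        # the two groupings agree
```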
Since there exist ##\xi_{i}\in\Bbb{F}## such that ##\xi_{1}u_1+\cdots +\xi_{m}u_m=v_{k}##, by solving the following system of linear equations
\begin{align}
\delta_{1}a_{1}+\cdots +\delta_{n}\eta_{1} &= \xi_1\\
\vdots\\
\delta_{1}a_{m}+\cdots +\delta_{n}\eta_{m} &= \xi_{m}
\end{align}
we obtain the necessary ##\delta_i##'s so that ##\sum_{i=1}^{n}\delta_{i}v_{i}=v_k##. Since ##n \neq m##, the system could have either non-unique solutions or no solution. We consider the two scenarios.
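Whether such a system has a solution can be read off from ranks: it is solvable exactly when augmenting the coefficient matrix with the right-hand side does not raise the rank (Rouché–Capelli). A small sketch with made-up numbers:

```python
import numpy as np

# System M @ delta = xi with m equations and n < m unknowns (made-up data).
M = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                 # m = 3 equations, n = 2 unknowns
xi_consistent = np.array([2.0, 3.0, 5.0])  # lies in the column space of M
xi_inconsistent = np.array([2.0, 3.0, 0.0])  # does not

def has_solution(M, xi):
    # Solvable iff appending xi as a column does not increase the rank.
    augmented = np.column_stack([M, xi])
    return np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(M)

print(has_solution(M, xi_consistent))    # True
print(has_solution(M, xi_inconsistent))  # False
```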
Case 1: infinitely many solutions. By the definition of a basis, ##\{v_1,\cdots ,v_n\}## cannot form a basis, because a basis requires that ##v_k## be expressible uniquely with respect to the set of vectors ##\{v_1,\cdots ,v_n\}##. But ##\operatorname{span}(v_1, \cdots, v_n) \neq V##, so it must be that ##\{v_1, \cdots , v_n\}## is not linearly independent. We have a contradiction.
Case 2: no solution. Then ##v_k \notin \operatorname{span}(v_1, \cdots, v_n)##, which implies that ##\delta_{1}v_1 + \delta_{2}v_2 + \cdots + \delta_{n}v_n + \delta_{n+1}v_k = 0## if and only if all the ##\delta_i##'s equal zero. Equivalently,
\begin{align}
& \Longleftrightarrow (\delta_{1}a_{1}+\cdots +\delta_{n}\eta_{1}+\delta_{n+1}\xi_{1})u_{1}+\cdots +(\delta_{1}a_{m}+\cdots +\delta_{n}\eta_{m}+\delta_{n+1}\xi_{m})u_{m} = 0\\
& \Longleftrightarrow \forall i=1,\cdots, m \quad \delta_{1}a_i+\cdots +\delta_{n}\eta_i+\delta_{n+1}\xi_i =0
\end{align}
But if we let ##\delta_{n+1}=\frac{-(\delta_{1}a_i+ \cdots +\delta_{n}\eta_i)}{\xi_i}## for all ##i=1, \cdots, m##, then there exists ##\delta_i \neq 0## such that ##(\sum_{i=1}^{n}\delta_{i}v_i)+\delta_{n+1}v_k=0##. This implies the set of vectors ##\{v_1, v_2, \cdots, v_n, v_k\}## is not linearly independent. We have another contradiction.
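The nontrivial relation sought in Case 2 can also be produced numerically: any ##n+1## vectors lying in the span of ##n## vectors admit a nonzero vanishing combination, which can be recovered from the SVD. A sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
n, dim = 2, 4
U = rng.standard_normal((n, dim))        # span of n = 2 vectors
W = rng.standard_normal((n + 1, n)) @ U  # n + 1 vectors inside that span

# A nonzero delta with delta @ W = 0 lies in the null space of W^T, i.e.
# among the right singular vectors of W^T with zero singular value.
_, s, vt = np.linalg.svd(W.T)
delta = vt[-1]                           # direction of the smallest singular value
assert np.allclose(delta @ W, 0.0)       # nontrivial dependence relation
assert not np.allclose(delta, 0.0)       # delta is a unit vector, not zero
```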

##\therefore## ##m \leq n##.
 
It might be easier to assume that you have a set of m linearly independent vectors and a spanning set with n vectors, with m > n. If you get a contradiction (which should be easier to reach than what you did), you're done.

Also, it looks like you have a typo in the first line of your proof. The two sets -- the linearly independent vectors and the spanning set -- shouldn't have the same final index of m.
 
Mark44 said:
it looks like you have a typo in the first line of your proof.
Yes. It should be ##\{u_1,\cdots, u_n\}##
Mark44 said:
It might be easier to assume that you have a set of m lin. independent vectors and a spanning set with n vectors, with m > n.
I believe this is what I did. At line 2, my third sentence starts with suppose ##m \lt n##. Then I got a contradiction in the end. But as you've probably already noticed, my proof seems more complicated than it should be. So I have a feeling that I might have made an incorrect use of contradiction.
 
Terrell said:
Yes. It should be ##\{u_1,\cdots, u_n\}##

I believe this is what I did. At line 2, my third sentence starts with suppose ##m \lt n##.
Actually, what you wrote is "Suppose that n < m...", which is correct for a proof by contradiction if your spanning set is ##\{u_1, u_2, \dots, u_n \}##.
Terrell said:
Then I got a contradiction in the end. But as you've probably already noticed, my proof seems more complicated than it should be. So I have a feeling that I might have made an incorrect use of contradiction.
Yes, your proof seems more complicated than it needs to be, in part because of the use of lots more letters than are needed.

For example, instead of having coefficients ##a_i, b_i,## and ##\eta_i##, just use two subscripts. So ##v_i = \sum_{j = 1}^n a_{ij}u_j##. If you write the equations for ##v_1, v_2, \dots, v_m## in the form of a matrix product, you have ##\vec v = A \vec u##, where ##A## is a matrix with m rows and n columns.

I think you can assume without loss of generality that the u vectors are a minimal spanning set; i.e., a basis for the space. Since you need to prove the given statement for every spanning list, it doesn't matter if you prove it for the smallest spanning set.
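In that matrix form, the theorem becomes a rank inequality: ##\operatorname{rank}(A U) \leq \min(m, n) = n \lt m##, so the ##m## rows of ##A U## cannot be linearly independent. A quick numerical illustration with made-up data:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, dim = 4, 3, 6
A = rng.standard_normal((m, n))    # coefficient matrix: m rows, n columns, m > n
U = rng.standard_normal((n, dim))  # rows are the spanning vectors u_j
V = A @ U                          # rows are the vectors v_i

# rank(A U) <= min(rank(A), rank(U)) <= n < m, so the m rows of V
# cannot all be linearly independent.
assert np.linalg.matrix_rank(V) <= n < m
```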
 
