Characterizing linear independence in terms of span

psie
TL;DR Summary
I'm reading Linear Algebra by Friedberg, Insel and Spence. Prior to a theorem, they make a statement about linear independence and they claim this can be deduced from the theorem.
Throughout, let ##\mathsf V## be a vector space (the concept of dimension has not been introduced yet). The statement that precedes the theorem below is that if no proper subset of ##T\subset \mathsf V## generates ##\operatorname{span}(T)## (where, if I'm not mistaken, ##T## consists of two or more vectors), then ##T## must be linearly independent. Taking the contrapositive,

Claim: If ##T\subset\mathsf V## is linearly dependent, then there is some proper subset ##S\subset T## such that ##\operatorname{span}(S)=\operatorname{span}(T)##.

Theorem: Let ##S\subset \mathsf V## be linearly independent and let ##v\notin S##. Then ##S\cup\{v\}## is linearly dependent if and only if ##v\in\operatorname{span}(S)##.

I'm trying to prove the claim from the theorem. What I struggle with is that I only seem to be able to prove the claim when ##T=S\cup\{v\}##, where ##S## is linearly independent and ##v\notin S##. Then the theorem tells us that ##v\in\operatorname{span}(S)##. This in turn implies ##\operatorname{span}(S)=\operatorname{span}(S\cup\{v\})## (see below for a proof of why this is implied by ##v\in\operatorname{span}(S)##). Hence we can take the proper subset ##S\subset S\cup\{v\}=T## as the set in the claim preceding the theorem. But what if ##T## is not of the form ##S\cup\{v\}##? I feel like I've only proved a very special case.

Regarding ##v\in\operatorname{span}(S)\implies\operatorname{span}(S)=\operatorname{span}(S\cup\{v\})##, the inclusion ##\operatorname{span}(S)\subset\operatorname{span}(S\cup\{v\})## always holds. The reverse inclusion follows since any ##w\in \operatorname{span}(S\cup\{v\})## can be written as ##w=a_1u_1+\cdots +a_nu_n+bv##, where ##u_1,\ldots,u_n\in S##. Since ##v\in\operatorname{span}(S)##, ##w## is actually a linear combination of vectors in ##S##.
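As a numerical sanity check of this lemma (my own illustration, not from the book, and it uses matrix rank, which goes beyond the tools available at this point in the text):

```python
import numpy as np

# Hypothetical example in R^3: S = {(1,0,0), (0,1,0)} is linearly
# independent and v = (2,3,0) lies in span(S). If span(S) and
# span(S ∪ {v}) coincide, the matrices whose rows are these sets
# must have the same rank.
S = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
v = np.array([2.0, 3.0, 0.0])

rank_S = np.linalg.matrix_rank(S)
rank_Sv = np.linalg.matrix_rank(np.vstack([S, v]))

# Since v is in span(S), adding v does not raise the rank.
print(rank_S, rank_Sv)  # prints "2 2"
```

Equal rank together with ##\operatorname{span}(S)\subset\operatorname{span}(S\cup\{v\})## is consistent with the two spans being equal here, matching the argument above.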
 
What definition of linear independence and linear dependence are you using?
 
PeroK said:
What definition of linear independence and linear dependence are you using?
A subset ##S## of a vector space ##\mathsf V## is linearly dependent if there exist finitely many distinct vectors ##u_1,\ldots,u_n## in ##S## and scalars ##a_1,\ldots,a_n##, not all zero, such that $$a_1u_1+\cdots+a_nu_n=0.$$ A subset ##S## of ##\mathsf V## is linearly independent if it is not linearly dependent.

It was a bit of a mess in post #1, but I think I've figured it out. :smile:
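For the record, here is the argument I had in mind for the general case; it uses only the definition above, so I'd appreciate a sanity check. If ##T## is linearly dependent, there are distinct ##u_1,\ldots,u_n\in T## and scalars ##a_1,\ldots,a_n##, not all zero, with $$a_1u_1+\cdots+a_nu_n=0.$$ Pick ##k## with ##a_k\neq 0## and solve: $$u_k=-\sum_{i\neq k}\frac{a_i}{a_k}\,u_i\in\operatorname{span}(T\setminus\{u_k\}).$$ Then ##S=T\setminus\{u_k\}## is a proper subset of ##T##, and the same substitution argument as in post #1 (replace every occurrence of ##u_k## in a linear combination by the sum above) gives ##\operatorname{span}(S)=\operatorname{span}(T)##. This avoids assuming that ##T## has the special form ##S\cup\{v\}## with ##S## linearly independent.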
 
The idea is to approach the concept of a basis as a maximal set of linearly independent vectors, without requiring that it be finite. Add one vector not already in such a set and it is no longer linearly independent, which in turn automatically yields a linear expression for the new vector in terms of basis vectors. Note that such a linear expression is always finite: there are no infinite series, since in general we have no concept of convergence. That would require a topological vector space, e.g. one equipped with a metric.
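Concretely (spelling out the finite relation): if ##B## is a maximal linearly independent set and ##v\notin B##, then ##B\cup\{v\}## is linearly dependent, so there are finitely many ##u_1,\ldots,u_n\in B## and scalars ##b,a_1,\ldots,a_n##, not all zero, with $$bv+a_1u_1+\cdots+a_nu_n=0.$$ Here ##b\neq 0##, since otherwise the ##u_i## alone would satisfy a nontrivial relation, contradicting the independence of ##B##. Hence $$v=-\sum_{i=1}^{n}\frac{a_i}{b}\,u_i,$$ a finite linear combination of vectors from ##B##.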
 