How to prove the linear independence of an infinite set

Summary
The discussion centers on the definition of linear independence in the context of infinite sets within vector spaces. It highlights the confusion surrounding the application of traditional definitions when extending to infinite sets, particularly regarding the countability of elements. The clarification provided states that an infinite set is considered linearly independent if all its finite subsets are linearly independent. Additionally, it notes that infinite sums are generally undefined in vector spaces unless there is additional structure, such as topology, that allows for convergence. Understanding these nuances is essential for accurately assessing linear independence in infinite contexts.
julypraise
First, let me give some definitions to clarify the problem:

Definition (Roman, Linear Algebra)
A nonempty subset S of a vector space V over a field F is linearly independent if for any distinct vectors s_1, \dots, s_n \in S and any scalars a_1, \dots, a_n \in F, if a_1 s_1 + \cdots + a_n s_n = 0 then a_1 = \cdots = a_n = 0.

But by this definition, I don't think I can handle an infinite set S (though perhaps you can!). When S is infinite, it seems the n above should be infinite too, and then everything becomes confusing to me. Can n actually be infinite? Isn't that an error from the point of view of first-order logic? Shouldn't the definition be modified to be more rigorous, and better suited to dealing with the set S when it is infinite? And isn't there a problem with the countability of n when S is infinite?

I can't even properly state what it is that I don't know. Please give me your helpful comments. Thanks.
 
n is finite even when S isn't. The definition of "linearly independent" is better stated like this: A set S\subset V is said to be linearly independent if for all n\in\mathbb N, all s_1,\dots,s_n\in S and all a_1,\dots,a_n\in F, a_1 s_1+\cdots+a_n s_n=0\ \Rightarrow\ a_1=\cdots=a_n=0.
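
As an illustration of this definition (not from the thread itself, just a standard example): in the vector space F[x] of polynomials over F, the infinite set S = \{1, x, x^2, x^3, \dots\} is linearly independent. For any distinct monomials x^{k_1}, \dots, x^{k_n} \in S and scalars a_1, \dots, a_n \in F, the equation a_1 x^{k_1} + \cdots + a_n x^{k_n} = 0 (the zero polynomial) forces a_1 = \cdots = a_n = 0, since a polynomial is zero only when all of its coefficients are zero. Note that each test involves only finitely many vectors, even though S itself is infinite.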
 
If one knows linear independence for finite sets, then one can easily extend it to infinite sets:

An infinite set is linearly independent if all its finite subsets are linearly independent.
 
One can simply say a set (infinite or finite) is linearly independent if every finite subset is linearly independent. When the set in question is finite, this of course means we must test the entire set for linear independence.

(Why just finite subsets? This has to do with the fact that in a general vector space, infinite sums are undefined. However, if we have "extra structure" on a vector space, such as a topology, then we can define, for example, a notion of convergence of infinite sums, in which case it becomes meaningful to use infinite sums when testing sets for linear independence. And such structures do exist.)
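
To illustrate the "extra structure" point with a standard example (again, not part of the original posts): in the Hilbert space \ell^2, the vectors e_1, e_2, \dots, where e_n has a 1 in the n-th coordinate and 0 elsewhere, form a linearly independent set in the algebraic sense above, since every finite subset is linearly independent. The norm topology of \ell^2 additionally lets infinite sums such as \sum_{n=1}^{\infty} a_n e_n converge whenever \sum_{n=1}^{\infty} |a_n|^2 < \infty, which is what makes notions like a Schauder basis possible; without such a topology, only the finite sums in the definition above are meaningful.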
 