EbolaPox
Hello. I'm self-studying Linear Algebra and I'm thoroughly enjoying the subject of Vector Spaces. While reading through the text, I came upon a theorem that states
"Let S_1 and S_2 be finite subsets of a vecotr space and let S_1 be a subset of S_2.
Then
If S_1 is linearly dependent then so is S_2.
My exposure to proofs is minimal, save for what I've done in Apostol's Calculus and my Differential Equations class. I just graduated from high school and we weren't expected to do any proofs in my classes, so I decided linear algebra would be a good place to start learning. I'd just like to see whether my proof would be satisfactory. If there are any logical errors or anything else, suggestions would be great. Thank you.
Proof:
I started with the definition of linear dependence. If S = {a_1, a_2, a_3, ..., a_k} is a set of vectors in a space V, then S is said to be linearly dependent if
\sum_{i=1}^k c_i a_i = 0, where the c_i are scalars and at least one c_i is nonzero.
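(As a quick concrete check of the definition: in R^2, the set {(1, 0), (2, 0)} is linearly dependent, since 2(1, 0) + (-1)(2, 0) = (0, 0) with c_1 = 2 \neq 0.)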
So, I defined my two sets S_1 = {a_1, a_2, ..., a_k} and S_2 = {a_1, a_2, ..., a_n} with n greater than k (so that S_2 is the larger set; since S_1 is a subset of S_2, I can label the vectors of S_2 so that its first k vectors are exactly those of S_1).
So, I know that
\sum_{i=1}^k c_i a_i = 0, with at least one c_i nonzero. This is merely the statement that S_1 is linearly dependent.
Now, to test for linear dependence of S_2:
\sum_{i=1}^n c_i a_i = \sum_{i=1}^k c_i a_i + \sum_{i=k+1}^n c_i a_i. (I broke up the sum: the first set of terms runs up to k, the second from k+1 up to n.) Now, we know that the first sum from 1 to k is zero, since it is exactly the dependence relation for S_1. The sum \sum_{i=k+1}^n c_i a_i can be made zero because I can choose all c_i from k+1 to n to be zero. Therefore, S_2 is linearly dependent, thus proving the original statement.
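Written out in full, the relation I'm claiming for S_2 is c_1 a_1 + \dots + c_k a_k + 0 \cdot a_{k+1} + \dots + 0 \cdot a_n = 0, where c_1, ..., c_k are the same coefficients from the dependence relation for S_1, so at least one of them is nonzero. That means the full list of coefficients is not all zero, which is exactly the definition of linear dependence applied to S_2.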
Is this way too wordy? Is there an easier way? Is this correct? And I certainly hope that my LaTeX notation comes out right...
Also, I may have others, but I wanted to make sure I could do this basic one first.
"Let S_1 and S_2 be finite subsets of a vecotr space and let S_1 be a subset of S_2.
Then
If S_1 is linearly dependent then so is S_2.
My exposure to proofs is minimal, save for what I've done in Apostol's Calculus and my Differential Equations class. I just graduated from High School and we weren't expected to do any proofs in my classes, so I decided it would be good to start learning with this linear algebra. I'd just like to see if my proof would be satisfactory. If there are any logical errors or anything, any suggestions would be great. Thank you
Proof:
I started with the definition of Linear Dependence. If S = {a_1, a_2, a_3,... a_k} is a set of vectors in a space V, S is said to be linearly dependent if
\sum_{i=1}^k c_i a_i = 0<br /> Where c is a scalar. At least one c must not be zero.
So, I defined my two sets S_1 = {a_1 , a _2 , ... a_k} S_2 = {a_1, a_2, ...a_n} with n greater than k. (So that S_2 is a larger set).
So, I know that
\sum_{i=1}^k c_i a_i = 0. This is merely a statement that S_1 is linearly dependent.
Now, To test for linear dependence of S_2
\sum_{i=1}^n c_i a_i = \sum_{i=1}^k c_i a_i + \sum_{i=k+1}^n c_i a_i. (I broke up the sum. for the first set of terms up to k then from k+1 up to n.) Now, we know that the first sum from 1 to k is just the first set which is necessarily linearly dependent. The sum \sum_{i=k+1}^n c_i a_i can be zero because I can claim that all c_i from k+1 to n are zero. Therefore, S_2 is linearly dependent, thus proving the original statement.
Is this way too wordy, is there an easier way, is this correct, and I certainly hope that my Latex notation comes out right...
Also, I may have others, but I wanted to make sure I could do this basic one first.