# Linear Algebra Proofs

1. May 30, 2006

### EbolaPox

Hello. I'm self-studying Linear Algebra and I'm thoroughly enjoying the subject of Vector Spaces. While reading through the text, I came upon a theorem that states
"Let $$S_1$$ and $$S_2$$ be finite subsets of a vector space and let $$S_1$$ be a subset of $$S_2$$. Then: if $$S_1$$ is linearly dependent, then so is $$S_2$$."

My exposure to proofs is minimal, save for what I've done in Apostol's Calculus and my differential equations class. I just graduated from high school, and we weren't expected to do any proofs in my classes, so I decided it would be good to start learning with this linear algebra text. I'd just like to see if my proof would be satisfactory. If there are any logical errors or anything, any suggestions would be great. Thank you.

Proof:
I started with the definition of linear dependence. If $S = \{a_1, a_2, a_3, \dots, a_k\}$ is a set of vectors in a space $V$, then $S$ is said to be linearly dependent if
$\sum_{i=1}^k c_i a_i = 0,$ where each $c$ is a scalar. At least one $c$ must not be zero.

So, I defined my two sets $$S_1 = \{a_1, a_2, \dots, a_k\}$$ and $$S_2 = \{a_1, a_2, \dots, a_n\}$$ with $n$ greater than $k$ (so that $S_2$ is the larger set).

So, I know that
$\sum_{i=1}^k c_i a_i = 0.$ This is merely a statement that S_1 is linearly dependent.
Now, to test for linear dependence of $S_2$:

$\sum_{i=1}^n c_i a_i = \sum_{i=1}^k c_i a_i + \sum_{i=k+1}^n c_i a_i$. (I broke up the sum: the first set of terms runs up to $k$, then the rest from $k+1$ up to $n$.) Now, we know that the first sum, from 1 to $k$, is just the first set, which is necessarily linearly dependent. The sum $$\sum_{i=k+1}^n c_i a_i$$ can be zero because I can claim that all $c_i$ from $k+1$ to $n$ are zero. Therefore, $S_2$ is linearly dependent, thus proving the original statement.
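The argument above can be condensed into a few lines (a sketch in the same notation; the key step is extending the scalars by zeros):

```latex
% Given: S_1 = \{a_1, \dots, a_k\} is linearly dependent, so there are
% scalars c_1, \dots, c_k, not all zero, with \sum_{i=1}^{k} c_i a_i = 0.
% Choose c_i = 0 for k+1 \le i \le n. Then
\sum_{i=1}^{n} c_i a_i
  \;=\; \sum_{i=1}^{k} c_i a_i \;+\; \sum_{i=k+1}^{n} 0 \cdot a_i
  \;=\; 0 + 0 \;=\; 0,
% and at least one of c_1, \dots, c_k is nonzero,
% so S_2 is linearly dependent.
```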

Is this way too wordy? Is there an easier way? Is it correct? And I certainly hope that my LaTeX notation comes out right...
Also, I may have others, but I wanted to make sure I could do this basic one first.

Last edited: May 30, 2006
2. May 30, 2006

### Hurkyl

Staff Emeritus
Some little corrections -- I'm not sure if they're conceptual errors, or just by-products of inexperience. (probably the latter) But otherwise, you seem to have the right idea.

Not quite -- $S$ is linearly dependent if there exist scalars $c_i \quad (1 \leq i \leq k)$, at least one of them nonzero, such that $\sum_{i=1}^k c_i a_i = 0$.
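A concrete instance of that definition (an illustrative example added here, not from the thread or the book):

```latex
% In \mathbb{R}^2, the set S = \{(1,0),\ (0,1),\ (1,1)\} is linearly
% dependent, because
1 \cdot (1,0) \;+\; 1 \cdot (0,1) \;+\; (-1) \cdot (1,1) \;=\; (0,0),
% i.e. c_1 = 1, c_2 = 1, c_3 = -1, and not all c_i are zero.
```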

You can't do that! In order to prove this theorem, you have to treat $S_1$ and $S_2$ as givens: you don't get to define them.

But when you're given $S_1$ and $S_2$, what you can do is to define the $a_i$, k, and n so that the quoted passage holds.

And, BTW, you can't assume n>k; your two sets might actually be equal, so that n=k!

You're not claiming that they are zero -- you are choosing them to be zero.

Looks good. One trick is to use the [ itex ] tag instead of [ tex ] for expressions in paragraphs: it renders them smaller so it typesets more nicely.

Last edited: May 30, 2006
3. May 30, 2006

### EbolaPox

Thank you very much for your reply. Yes, with respect to the statement for linear dependence, that was an error on my part. Your way of phrasing it is superior.

What I'm wondering about is what I did with my sets. Since I was told that
$$S_1 \subset S_2$$, I rather assumed that $S_1$ was a proper subset of $S_2$ and thus not equal. I thought about the possibility of $$S_1 = S_2$$, but wasn't sure whether that was allowed. It's good to know that I should consider $n \geq k$. However, when I defined my sets, I'm not too sure what I did that's incorrect there. What should I have done instead of defining them like I did? What you say makes sense: I should only use what is given.

Thank you very much for your suggestions : )

4. May 30, 2006

### Hurkyl

Staff Emeritus
Some authors use $\subset$ for proper subsets, and $\subseteq$ for any subset.

Some authors use $\subsetneq$ for proper subsets, and $\subset$ for any subset.

It's irritating that there are two different conventions. I guess your Linear Algebra book is using the latter, but I really prefer the former.

Happily, some authors are absolutely clear and use $\subsetneq$ for proper subsets, and $\subseteq$ for any subset.

What you wanted to be doing there is defining the symbols $a_i$ to be the elements of the two sets you're given. I guess it might be a little confusing that:

"Let S = {a, b, c}"

is used in at least three different manners:

(1) You are given a set S of three elements, and this expression is defining a, b, and c to be equal to the three elements of S.

(2) You are given a, b, and c, and this expression is defining the set S to be the one containing a, b, and c.

(3) You are defining S, a, b, and c.

but you can work it out from context: that sentence is defining whatever has not yet been defined in terms of what has already been defined!

I probably wouldn't have said anything about that particular passage if you hadn't specifically said you were using that expression to define your two sets.

Last edited: May 30, 2006
5. May 30, 2006

### EbolaPox

In the definition of linear dependence, the author of the text I'm reading (Linear Algebra, Bernard Kolman, 1970, second edition) didn't use any subset notation or sigma summation notation. That's also one of the reasons I wanted to see if my proof was correct: I wanted to make sure I used my summation notation right. The author introduces the concept of a proper subset at the beginning, but did not show a symbol for it. I saw in Apostol that $$\subset$$ generally means proper subset. I've never seen the latter two. It's good to be aware of these notational differences, thanks!

Also: would it be wise to invest in a Set Theory book? The only sets I've really worked with have been what I've read in Apostol and the intro chapter to this Linear Algebra text.