# Subset linearly independent?

1. Jan 11, 2005

### EvLer

I HAVE searched the threads before posting this but I didn't find the same question.
Anyway, the question is T-F:

A subset of linearly dependent set is linearly dependent.

I think it is F, because for a non-zero linearly dependent set a dependence relation can be constructed in which some matrices have a non-zero coefficient while the other matrices have a zero coefficient.
But among those that have a zero coefficient (or if a subset is made up of some elements from the zero-coefficient part and some from the non-zero-coefficient part), there is not necessarily a linear dependence.
Is my proof/reasoning correct?

2. Jan 11, 2005

### NateTG

Can you give an example of a linearly dependent set with only one member?

3. Jan 11, 2005

### EvLer

Yeah, {0}...
So, I guess, you are leading me to the conclusion that the statement is actually true, but isn't this {0} case kind of a 'special' case?
Not every set is going to have {0} as a subset.
I am just starting a linear algebra course, and if my reasoning is off, how would you suggest looking at problems like this?
Calculus was a lot of fun, and I see Linear Algebra is different in way of approach...
Thanks.

Last edited: Jan 11, 2005
4. Jan 11, 2005

### NateTG

Actually, you need to figure out whether the question you are asking is about linearly dependent or linearly independent sets of vectors. (The title and question don't match.)

If the question is about linearly dependent sets of vectors, then obviously any non-zero vector by itself is a linearly independent subset, so the statement is false.

I can't really follow your reasoning, and I don't understand what you mean by "a matrix can have a zero coefficient". I would be inclined to say that you may have the right notion, but you need to express it more rigorously and clearly.

5. Jan 11, 2005

### EvLer

OK,
question is the same (T-F):

A subset of linearly dependent set is linearly dependent.

The reason I posted it as 'IN'dependent is that I think the subset is independent, but I'm not sure, hence the question mark.
So...hopefully more clear:

Let's say S = {A1, A2, A3, ..., An} is a set of matrices in M(n, m). A matrix C is linearly dependent on S if
C = b1A1 + b2A2 + b3A3 + ... + bnAn. (this is straight out of the book)
And let's say, for example, that b1 = 0 and b2 = 0 must hold for this to be true.

Now, let's say G = {A1, A2} (G is a subset of S). I think it is not necessarily the case that there is a V such that V is linearly dependent on G, since b1A1 = 0 and b2A2 = 0 in the relation used to show C linearly dependent on S (above).
Am I off?

Thanks again.

Last edited: Jan 11, 2005
6. Jan 11, 2005

### NateTG

I'm not quite following you.

Let's say we have a set of vectors $V$. Then $V$ is said to be linearly dependent if there is a solution to:
$$v_i=\sum_{j\ne i} a_jv_j$$

That is, if one of the vectors in $V$ can be represented as a sum of the others.

Now, consider the following set of vectors $V=\{v_1,v_2=2v_1\}$ where $v_1 \ne \vec{0}$. Clearly, this is a linearly dependent set of vectors, but $V'=\{v_1\}$ is linearly independent. Therefore the statement is false.
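To see this counterexample concretely, here is a quick numerical check (my own illustration, not from the thread). A set of vectors is linearly independent exactly when the matrix built from them has rank equal to the number of vectors; NumPy's `matrix_rank` can test this. The particular vector `[1, 2, 3]` is an arbitrary choice for $v_1$:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 3.0])    # any non-zero vector will do
v2 = 2 * v1                       # v2 is a scalar multiple of v1

V = np.stack([v1, v2])            # the set {v1, 2*v1}
# rank 1 < 2 vectors, so V is linearly dependent
print(np.linalg.matrix_rank(V))

V_prime = np.stack([v1])          # the subset {v1}
# rank 1 == 1 vector, so V' is linearly independent
print(np.linalg.matrix_rank(V_prime))
```

So the dependent set $V$ has an independent subset $V'$, which is exactly the counterexample above.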

7. Jan 11, 2005

### EvLer

Yes!!!
And that is what I have been trying to say... in a different way...
Thanks a lot.

8. Jan 11, 2005

### HallsofIvy

Staff Emeritus
You were thinking correctly with your "special case" before. Any set of vectors containing the 0 vector is dependent. Now, start with an independent set and append 0 to it. That set is dependent because it contains 0. Can you think of a subset that is not dependent?
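The construction above (append $\vec{0}$ to an independent set, then look for an independent subset) can be sketched numerically; this is my own illustration using the standard basis of $\mathbb{R}^2$ as the starting independent set:

```python
import numpy as np

# Start with an independent set {e1, e2} and append the zero vector.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
zero = np.zeros(2)

S = np.stack([e1, e2, zero])
# rank 2 < 3 vectors: {e1, e2, 0} is dependent (it contains 0)
print(np.linalg.matrix_rank(S))

T = np.stack([e1, e2])            # the subset obtained by dropping 0
# rank 2 == 2 vectors: {e1, e2} is an independent subset
print(np.linalg.matrix_rank(T))
```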

9. Jan 12, 2005

### EvLer

Yeah, I guess I can come up with examples. But what I am wondering about is what NateTG said: V' = {v1}, v1 != 0, and V' is independent.
So, if a set V = {A, 0}, where A = [5 5 5] is a matrix, how do I prove that {A} is linearly independent? For {0} I just kind of took it as a given, but I do not know how this is proved.

Thanks again!

10. Jan 13, 2005

### NateTG

Well, a somewhat better definition than the one I gave would be to say that a non-empty set of vectors $V$ is linearly independent if
$$\vec{0}=\sum_{\vec{v}_i \in V} a_i \vec{v}_i \Rightarrow a_i=0 \;\; \forall i$$
That is, the only linear combination of the vectors in the set that adds up to the zero vector is the one with every coefficient equal to zero.

Then, for your example, you could solve
[0 0 0]=a [5 5 5]
to show that $a$ must be zero.
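The equation a[5 5 5] = [0 0 0] can also be solved numerically (my own sketch): treating A as a 3x1 column, NumPy's least-squares solver finds the coefficient a minimizing ||a*A - 0||, and the only solution is a = 0, confirming that {A} is linearly independent:

```python
import numpy as np

A = np.array([5.0, 5.0, 5.0])

# Solve a * A = [0 0 0] for the scalar a, in the least-squares sense.
a, residuals, rank, _ = np.linalg.lstsq(A.reshape(3, 1), np.zeros(3), rcond=None)
print(a)  # the unique solution is a = 0, so {A} is independent
```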