# Determining Subspaces

1. Feb 5, 2012

### TranscendArcu

1. The problem statement, all variables and given/known data

3. The attempt at a solution
Let $(x,y,z)$ be arbitrary. We write, $(x,y,z) = a(1,0,1) + b(0,1,0) + c(0,1,1)$ for $a,b,c \in R$. From this,

$(x,y,z) = (a,0,a) + (0,b,0) + (0,c,c) = (a,b+c,a+c)$. Matching coordinates gives $a = x$, $c = z - x$, and $b = y - z + x$, so $(a,b+c,a+c)$ can produce any vector of $R^3$ for appropriately chosen $a,b,c$. Thus, the subspace in question is all of $R^3$.

Am I doing this right?

2. Feb 5, 2012

### tiny-tim

Hi TranscendArcu!

Yes, that's fine.

(another way would be to say that (0,0,1) is the difference of the last two, and (1,0,0) is the difference of the first one and that, so the span contains the standard basis;

yet another way would be to show that the determinant of the matrix whose rows are the three vectors is non-zero )
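the determinant check is easy to carry out by machine, too; a quick sketch, assuming NumPy is available:

```python
import numpy as np

# Rows are the three spanning vectors from the problem.
A = np.array([[1, 0, 1],
              [0, 1, 0],
              [0, 1, 1]])

# A non-zero determinant means the rows are linearly
# independent, hence span all of R^3.
print(np.linalg.det(A))  # approximately 1.0, non-zero
```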

3. Feb 5, 2012

### Deveno

another approach is to show that {(1,0,1),(0,1,0),(0,1,1)} is a linearly independent set which would mean that span({(1,0,1),(0,1,0),(0,1,1)}) is a 3-dimensional subspace of R3, hence must be all of R3.

an arbitrary linear combination is, as you noted:

(a,b+c,a+c). if (a,b+c,a+c) = (0,0,0), then:

a = 0
b+c = 0
a+c = 0

a = 0 means that a+c = 0+c, so c = 0, as well. thus b+c = b+0, so b = 0. so if:

(a,b+c,a+c) = a(1,0,1) + b(0,1,0) + c(0,1,1) = (0,0,0), a = b = c = 0, proving linear independence.

it's a good idea to keep all the methods suggested here in mind, because sometimes one way is easier than the others, in terms of ease of calculation.
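the spanning claim can also be checked concretely by solving for a, b, c given an arbitrary target vector; a small sketch (NumPy assumed, and the target (4, -2, 7) is just an arbitrary pick):

```python
import numpy as np

# Columns are the spanning vectors, so M @ [a, b, c] equals
# a*(1,0,1) + b*(0,1,0) + c*(0,1,1).
M = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

target = np.array([4.0, -2.0, 7.0])   # an arbitrary (x, y, z)
coeffs = np.linalg.solve(M, target)   # unique solution since M is invertible

print(coeffs)       # the a, b, c that reproduce the target
print(M @ coeffs)   # recovers the target vector
```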

4. Feb 5, 2012

### TranscendArcu

Does that mean that if V and W are vector spaces with W a subspace of V, and dimV = dimW, and {A1,...,An} is a basis for W, then span{A1,...,An} = W = V?

5. Feb 5, 2012

### TranscendArcu

This is another problem I'd like to have my work checked on:

If U is a subspace, then certainly it must have a basis consisting of vectors in V. Let $E = \left\{ V_1,...,V_n \right\}$ be a basis for U. Thus, Span(E) = U. Thus,

$U + U = span(E) + span(E) = (a_1 V_1 + ... + a_n V_n) + (b_1 V_1 + ... + b_n V_n) = (a_1 + b_1)V_1 + ... + (a_n + b_n) V_n$, which is still in span(E) = U. (Here $a_1,...,a_n,b_1,...,b_n \in R$.)

QED?

6. Feb 5, 2012

### tiny-tim

Yes of course (to your post #4).
This (post #5) is very complicated :yuck:

Just prove it from the basic definitions of + and vector subspace.

7. Feb 5, 2012

### TranscendArcu

Should I just say: let A be an arbitrary vector in U. Then $A + A \in U$ since U is a subspace and closed under addition. Thus, U + U = U.

8. Feb 5, 2012

### tiny-tim

But {A+A} doesn't give you every element of U + U, does it?

9. Feb 5, 2012

### Deveno

U + U = {u+w: u in U, w in U}.

what i would be tempted to do is show that U and U+U contain each other.

showing U is contained in U+U should be easy (hint: what vector is U guaranteed to have?).

what you have above is only half the story.
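the two containments can be illustrated numerically for a concrete U, say U = span{(1,0,0),(0,1,1)} in R^3 (a choice made up for illustration); membership in U is tested by checking that appending a vector to the basis does not raise the rank:

```python
import numpy as np

# Rows span a concrete subspace U of R^3 (illustrative choice).
basis = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 1.0]])

def in_U(v):
    # v lies in U iff stacking it under the basis leaves the rank unchanged
    return (np.linalg.matrix_rank(np.vstack([basis, v]))
            == np.linalg.matrix_rank(basis))

u = 2.0 * basis[0] - 3.0 * basis[1]   # an element of U
w = 0.5 * basis[0] + 4.0 * basis[1]   # another element of U

print(in_U(u + w))  # U + U is contained in U: sums of elements stay in U
print(in_U(u))      # U is contained in U + U: u = u + 0, and 0 is in U
```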

10. Feb 5, 2012

### TranscendArcu

Do I need something more like A + B, where A≠B and $A,B \in U$? If so, then do I write, $A + B \in U$ since U is closed, and thus U + U = U? If the problem had stated, U + U + U = U, would I need A + B + C, where A≠B≠C and $A,B,C \in U$?

11. Feb 5, 2012

### TranscendArcu

So, if $A \in U$ then $A \in U + U$ since $\vec0 \in U$ and $A + \vec0 = A$?

12. Feb 5, 2012

### tiny-tim

yes, and yes (to your post # 10)

(but you shouldn't say A≠B, since of course they may be the same! )

13. Feb 5, 2012

### TranscendArcu

Okay, I'll make those adjustments in my work. Thanks! Here's another problem that I'm working:

Let $(a_1,0,0),(a_2,0,0) \in S, x,y \in R$. Then $x(a_1,0,0) + y(a_2,0,0) = (a_1 x,0,0) + (a_2 y,0,0) = (a_1 x +a_2 y,0,0) \in S$. Thus, S is a subspace. Let $(0,b_1,b_2),(0,q_1,q_2) \in T$. Then $x(0,b_1,b_2)+ y(0,q_1,q_2) = (0,x b_1,x b_2) + (0,y q_1,y q_2) = (0,x b_1 + y q_1,x b_2 + y q_2) \in T$. Thus T is a subspace.

I said that S+T will be vectors of the form $\left\{(a,b,b) | a,b \in R \right\}$. Thus, a basis for S+T will be $\left\{(1,0,0),(0,1,1) \right\}$. Is that right?

14. Feb 5, 2012

### Deveno

it appears that in T, the second and third coordinates are always the same. your proof does not take that into account, and so is incorrect.

can you prove that your proposed basis for S+T actually is one? it's not a difficult task, but i see no demonstration of linear independence OR spanning (to be fair, this is not the most challenging example of a sum subspace, but it's a good habit to get into, because you WILL encounter subspaces S+T that aren't "as nice" in the future).

there is another problem (albeit a minor one). you say:

"let (a1,0,0), (a2,0,0) be in S". when you do something like that, it's a good idea to verify that you actually have some things in S. traditionally, this is often done by showing that (0,0,0) lies in S (if it doesn't, your "subspace" doesn't satisfy the vector space axioms).

15. Feb 5, 2012

### TranscendArcu

Okay let me try this again.

Let $(0,b,b),(0,q,q) \in T$. We confirm that T is nonempty by observing that $(0,0,0) \in T$ (take b = 0). Then $x(0,b,b)+ y(0,q,q) = (0,x b,x b) + (0,y q,y q) = (0,x b + y q,x b + y q) \in T$. Thus T is a subspace.

To prove that $\left\{(1,0,0),(0,1,1) \right\}$ is a basis, we will show first that it is linearly independent. Thus,

$(0,0,0) = a(1,0,0) + b(0,1,1)$ gives 0 = a and 0 = b, so the set is linearly independent. Now let (x,y,y) be arbitrary in S+T. Then $(x,y,y) = a(1,0,0) + b(0,1,1)$ is solved by a = x and b = y, which shows that the set spans.
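both halves of the basis check can be done by machine as a sanity check; a sketch with NumPy, using the proposed basis for S+T (the sample x, y values are arbitrary):

```python
import numpy as np

# Proposed basis for S + T, as rows.
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0]])

# Rank equal to the number of vectors means linear independence.
print(np.linalg.matrix_rank(B))  # 2

# Spanning: an arbitrary (x, y, y) in S + T equals x*(1,0,0) + y*(0,1,1).
x, y = 3.0, -1.5
v = np.array([x, y, y])
print(np.allclose(x * B[0] + y * B[1], v))  # True
```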

16. Feb 5, 2012

### Deveno

exactly so.

now, in this exercise, those things are obvious, and in an informal discussion, wouldn't need any proof at all. with homework...it depends on how picky your instructor is.

but later on, sometimes the "obvious" things aren't so obvious anymore. that's when having "good habits" can at least get you started in the right direction.

17. Feb 5, 2012

### TranscendArcu

This is another problem I'd like checked:

Suppose T is injective. Since $A_1,...,A_n$ is a basis, it is linearly independent. That is, $0A_1+...+0A_n = \vec0$. Thus, $0T(A_1)+...+0T(A_n) = T( \vec0 _w)$. If T is injective, then $KerT = \left\{ 0 \right\}$, which states that the only vector to map to $\vec0 _w$ is $\vec0 _v$. Since the only way to create $\vec0 _w$ from the basis of V is with all zero coefficients, we see that $T(A_1),...,T(A_n)$ is linearly independent.

Suppose that $T(A_1),...,T(A_n)$ are linearly independent. Then $0T(A_1)+...+0T(A_n) = T( \vec0 _w )$. This implies $T(0A_1 + ... 0A_n) = T(\vec0 _v)=\vec0 _w$. Since any change to the coefficients will result in a nonzero element of V going under T, we see that all coefficients must necessarily be zero in order for $T(\vec 0 _v) =\vec0 _w$. Thus, the only thing that creates $\vec0 _w$ is $\vec0 _v$, which implies $kerT = \left\{ 0 \right\}$ and T is injective.

18. Feb 5, 2012

### Deveno

i don't mean to be rude (forgive me if i appear that way, my social skills have deteriorated after decades in isolation), but it occurs to me, you might run afoul of the moderators here by "extending" a thread with multiple questions. i've seen some of them request other posters to start new threads for new topics.

just sayin'

**********

your definition of linear independence doesn't look right. ANY linear combination of a set with 0's as every coefficient will sum to the 0-vector, whether the set is linearly independent or not.

if {a1,...,an} is linearly independent, this means that

c1a1+...+cnan= 0, forces every ci to be 0.

19. Feb 5, 2012

### TranscendArcu

Oh dang. I thought it would be better to keep all of my questions consolidated in one place (both for myself and for others). I can start creating new topics for my questions if that's best. I hope that we can at least address this last question in this thread, though.

Okay. But I don't immediately see how this affects the work I have above. In particular, I thought I addressed the need for all the coefficients to be zero with phrases like "Since the only way to create $\vec0 _w$ from the basis of V is with all zero coefficients" and "Since any change to the coefficients will result in a nonzero element of V going under T, we see that all coefficients must necessarily be zero in order for $T(\vec 0 _v) =\vec0 _w$."

20. Feb 5, 2012

### Deveno

what you are GIVEN is that:

{a1,...,an} is a linearly independent set (because it's a basis). what you have to PROVE is that {T(a1),...,T(an)} is ALSO a linearly independent set, from the fact that T is injective.

now you're good with stating that T injective → ker(T) = {0}

so if c1T(a1)+...+cnT(an) = 0, then

(because T is LINEAR)

T(c1a1+..+cnan) = 0

hence, c1a1+..+cnan is in _______ .

but because T is injective, this means that....? (you want something about what we're taking T OF).

now, what does this tell you about the "c's"? why?

then conclude that:_________ .

in the other direction: 0T(a1) +...+ 0T(an) is ALWAYS going to be 0.

you want to show that a linear combination that sums to 0 has to be the 0-combination, not that a 0-combination is 0 (which is always true).

adding something like "since the only way to get 0 is with all 0 coefficients" is putting the cart before the horse. when proving something, start with what you know to be true, and derive what you want to prove.

*******

i would prove the injectivity of T like this:

since {T(a1),...,T(an)} is _____ _______, if

T(c1a1+...+cnan) = _____,

then (by the linearity of T):

(something goes here).

so that _____ = _____ (the only thing you have to use is the linear independence of a certain set, so this should appear in your proof somewhere).

your goal is to show that ker(T) = 0. so you're going to have to take T of something, just to display a typical element of ker(T). then you need to use your conditions given to show that the only thing that something can be, is the 0-vector of V.
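the equivalence being proved (T injective iff the images of a basis are linearly independent) can be sanity-checked numerically for maps given by matrices; a sketch, where both example matrices are illustrative choices, not from the problem:

```python
import numpy as np

# T: R^2 -> R^3 with T(v) = A @ v.  The columns of A are the
# images T(e1), T(e2) of the standard basis.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# T is injective iff ker(A) = {0}, i.e. rank(A) equals the dimension
# of the domain.
injective = np.linalg.matrix_rank(A) == A.shape[1]

# The basis images (the columns) are linearly independent iff their
# rank equals their number -- the same rank condition, as expected.
images_independent = np.linalg.matrix_rank(A.T) == A.shape[1]
print(injective, images_independent)  # both True for this A

# A non-injective contrast: proportional columns, rank 1.
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
print(np.linalg.matrix_rank(B) == B.shape[1])  # False: images are dependent
```

that both sides reduce to the same rank condition is exactly the content of the equivalence, phrased in matrix language.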