# Shilov's Linear Algebra Subspace Question

1. Mar 7, 2014

### Chacabucogod

Hi,

I'm reading Shilov's linear algebra, and in Section 2.44 he talks about linearly independent vectors in a subspace L which is a subset of a space K (he refers to it as K over L). I don't understand why he says that a linear combination of vectors of the subspace L and vectors independent over L is itself independent. Is it the same subspace, or am I wrong? Shilov also says that the dimension of the subspace K over L is $n-l$. Why?

2. Mar 7, 2014

### micromass

Staff Emeritus
He proves it in the text. He takes a linearly independent set $\{f_1,...,f_l\}\subseteq L$ and a set $\{g_1,...,g_k\}\subseteq K$ whose elements are linearly independent over $L$. Then Shilov explicitly shows that

$$\alpha_1 f_1+...+\alpha_lf_l +\beta_1 g_1 +... + \beta_k g_k = 0$$

implies that $\alpha_1 = ... = \alpha_l = \beta_1 = ...=\beta_k = 0$, which is exactly linear independence in $K$.

He proves this explicitly in the book. What about the proof is unclear?
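Spelled out, the chain of implications in that proof (as I read it) runs as follows:

```latex
% Suppose \alpha_1 f_1+\dots+\alpha_l f_l+\beta_1 g_1+\dots+\beta_k g_k = 0. Then
\beta_1 g_1+\dots+\beta_k g_k = -(\alpha_1 f_1+\dots+\alpha_l f_l) \in L
  \implies \beta_1 = \dots = \beta_k = 0
  \quad\text{(the $g_j$ are independent over $L$),}
% after which the original equation reduces to
\alpha_1 f_1+\dots+\alpha_l f_l = 0
  \implies \alpha_1 = \dots = \alpha_l = 0
  \quad\text{(the $f_i$ are independent in $L$).}
```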

3. Mar 7, 2014

### Chacabucogod

So both f and g are bases of the subspace?

4. Mar 7, 2014

### micromass

Staff Emeritus
The set $\{f_1,...,f_l\}$ is a linearly independent set in $L$. And $L$ is assumed to have dimension $l$, so $\{f_1,...,f_l\}$ is indeed a basis.

The $\{g_1,...,g_k\}$ do not (in general) form a basis of anything.

However, we do know that if $k = n-l$ (where $K$ has been assumed to have dimension $n$), then $\{f_1,...,f_l,g_1,...,g_k\}$ is a basis of $K$.

5. Mar 7, 2014

### Chacabucogod

I still don't get it :(. Are the g's just random vectors, or what? Because he says that the vectors $g_1,...,g_k$ are part of $L$.

6. Mar 7, 2014

### micromass

Staff Emeritus
He never says that the $g_1,...,g_k$ are elements of $L$.

The $g_1,...,g_k$ are just random elements of $K$. And you only know that they are linearly independent over $L$.

7. Mar 7, 2014

### Chacabucogod

Ok... What does he mean then with:

$\alpha_1 g_1+...+\alpha_k g_k$ belongs to $L$

8. Mar 7, 2014

### micromass

Staff Emeritus
He means that if $\alpha_1 g_1 + ... + \alpha_kg_k\in L$, then $\alpha_1 = ...=\alpha_k = 0$.

So the only linear combination of the $\{g_1,...,g_k\}$ belonging to $L$ is the zero vector. And the fact that some linear combination belongs to $L$ does not mean that the $g_j$ individually belong to $L$.

9. Mar 8, 2014

### Chacabucogod

Do you have an example? Because I still can't grasp the idea. Thank you

10. Mar 8, 2014

### micromass

Staff Emeritus
Take $K = \mathbb{R}^2$ and $L$ the $x$-axis, thus $L = \mathrm{span}\{(1,0)\}$.

Then $(1,1)$ and $(1,-1)$ are not independent over $L$ because

$$1\cdot (1,1) + 1\cdot (1,-1) = (2,0)\in L$$

but the coefficients are not all $0$.

Consider however $K=\mathbb{R}^3$ and $L = \mathrm{span}\{(1,0,0)\}$, the $x$-axis again. Then $(0,1,0)$ and $(0,0,1)$ are independent over $L$. Indeed, take any linear combination

$$x:=\alpha(0,1,0) + \beta(0,0,1) = (0,\alpha,\beta)$$

if this were in $L$, then there would exist some scalar $c$ such that $x = c(1,0,0)$. But then

$$(c,0,0) = (0,\alpha,\beta)$$

which is clearly only possible if $c=\alpha=\beta=0$.
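If it helps to experiment: independence of $g_1,...,g_k$ over $L$ is equivalent to the combined set (a basis of $L$ together with the $g_j$) being linearly independent in $K$, so both examples above can be checked with a rank computation. A minimal sketch in Python (the function name is mine, and I'm assuming NumPy is available):

```python
import numpy as np

def independent_over_L(L_basis, gs):
    # g_1,...,g_k are independent over L exactly when the stacked set
    # {basis of L} ∪ {g_1,...,g_k} is linearly independent in K,
    # i.e. when the matrix of all these rows has full row rank.
    stacked = np.vstack(L_basis + gs)
    return np.linalg.matrix_rank(stacked) == len(stacked)

# First example: K = R^2, L = span{(1,0)}
print(independent_over_L([(1, 0)], [(1, 1), (1, -1)]))          # False

# Second example: K = R^3, L = span{(1,0,0)}
print(independent_over_L([(1, 0, 0)], [(0, 1, 0), (0, 0, 1)]))  # True
```

The first call fails because three vectors in $\mathbb{R}^2$ can never be independent; the second succeeds, matching the hand computation above.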

11. Mar 8, 2014

### Chacabucogod

Ok, now I get it. Thanks a lot for taking the time to answer my questions, and for your patience.

12. Mar 8, 2014

### micromass

Staff Emeritus
I think that these things will make more sense to you once you get to quotient spaces. Right now, I think the concept is pulled a bit out of the air.