Homework Help: Linear Algebra: Linear Independence

1. Apr 4, 2012

DerivativeofJ

1. The problem statement, all variables and given/known data

Let S be a basis for an n-dimensional vector space V. Show that if v1, v2, ..., vr form a linearly independent set of vectors in V, then the coordinate vectors (v1)s, (v2)s, ..., (vr)s form a linearly independent set in R^n, and conversely.

2. Relevant equations

3. The attempt at a solution

I tried working this problem but I got stuck almost at the end. I know that to show the coordinate vectors form a linearly independent set, the equation

k1((v1)s) + k2((v2)s) + ... + kr((vr)s) = 0

must have only the trivial solution. I wrote v1, v2, ..., vr as linear combinations of the basis S, which I defined as S = {w1, w2, ..., wn}. Could I please get some help?

2. Apr 4, 2012

Dick

If v1 is a vector, then what is (v1)s supposed to mean?

3. Apr 4, 2012

DerivativeofJ

It is notation: (v1)s is called the coordinate vector of v1 relative to S.

For example, v1 can be written as a linear combination of the basis S:

v1 = c1(w1) + c2(w2) + ... + cn(wn)

thus

(v1)s = (c1, c2, ..., cn)
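A quick made-up illustration of this notation (the basis and vector here are my own choice, not from the problem): take V = R^2 with basis S = {w1, w2}, where w1 = (1,0) and w2 = (1,1). Then

\[
  v_1 = (3,2) = 1\cdot w_1 + 2\cdot w_2
  \quad\Longrightarrow\quad
  (v_1)_S = (1,2).
\]

Note that (v1)s depends on the choice of S; relative to the standard basis the coordinate vector of v1 would just be (3,2).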

4. Apr 5, 2012

Fredrik

Staff Emeritus
You want to prove that the two implications
$$\sum_i k_i v_i=0\ \Rightarrow\ \forall i~~k_i=0$$ and
$$\sum_i k_i (v_i)_S=0\ \Rightarrow\ \forall i~~k_i=0$$ are either both true or both false. You can do this by proving that $\sum_i k_i (v_i)_S=\big(\sum_i k_i v_i\big)_S$. This is an equality of vectors in $\mathbb R^n$, so it holds if and only if the jth components of the left-hand side and the right-hand side are equal for all j.

The following observation is useful. For all vectors x, we have
\begin{align} x &=\sum_j x_j w_j\\ x_S &=\sum_j (x_S)_j e_j=\sum_j x_j e_j, \end{align} where the $e_j$ are the standard basis vectors for $\mathbb R^n$. The important detail here is that $(x_S)_j=x_j$, by definition of the "S" notation.
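Using that observation, the key identity above can be checked one component at a time; a sketch of the chain of equalities (each step is either linearity in $\mathbb R^n$ or the fact $(x_S)_j = x_j$):

\[
  \Big(\sum_i k_i (v_i)_S\Big)_j
  = \sum_i k_i \big((v_i)_S\big)_j
  = \sum_i k_i (v_i)_j
  = \Big(\sum_i k_i v_i\Big)_j
  = \Big(\big(\textstyle\sum_i k_i v_i\big)_S\Big)_j .
\]

Since this holds for every j, the two sides are equal as vectors, which is exactly the identity you need.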

5. Apr 5, 2012

Thank You!