# Linear Algebra: Linear Independence

## Homework Statement

Let S be a basis for an n-dimensional vector space V. Show that if v1, v2, ..., vr form a linearly independent set of vectors in V, then the coordinate vectors (v1)_S, (v2)_S, ..., (vr)_S form a linearly independent set in R^n, and conversely.

## The Attempt at a Solution

I tried working this problem but got stuck almost at the end. I know that to show the coordinate vectors form a linearly independent set, the equation

k1((v1)_S) + k2((v2)_S) + ... + kr((vr)_S) = 0

must have only the trivial solution k1 = k2 = ... = kr = 0. Could I please get some help? I wrote v1, v2, ..., vr as linear combinations of the basis S, which I defined as S = {w1, w2, ..., wn}.

Dick
Homework Helper
If v1 is a vector, then what is (v1)s supposed to mean?

It is notation. It is called the coordinate vector of v1 relative to S.

For example, v1 can be written as a linear combination of the basis S:

v1 = c1(w1) + c2(w2) + ... + cn(wn)

thus

(v1)_S = (c1, c2, ..., cn)
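As a quick numerical illustration (not part of the proof), here is a sketch in NumPy with a made-up basis of R^2: if the basis vectors w1, w2 are the columns of a matrix W, then the coordinate vector (v1)_S is just the solution c of W c = v1, since v1 = c1 w1 + c2 w2.

```python
import numpy as np

# Hypothetical basis S = {w1, w2} for R^2, stored as the columns of W.
W = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # w1 = (1, 0), w2 = (1, 1)
v1 = np.array([3.0, 2.0])

# (v1)_S solves W @ c = v1, i.e. v1 = c1*w1 + c2*w2.
c = np.linalg.solve(W, v1)
print(c)                        # [1. 2.], so v1 = 1*w1 + 2*w2
```

Here indeed 1·(1, 0) + 2·(1, 1) = (3, 2), so the coordinate vector is (1, 2).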

Fredrik
Staff Emeritus
You need to show that
$$\sum_i k_i v_i=0\ \Rightarrow\ \forall i~~k_i=0$$ and
$$\sum_i k_i (v_i)_S=0\ \Rightarrow\ \forall i~~k_i=0$$
are either both true or both false. You can do this by proving that ##\sum_i k_i (v_i)_S=\big(\sum_i k_i v_i\big)_S##. This is an equation between vectors in ##\mathbb R^n##, so it holds if and only if the jth components of the left-hand side and the right-hand side are equal for all j.
For an arbitrary ##x\in V##, write
\begin{align} x &=\sum_j x_j w_j\\ x_S &=\sum_j (x_S)_j e_j=\sum_j x_j e_j, \end{align} where the ##e_j## are the standard basis vectors for ##\mathbb R^n##. The important detail here is that ##(x_S)_j=x_j##, by definition of the "S" notation.
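A numerical sanity check of the two key facts above, with an invented basis of R^3: the coordinate map is linear, so ##\big(\sum_i k_i v_i\big)_S = \sum_i k_i (v_i)_S##, and therefore a set of vectors and its set of coordinate vectors have the same rank (both equal r exactly when the set is independent). This is only a spot check, not a proof.

```python
import numpy as np

# Hypothetical basis S for R^3 (columns of W) and two independent vectors.
W = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
v1 = np.array([2.0, 1.0, 3.0])
v2 = np.array([0.0, 1.0, 1.0])

# Coordinate vectors relative to S: solve W @ c = v.
c1 = np.linalg.solve(W, v1)
c2 = np.linalg.solve(W, v2)

# Linearity of the coordinate map: (a*v1 + b*v2)_S = a*(v1)_S + b*(v2)_S.
a, b = 2.0, -1.0
lhs = np.linalg.solve(W, a*v1 + b*v2)
assert np.allclose(lhs, a*c1 + b*c2)

# Independence transfers: both matrices below have rank 2.
print(np.linalg.matrix_rank(np.column_stack([v1, v2])),
      np.linalg.matrix_rank(np.column_stack([c1, c2])))   # 2 2
```

Since W is invertible, multiplying by W^{-1} (which is what computing coordinates does) can never change the rank, which is the matrix-level version of the equivalence being proved.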