Linear Algebra: Linear Independence

SUMMARY

The discussion centers on proving that the coordinate map associated with a basis S of an n-dimensional vector space V preserves linear independence: vectors v1, v2, ..., vr are linearly independent in V if and only if their coordinate vectors (v1)s, (v2)s, ..., (vr)s are linearly independent in Rn. The key to the proof is the identity k1(v1)s + k2(v2)s + ... + kr(vr)s = (k1v1 + k2v2 + ... + krvr)s, verified componentwise, which shows that a linear combination of the vectors vanishes exactly when the corresponding combination of their coordinate vectors does, so both implications of linear independence are satisfied.

PREREQUISITES
  • Understanding of vector spaces and bases in linear algebra.
  • Familiarity with the concept of linear independence.
  • Knowledge of coordinate vectors and their representation.
  • Basic proficiency in matrix equations and operations.
NEXT STEPS
  • Study the properties of vector spaces and bases in linear algebra.
  • Learn about the implications of linear independence in different contexts.
  • Explore coordinate transformations and their applications in Rn.
  • Review matrix theory, focusing on solving linear equations and their geometric interpretations.
USEFUL FOR

Students and educators in mathematics, particularly those studying linear algebra, as well as anyone involved in theoretical physics or engineering disciplines that utilize vector spaces and linear transformations.

DerivativeofJ

Homework Statement



Let S be a basis for an n-dimensional vector space V. Show that if v1, v2, ..., vr form a linearly independent set of vectors in V, then the coordinate vectors (v1)s, (v2)s, ..., (vr)s form a linearly independent set in Rn, and conversely.


Homework Equations





The Attempt at a Solution



I tried working this problem but I got stuck almost at the end. I know that to show that the coordinate vectors form a linearly independent set, the following equation

k1((v1)s) + k2((v2)s) + ... + kr((vr)s) = 0 has to have only the trivial solution. I wrote v1, v2, ..., vr as linear combinations of the basis S, which I defined as S = {w1, w2, ..., wn}. Could I please get some help?
 
If v1 is a vector, then what is (v1)s supposed to mean?
 
It is notation: (v1)s denotes the coordinate vector of v1 relative to S.

For example, v1 can be written as a linear combination of the basis vectors in S:

v1= c1(w1)+ c2(w2)+...+ cn(wn)

thus

(v1)s = (c1, c2, ..., cn)
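
As a concrete illustration (this example is not from the original exchange): if V is the space of polynomials of degree at most 2 and ##S=\{1,\,x,\,x^2\}##, then the polynomial ##p = 2 + 3x - x^2## has
$$p = 2\cdot 1 + 3\cdot x + (-1)\cdot x^2,\qquad p_S = (2,\ 3,\ -1)\in\mathbb R^3.$$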
 
You want to prove that the two implications
$$\sum_i k_i v_i=0\ \Rightarrow\ \forall i~~k_i=0$$ and
$$\sum_i k_i (v_i)_S=0\ \Rightarrow\ \forall i~~k_i=0$$ are either both true or both false. You can do this by proving that ##\sum_i k_i (v_i)_S=\big(\sum_i k_i v_i\big)_S##. This is an equality of vectors in ##\mathbb R^n## (column matrices, if you like), so it holds if and only if the jth components of the left-hand side and the right-hand side are equal for all j.
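
To spell out why that identity is enough (a step left implicit here): a vector of V is zero if and only if its coordinate vector is zero, since ##u=\sum_j u_j w_j=0## exactly when every coordinate ##u_j## vanishes, by uniqueness of the expansion in the basis S. Combining this with the identity gives
$$\sum_i k_i v_i = 0\ \iff\ \Big(\sum_i k_i v_i\Big)_S = 0\ \iff\ \sum_i k_i (v_i)_S = 0,$$
so the two implications above have the same hypothesis and the same conclusion, and hence are either both true or both false.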

The following observation is useful. For all vectors x, we have
$$
\begin{align}
x &=\sum_j x_j w_j\\
x_S &=\sum_j (x_S)_j e_j=\sum_j x_j e_j,
\end{align}
$$ where the ##e_j## are the standard basis vectors for ##\mathbb R^n##. The important detail here is that ##(x_S)_j=x_j##, by definition of the "S" notation.
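
A sketch of the remaining computation using this observation (not written out explicitly in the thread): taking the jth coordinate relative to S is linear, because if ##x=\sum_j x_j w_j## and ##y=\sum_j y_j w_j##, then ##ax+by=\sum_j (ax_j+by_j)w_j##, and coordinates relative to a basis are unique. Together with ##(x_S)_j=x_j##, this gives, for every j,
$$\Big(\sum_i k_i (v_i)_S\Big)_j=\sum_i k_i\big((v_i)_S\big)_j=\sum_i k_i (v_i)_j=\Big(\sum_i k_i v_i\Big)_j=\bigg(\Big(\sum_i k_i v_i\Big)_S\bigg)_j,$$
so the two sides of ##\sum_i k_i (v_i)_S=\big(\sum_i k_i v_i\big)_S## agree in every component, which is exactly what was needed.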
 
Thank You!
 
