Linear Independence: Determining Vectors' Dependence

  • Thread starter: Gregg
  • Tags: Linear
AI Thread Summary
The discussion focuses on determining the linear independence of various sets of vectors, given that the vectors a, b, and c are linearly independent. The first set, {a, 0}, is dependent because it includes the zero vector. The second set, {a+b, b+c, c+a}, is shown to be independent because the corresponding coefficient equations admit only the trivial solution. In contrast, the third set, {a+2b+c, a-b-c, 5a+b-c}, is dependent, as demonstrated by a determinant calculation resulting in zero. The importance of understanding the definitions of linear independence and dependence is emphasized, highlighting that the distinction lies in the uniqueness of solutions to the vector equation.
Gregg
linear independence

Homework Statement



Given that a, b and c are linearly independent vectors, determine whether the following sets of vectors are linearly independent.

a) a,0

b) a+b, b+c, c+a

c) a+2b+c, a-b-c, 5a+b-c

The Attempt at a Solution



I'm not sure how to tackle the question in this form.

Edit:

a) 0 = 0a (the zero vector is a scalar multiple of a). Dependent

(b)
$$\text{Det}\left(
\begin{array}{ccc}
1 & 0 & 1 \\
1 & 1 & 0 \\
0 & 1 & 1
\end{array}
\right)=2$$

Independent

(c)
$$\text{Det}\left(
\begin{array}{ccc}
1 & 1 & 5 \\
2 & -1 & 1 \\
1 & -1 & -1
\end{array}
\right)=0$$

Dependent
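A quick numerical sanity check of those two determinants, sketched with NumPy (the matrices are just the coefficient arrays from parts (b) and (c); any CAS would do the same):

Code:
import numpy as np

# Coefficients of a, b, c for the vectors in parts (b) and (c)
M_b = np.array([[1, 0, 1],
                [1, 1, 0],
                [0, 1, 1]])
M_c = np.array([[1, 1, 5],
                [2, -1, 1],
                [1, -1, -1]])

print(np.linalg.det(M_b))  # 2.0 (nonzero -> independent)
print(np.linalg.det(M_c))  # 0.0 up to rounding (zero -> dependent)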

Is it OK to use those vector coefficients in a matrix like that?
 
Where'd you get those coefficient matrices from?
What is the definition of vectors being linearly dependent?
 
That actually works. For example, if you row-reduce the matrix in c), you get
$$\left(\begin{array}{ccc} 1 & 0 & 2 \\ 0 & 1 & 3 \\ 0 & 0 & 0 \end{array}\right)$$

This says that ##c_1 = -2c_3##, ##c_2 = -3c_3##, and ##c_3 = c_3##. If you take ##c_3 = 1##, then ##c_1 = -2## and ##c_2 = -3##. That linear combination of the vectors given in part c) results in a sum of 0, thus demonstrating that the set is linearly dependent. Note the spelling of "dependent", Gregg. Similarly for "independent".
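Written out, that combination is
$$-2(a+2b+c) - 3(a-b-c) + 1(5a+b-c) = (-2-3+5)a + (-4+3+1)b + (-2+3-1)c = 0.$$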
 
Mark44 said:
That actually works. ... Note the spelling of "dependent", Gregg. Similarly for "independent".

whoops
 
I think it is always better to use the basic definitions than to memorize a specific method without understanding it.

The definition of "dependent" for a set of vectors ##\{v_1, v_2, \cdots, v_n\}## is that there are numbers ##a_1, a_2, \cdots, a_n##, not all 0, such that ##a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0##.

For the first problem, ##\{a, 0\}##, take ##a_1 = 0##, ##a_2 = 1##: ##a_1 a + a_2 0 = 0(a) + 1(0) = 0##.

For (b), with a+b, b+c, c+a: if ##a_1(a+b) + a_2(b+c) + a_3(c+a) = 0## then ##(a_1 + a_3)a + (a_1 + a_2)b + (a_2 + a_3)c = 0##. Since a, b, and c are independent, we must have ##a_1 + a_3 = 0##, ##a_1 + a_2 = 0##, and ##a_2 + a_3 = 0##. Obviously, ##a_1 = a_2 = a_3 = 0## satisfies that, but is it the only solution?
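One way to answer that question: adding the first and third equations and subtracting the second gives ##2a_3 = 0##, so ##a_3 = 0##; the first and third equations then force ##a_1 = a_2 = 0## as well. So the only solution is the trivial one, and the set in (b) is independent, in agreement with the nonzero determinant above.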
 
I couldn't agree with HallsOfIvy more about the importance of understanding definitions as opposed to memorizing a technique without understanding why you are doing it. Too often students get tangled up in the details of calculating a determinant or row reducing a matrix without understanding what it means that the determinant is zero or why the matrix should be row reduced.

The definition of linear independence of a set of vectors is stated very simply, but there is a subtlety to it that escapes many students. The only thing that distinguishes a set of linearly independent vectors from a set that is linearly dependent is whether the equation ##c_1 v_1 + c_2 v_2 + c_3 v_3 + \cdots + c_n v_n = 0## has only one solution (the trivial one, all ##c_i = 0##, for independent vectors) or infinitely many solutions (for dependent vectors).
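That "one solution versus infinitely many" distinction can be checked directly. A minimal SymPy sketch (the equations are the coefficient conditions for parts (b) and (c) above; the symbols c1, c2, c3 are just illustrative names):

Code:
import sympy as sp

c1, c2, c3 = sp.symbols('c1 c2 c3')

# Part (b): c1(a+b) + c2(b+c) + c3(c+a) = 0, so each coefficient of a, b, c must vanish
print(sp.linsolve([c1 + c3, c1 + c2, c2 + c3], c1, c2, c3))
# {(0, 0, 0)} -- only the trivial solution: independent

# Part (c): c1(a+2b+c) + c2(a-b-c) + c3(5a+b-c) = 0
print(sp.linsolve([c1 + c2 + 5*c3, 2*c1 - c2 + c3, c1 - c2 - c3], c1, c2, c3))
# {(-2*c3, -3*c3, c3)} -- a free parameter: infinitely many solutions, dependent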
 