Verifying whether my working is correct in showing Linear Independence

AI Thread Summary
The discussion centers on verifying the linear independence of two sets of vectors. The method involves calculating the determinant of a matrix formed by the vectors; a non-zero determinant indicates linear independence. The first set of vectors is confirmed to be independent, while the second set is found to be dependent based on the calculations presented. There is confusion regarding the interpretation of equations derived from the vector combinations, particularly how they relate to achieving a zero vector. Clarification is sought on the definitions and implications of linear dependence and independence in vector spaces.
savva
Messages
39
Reaction score
0

Homework Statement


I have attempted the questions below but am not sure if I am applying the method correctly to show linear dependence/independence.

a)Show that the vectors
e1=[1 1 0]T, e2=[1 0 1]T, e3=[0 1 1]T
are linearly independent

b) Determine whether the vectors
e1=[1 1 0]T, e2=[1 0 -1]T, e3=[0 1 1]T
are linearly independent or dependent

(T = transpose)

Homework Equations


The determinant

The Attempt at a Solution


I put the vectors as columns of a 3x3 matrix and computed the determinant: when it equals 0 the vectors are linearly dependent, and when it is non-zero they are linearly independent. My working and the questions are attached in a pdf file with this thread.
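The determinant test can be sketched in a few lines of Python (a sketch of my own, not taken from the attached working; the helper name det3 is made up):

```python
# Determinant test for linear independence: place the three vectors as the
# columns of a 3x3 matrix; a non-zero determinant means independence.

def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Part (a): columns e1 = [1 1 0]T, e2 = [1 0 1]T, e3 = [0 1 1]T
A = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]

# Part (b): columns e1 = [1 1 0]T, e2 = [1 0 -1]T, e3 = [0 1 1]T
B = [[1, 1, 0],
     [1, 0, 1],
     [0, -1, 1]]

print(det3(A))  # -2 (non-zero): set (a) is linearly independent
print(det3(B))  # 0: set (b) is linearly dependent
```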

I'd greatly appreciate any help
 

Attachments

Do you know the basic definition of "independent", "dependent" vectors?

A set of vectors \{v_1, v_2, \cdots, v_n\} is "independent" if and only if, in order to have a_1v_1+ a_2v_2+ \cdots+ a_nv_n= 0, we must have a_1= a_2= \cdots= a_n= 0.

Here, such a sum would be of the form
a_1\begin{bmatrix}1 \\ 1 \\ 0 \end{bmatrix}+ a_2\begin{bmatrix}1 \\ 0 \\ 1 \end{bmatrix}+ a_3\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}= \begin{bmatrix}0\\ 0 \\ 0 \end{bmatrix}
Of course multiplying the scalars and adding that is the same as
\begin{bmatrix}a_1+ a_2 \\ a_1+ a_3 \\ a_2+ a_3\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}

which, in turn, is equivalent to the three equations
a_1+ a_2= 0, a_1+ a_3= 0, a_2+ a_3= 0

a_1= a_2= a_3= 0 is obviously a solution to that system of equations. Is it the only one? If so, the vectors are independent; if there exists another, non-trivial solution, they are dependent.

Of course, one can determine whether or not a system of equations has a unique solution by looking at the determinant of the coefficients. As you say, these vectors are independent (I would not say "independence exists").
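To spell out why the trivial solution is the only one here (an elimination step of my own, reading the component equations a_1+ a_2= 0, a_1+ a_3= 0, a_2+ a_3= 0 directly off the vectors):

Subtracting the second equation from the first gives a_2- a_3= 0, so a_2= a_3. The third equation then gives 2a_2= 0, so a_2= a_3= 0, and the first equation gives a_1= 0. Hence only the trivial solution exists, and the set in (a) is independent.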
 
HallsofIvy said:
Here, such a sum would be of the form
a_1\begin{bmatrix}1 \\ 1 \\ 0 \end{bmatrix}+ a_2\begin{bmatrix}1 \\ 0 \\ 1 \end{bmatrix}+ a_3\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}= \begin{bmatrix}0\\ 0 \\ 0 \end{bmatrix}
...
which, in turn, is equivalent to the three equations
a_1+ a_2= 0, a_1+ a_3= 0, a_2+ a_3= 0
...
As you say, these vectors are independent (I would not say "independence exists").

I do not understand what you mean by "As you say, these vectors are independent (I would not say 'independence exists')".
From my calculations I found the first set to be independent and the second dependent. What do you mean, you would not say independence exists?

I don't understand how a1 + a2 = 0 or a1 + a3 = 0. If you add these up:
a1 + a2 = [1 1 0]T + [1 0 1]T = [2 1 1]T
a1 + a3 = [1 1 0]T + [0 1 1]T = [1 2 1]T

How do you get them to equal 0?
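(Note: in that sum the a_i are scalar coefficients, not the vectors themselves. A short Python check of my own, showing that for set (b) the non-trivial scalars a1 = 1, a2 = -1, a3 = -1 do send the combination to the zero vector, i.e. e1 = e2 + e3:)

```python
# The a_i are scalars multiplying the vectors of set (b).
# The choice a1 = 1, a2 = -1, a3 = -1 gives a1*e1 + a2*e2 + a3*e3 = 0,
# which is exactly what "linearly dependent" means here (e1 = e2 + e3).
e1 = [1, 1, 0]
e2 = [1, 0, -1]
e3 = [0, 1, 1]

combo = [1 * x + (-1) * y + (-1) * z for x, y, z in zip(e1, e2, e3)]
print(combo)  # [0, 0, 0]
```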
 