Verifying whether my working is correct in showing Linear Independence

SUMMARY

The discussion focuses on verifying the linear independence of two sets of vectors: e1=[1 1 0]^T, e2=[1 0 1]^T, e3=[0 1 1]^T and e1=[1 1 0]^T, e2=[1 0 -1]^T, e3=[0 1 1]^T. The method calculates the determinant of the 3x3 matrix whose columns are the vectors: a zero determinant indicates linear dependence, a non-zero determinant linear independence. The participants agree that the first set is independent, while there is confusion over the second set, which is in fact linearly dependent (e2 = e1 - e3, so the determinant is zero) even though the problem statement asks to show independence.

PREREQUISITES
  • Understanding of linear algebra concepts, specifically linear independence and dependence.
  • Familiarity with matrix operations, including determinant calculation.
  • Knowledge of vector notation and operations in R^3.
  • Basic understanding of systems of linear equations.
NEXT STEPS
  • Study the properties of determinants in linear algebra.
  • Learn how to apply the Rank-Nullity Theorem to determine linear independence.
  • Explore the concept of basis and dimension in vector spaces.
  • Practice solving systems of linear equations using Gaussian elimination.
USEFUL FOR

Students and educators in mathematics, particularly those studying linear algebra, as well as anyone involved in vector space analysis and applications in engineering or physics.

savva

Homework Statement


I have attempted the questions below but am not sure if I am applying the method correctly to show linear dependence/independence.

a) Show that the vectors
e1=[1 1 0]T, e2=[1 0 1]T, e3=[0 1 1]T
are linearly independent

b) Show that the vectors
e1=[1 1 0]T, e2=[1 0 -1]T, e3=[0 1 1]T
are linearly independent

(T = transpose)

Homework Equations


The determinant

The Attempt at a Solution


I put the vectors into a 3x3 matrix and computed its determinant: a determinant of 0 should indicate linear dependence, and a non-zero determinant should indicate linear independence. My working and the questions are attached in a pdf file with this thread.

I'd greatly appreciate any help
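A minimal sketch of the determinant check described above, in plain Python (the `det3` helper is not from the thread; it just expands a 3x3 determinant along the first row). The matrices take the vectors from parts (a) and (b) as columns:

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows (cofactor expansion)."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Part (a): columns [1 1 0]^T, [1 0 1]^T, [0 1 1]^T
A = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]

# Part (b): columns [1 1 0]^T, [1 0 -1]^T, [0 1 1]^T
B = [[1, 1, 0],
     [1, 0, 1],
     [0, -1, 1]]

print(det3(A))  # -2 -> non-zero, so the vectors in (a) are independent
print(det3(B))  # 0  -> zero, so the vectors in (b) are dependent
```

This agrees with the worked answer: set (a) is independent, set (b) is dependent.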
 

Do you know the basic definition of "independent", "dependent" vectors?

A set of vectors \{v_1, v_2, \cdots, v_n\} is "independent" if and only if, in order to have a_1v_1+ a_2v_2+ \cdots+ a_nv_n= 0, we must have a_1= a_2= \cdots= a_n= 0.

Here, such a sum would be of the form
a_1\begin{bmatrix}1 \\ 1 \\ 0 \end{bmatrix}+ a_2\begin{bmatrix}1 \\ 0 \\ 1 \end{bmatrix}+ a_3\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}= \begin{bmatrix}0\\ 0 \\ 0 \end{bmatrix}
Of course multiplying the scalars and adding that is the same as
\begin{bmatrix}a_1+ a_2 \\ a_1+ a_3 \\ a_2+ a_3\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}

which, in turn, is equivalent to the three equations
a_1+ a_2= 0, a_1+ a_3= 0, a_2+ a_3= 0

a_1= a_2= a_3= 0 is obviously a solution to that system of equations. Is it the only one? (If so, the vectors are independent; if there exists another, non-trivial solution, they are dependent.)

Of course, one can determine whether or not a system of equations has a unique solution by looking at the determinant of the coefficients. As you say, these sets of vectors are independent (I would not say "independence exists").
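The substitution argument sketched above can be written out step by step. This is a minimal Python sketch (not from the thread) of solving the component system a_1+a_2=0, a_1+a_3=0, a_2+a_3=0 by elimination:

```python
def solve_system():
    """Solve a1+a2=0, a1+a3=0, a2+a3=0 by substitution."""
    # eq1: a1 + a2 = 0  ->  a2 = -a1
    # eq2: a1 + a3 = 0  ->  a3 = -a1
    # eq3: a2 + a3 = 0  ->  (-a1) + (-a1) = -2*a1 = 0  ->  a1 = 0
    a1 = 0
    a2 = -a1
    a3 = -a1
    return a1, a2, a3

print(solve_system())  # (0, 0, 0): only the trivial solution, so set (a) is independent
```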
 
HallsofIvy said:
Do you know the basic definition of "independent", "dependent" vectors?

A set of vectors \{v_1, v_2, \cdots, v_n\} is "independent" if and only if, in order to have a_1v_1+ a_2v_2+ \cdots+ a_nv_n= 0, we must have a_1= a_2= \cdots= a_n= 0.

Here, such a sum would be of the form
a_1\begin{bmatrix}1 \\ 1 \\ 0 \end{bmatrix}+ a_2\begin{bmatrix}1 \\ 0 \\ 1 \end{bmatrix}+ a_3\begin{bmatrix}0 \\ 1 \\ 1\end{bmatrix}= \begin{bmatrix}0\\ 0 \\ 0 \end{bmatrix}
Of course multiplying the scalars and adding that is the same as
\begin{bmatrix}a_1+ a_2 \\ a_1+ a_3 \\ a_2+ a_3\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0\end{bmatrix}

which, in turn, is equivalent to the three equations
a_1+ a_2= 0, a_1+ a_3= 0, a_2+ a_3= 0

a_1= a_2= a_3= 0 is obviously a solution to that system of equations. Is it the only one? (If so, the vectors are independent; if there exists another, non-trivial solution, they are dependent.)

Of course, one can determine whether or not a system of equations has a unique solution by looking at the determinant of the coefficients. As you say, these sets of vectors are independent (I would not say "independence exists").

I do not understand what you mean by "As you say, these sets of vectors are independent (I would not say "independence exists")".
From my calculations I found the first question to be independent and the second dependent. What do you mean you would not say independence exists?

I don't understand how a1+a2=0 or a1+a3=0
if you add these up;
a1+a2= [1 1 0] + [1 0 1] = [2 1 1]
a1+a3= [1 1 0] + [0 1 1] = [1 2 1]

how do you get them to = 0?
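The confusion above comes from treating a_1, a_2, a_3 as the vectors themselves; in the definition they are scalars multiplying the vectors. A small Python sketch (not from the thread) makes this concrete for set (b), where e2 = e1 - e3 supplies a non-trivial choice of scalars:

```python
# Vectors from part (b)
e1 = [1, 1, 0]
e2 = [1, 0, -1]
e3 = [0, 1, 1]

# The a_i are SCALARS, one per vector. Since e2 = e1 - e3,
# the choice (a1, a2, a3) = (1, -1, -1) gives a non-trivial
# combination a1*e1 + a2*e2 + a3*e3 equal to the zero vector.
a1, a2, a3 = 1, -1, -1
combo = [a1 * x + a2 * y + a3 * z for x, y, z in zip(e1, e2, e3)]
print(combo)  # [0, 0, 0] -> a non-trivial solution exists, so set (b) is dependent
```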
 
