Linear Independence: Answer to Homework

Summary: the determinant of the coefficient matrix is nonzero, so ##A^{-1}## exists and the system has a unique solution, namely ##k = 0##. The only solution to the system of equations formed by the linear combination is therefore the trivial one, which means the vectors ##a+b##, ##b+c##, and ##a+c## are linearly independent: none of them can be written as a non-trivial combination of the others, which is exactly the definition of linear independence.
  • #1
nuuskur

Homework Statement


Assume vectors ##a,b,c\in V_{\mathbb{R}}## to be linearly independent. Determine whether vectors ##a+b , b+c , a+c## are linearly independent.

Homework Equations

The Attempt at a Solution


We say the vectors ##a,b,c## are linearly independent when ##k_1a + k_2b + k_3c = 0## holds only for ##k_1 = k_2 = k_3 = 0##, i.e. the only solution is the trivial combination.
Does there exist a non-trivial combination such that
##k_1(a+b) + k_2(b+c) + k_3(a+c) = 0##? Distributing:
##k_1a + k_1b + k_2b + k_2c + k_3a + k_3c = (k_1 + k_3)a + (k_1+k_2)b + (k_2+k_3)c = 0##
Since ##a,b,c## are linearly independent, this can only occur when:
##k_1+k_3 =0\Rightarrow k_1 = -k_3##
##k_1+k_2 =0##
##k_2+k_3 =0##

Substituting eq. 1 into eq. 2, we arrive at ##k_2 - k_3 = 0##, and according to eq. 3, ##k_2 + k_3 = 0##. Hence ##k_2 - k_3 = k_2 + k_3##, which gives ##k_3 = 0##, because ##k = -k## only if ##k = 0##. Then ##k_1 = -k_3 = 0## and ##k_2 = -k_3 = 0##. The only solution is the trivial combination, therefore the vectors ##a+b, b+c, a+c## are linearly independent.
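As a quick sanity check (an editorial addition, not part of the original post), the same system can be handed to a computer algebra system; a minimal sketch assuming sympy is available:

```python
from sympy import symbols, linsolve

# The coefficients of a, b, c must each vanish, giving the system
#   k1 + k3 = 0,   k1 + k2 = 0,   k2 + k3 = 0
k1, k2, k3 = symbols('k1 k2 k3')
print(linsolve([k1 + k3, k1 + k2, k2 + k3], [k1, k2, k3]))
# -> {(0, 0, 0)}: only the trivial combination exists
```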
 
  • #2
Mark44
nuuskur said:
The only solution is the trivial combination, therefore the vectors ##a+b, b+c, a+c## are linearly independent.
That works for me. (IOW, I agree that the three new vectors are linearly independent.)

Instead of working with the system of equations, you can set up a matrix and row reduce it. If you end up with the identity matrix, what that says is that ##k_1 = k_2 = k_3 = 0##, and that there are no other solutions.

The matrix looks like this, from your system:
$$\begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1\end{bmatrix}$$
After a few row operations, the final matrix is ##I_3##.
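To illustrate the row reduction claim concretely (an editorial sketch, assuming sympy):

```python
from sympy import Matrix, eye

# Coefficient matrix of the system; columns correspond to k1, k2, k3
A = Matrix([[1, 0, 1],
            [1, 1, 0],
            [0, 1, 1]])

rref_A, pivot_cols = A.rref()   # reduced row echelon form
print(rref_A)                   # Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(rref_A == eye(3))         # True: the reduced form is I_3
```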
 
  • #3
Ray Vickson
Mark44 said:
Instead of working with the system of equations, you can set up a matrix and row reduce it. If you end up with the identity matrix, what that says is that ##k_1 = k_2 = k_3 = 0##, and that there are no other solutions.

Alternatively, you can compute the determinant of the matrix to find that it is nonzero. What would that tell you?
 
  • #4
nuuskur
Ray Vickson said:
Alternatively, you can compute the determinant of the matrix to find that it is nonzero. What would that tell you?
Oh. Cramer's rule.
##k_n = \frac{D_{k_n}}{D}##, and since the determinant ##D## of the system is nonzero while each ##D_{k_n}## is zero (its ##n##-th column is replaced by the zero right-hand side, and a full column of zeros means the determinant is zero), it follows that ##k_1 = k_2 = k_3 = 0##.
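A small check of this Cramer's-rule argument (an editorial sketch, assuming sympy): replacing any column of the coefficient matrix by the zero right-hand side gives a zero determinant, while ##D## itself works out to ##2##.

```python
from sympy import Matrix, zeros

A = Matrix([[1, 0, 1],
            [1, 1, 0],
            [0, 1, 1]])
b = zeros(3, 1)          # homogeneous right-hand side

print(A.det())           # D = 2, nonzero
for n in range(3):
    A_n = A.copy()
    A_n[:, n] = b        # replace column n by the zero vector
    print(A_n.det())     # 0 each time, so each k_n = 0 / 2 = 0
```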
 
  • #5
Ray Vickson
nuuskur said:
Oh. Cramer's rule.

No, I was not referring to Cramer's rule (which is rarely actually used when solving equations). I was referring to the theorem that if ##\det(A) \neq 0##, then the ##n \times n## system ##Ak = 0## has ##k = 0## as its only solution.
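A minimal numerical illustration of that theorem (my own sketch, assuming numpy):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

print(np.linalg.det(A))                 # about 2.0 (nonzero, up to rounding)
print(np.linalg.solve(A, np.zeros(3)))  # [0. 0. 0.]: the only solution of Ak = 0
```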
 

FAQ: Linear Independence: Answer to Homework

What is linear independence?

Linear independence is a concept in linear algebra that describes the relationship between vectors in a vector space. A set of vectors is linearly independent if none of the vectors can be expressed as a linear combination of the others.
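For a concrete illustration (an editorial addition): in ##\mathbb{R}^2##,
$$k_1(1,0) + k_2(0,1) = (k_1, k_2) = (0,0) \quad\Longleftrightarrow\quad k_1 = k_2 = 0,$$
so ##(1,0)## and ##(0,1)## are linearly independent.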

How do you determine if a set of vectors is linearly independent?

To determine if a set of vectors is linearly independent, you can use a few different methods. One way is to create a matrix with the vectors as its columns and perform row operations; if the matrix reduces to one with a pivot in every column (the identity matrix, in the square case), the vectors are linearly independent. Another way, when the matrix is square, is to check whether its determinant is nonzero; if it is, the vectors are linearly independent.
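As a practical sketch (an editorial addition; the rank test used here is equivalent to the row-reduction test and, unlike the determinant, also applies when there are fewer vectors than the dimension of the space), a hypothetical helper might look like this:

```python
import numpy as np

def is_linearly_independent(vectors):
    """Return True if the given vectors are linearly independent."""
    M = np.column_stack(vectors)                    # vectors become the columns of M
    return np.linalg.matrix_rank(M) == len(vectors)

# a+b, b+c, a+c for the standard basis a, b, c of R^3
print(is_linearly_independent([[1, 1, 0], [0, 1, 1], [1, 0, 1]]))  # True
print(is_linearly_independent([[1, 2], [2, 4]]))                   # False
```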

Why is linear independence important?

Linear independence is important because it allows us to describe and understand vector spaces. It also helps us to solve systems of linear equations and perform operations on matrices. It is a fundamental concept in linear algebra and is used in many areas of science and mathematics.

Can a set of vectors be linearly independent in one vector space but not in another?

Yes, in a limited sense: linear independence depends on the field of scalars over which the vector space is defined. For example, ##1## and ##\sqrt{2}## are linearly independent in ##\mathbb{R}## viewed as a vector space over ##\mathbb{Q}##, but linearly dependent when ##\mathbb{R}## is viewed as a vector space over itself. Within a fixed vector space, however, independence does not depend on the choice of basis, and a set that is linearly independent in a subspace remains linearly independent in any larger space containing it.

How is linear independence related to linear dependence?

Linear independence and linear dependence are opposite concepts. If a set of vectors is linearly independent, it means that none of the vectors can be expressed as a linear combination of the others. Conversely, if a set of vectors is linearly dependent, it means that at least one vector can be written as a linear combination of the others. In other words, a set of vectors is linearly dependent if it is not linearly independent.
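A small concrete example of dependence (an editorial addition): in ##\mathbb{R}^2##,
$$2\,(1,2) - (2,4) = (0,0),$$
so ##(1,2)## and ##(2,4)## are linearly dependent; equivalently, ##(2,4) = 2\,(1,2)## is a linear combination of the other vector.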
