Linearly Dependent Vectors: Exploring Dependence in Vector Spaces

In summary: the conversation works through two questions on linear dependence and row equivalence in vector spaces. The first asks whether three vectors formed from three linearly dependent vectors must themselves be linearly dependent. The second asks whether a set of five vectors must be linearly independent whenever every proper subset of it is linearly independent. The conversation also touches on finding bases for row spaces and solving systems of equations.
  • #1
Chen
I have a couple of questions on this subject that I need help with.

1. Let v1, v2 and v3 be three linearly dependent vectors. Prove or disprove that the following vectors are also linearly dependent:
v1+v2, v1+v3, v2+v3

2. Let S = {v1, v2, v3, v4, v5} be a set of five vectors in a vector space V over a field F. Prove or disprove that if every subset T of S such that T != S is linearly independent, then S is also linearly independent.
 
  • #2
1. Start with the definitions.

2. There probably isn't anything special about 5. Try a simpler problem first.
 
  • #3
Hurkyl said:
1. Start with the definitions.

2. There probably isn't anything special about 5. Try a simpler problem first.
I really don't know what to do with 1. If I did I wouldn't come here and ask about it.

This is a good counterexample for the 2nd question, right?
(1, 0, 0, 0)
(0, 1, 0, 0)
(0, 0, 1, 0)
(0, 0, 0, 1)
(1, 1, 1, 1)
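
As a quick sanity check of that counterexample, one can row-reduce over exact rationals and compare ranks. This is a small Python sketch of my own, not part of the original post; the `rank` helper is hand-rolled Gaussian elimination:

```python
from fractions import Fraction
from itertools import combinations

def rank(rows):
    """Rank of a matrix via Gaussian elimination over exact rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0])):
        # find a pivot row for this column at or below row r
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

S = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1), (1, 1, 1, 1)]

# S itself: 5 vectors but rank only 4, so S is linearly dependent
print(rank(S))  # 4

# every 4-element subset has rank 4, i.e. is independent
# (and then so is every smaller subset)
print(all(rank(list(c)) == 4 for c in combinations(S, 4)))  # True
```

So all proper subsets are independent while S itself is dependent, which is exactly what the counterexample needs.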

Also there's another problem I need help with.
[tex]A = \left(\begin{array}{ccc}1&1&1\\1&1&1\\a&b&c\end{array}\right)[/tex]
Under which conditions on a, b and c are A and A^t row equivalent?
I tried bringing both matrices to their canonical form (echelon form), but I get lost with all the different cases.
[tex]A^t = \left(\begin{array}{ccc}1&1&a\\1&1&b\\1&1&c\end{array}\right)[/tex]
 
  • #4
1. And I'm telling you what to do... start with the definitions. :tongue:

2. Your counterexample looks good.

3. What do you mean by row equivalent? That you can convert between A and A' by row operations?
 
  • #5
Hurkyl said:
3. What do you mean by row equivalent? That you can convert between A and A' by row operations?
Yup. Our hunch is that a=b=1, and c is free to be whatever it wishes to be, except for a dog or a raccoon maybe.
 
  • #6
Hurkyl said:
1. And I'm telling you what to do... start with the definitions. :tongue:
Honestly, that's the first thing I did. Here's what I did and where I got stuck.

We know that:
[tex]\alpha V_1 + \beta V_2 + \gamma V_3 = 0[/tex]
And not all of the scalars are zero.

We need to show that:
[tex]x(V_1 + V_2) + y(V_2 + V_3) + z(V_1 + V_3) = 0[/tex]
[tex](x + z)V_1 + (x + y)V_2 + (y + z)V_3 = 0[/tex]
When not all scalars are zero.

So what I did was create this system of equations:
[tex]\alpha = x + z[/tex]
[tex]\beta = x + y[/tex]
[tex]\gamma = y + z[/tex]
And then I got stuck.

I also tried saying that [tex]V_3 = aV_1 + bV_2[/tex] and didn't get much further.
 
  • #7
Ok, so you already figured out the part I was hinting at. :tongue: You're just stuck on the easy part -- all you have to do is show whether or not that system of equations has a solution, right?
 
  • #8
3. It might help to look for bases for the row spaces of A and A'.
 
  • #9
Hurkyl said:
Ok, so you already figured out the part I was hinting at. :tongue: You're just stuck on the easy part -- all you have to do is show whether or not that system of equations has a solution, right?
Ok, I solved and got:
[tex]x = \frac{\alpha + \beta - \gamma}{2}[/tex]
[tex]y = \frac{-\alpha + \beta + \gamma}{2}[/tex]
[tex]z = \frac{\alpha - \beta + \gamma}{2}[/tex]
And now how do I use the fact that at least one of the scalars is different than zero? :cry:

Oh wait... I'm done, aren't I?
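
A quick check (my own Python sketch, not from the thread) that the system always has a solution, whatever α, β, γ are, which is exactly what the argument needs:

```python
from fractions import Fraction
import random

random.seed(0)
for _ in range(100):
    alpha, beta, gamma = (Fraction(random.randint(-9, 9)) for _ in range(3))
    # adding all three equations: 2(x + y + z) = alpha + beta + gamma
    s = (alpha + beta + gamma) / 2  # s = x + y + z
    # subtract each original equation from s to isolate one unknown
    x, y, z = s - gamma, s - alpha, s - beta
    assert x + z == alpha and x + y == beta and y + z == gamma
```

Since a solution (x, y, z) exists for every (α, β, γ), and the pairwise sums of x, y, z recover α, β, γ, not-all-zero scalars on the old vectors give not-all-zero scalars on the new ones.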

Hurkyl said:
3. It might help to look for bases for the row spaces of A and A'.
Yeah, but we haven't studied bases yet, so I can't use them to prove anything in the homework... not that I would know how to find them anyway. :tongue2: What I did try was to show that the spans of the rows of the two matrices are equal, but that road got too complicated as well.
 
  • #10
1. Sure looks like it!

3. Okay... maybe try starting simple. You know a few vectors in the row space of A... how can you make them out of the rows of A'?
 
  • #11
Hurkyl said:
3. Okay... maybe try starting simple. You know a few vectors in the row space of A... how can you make them out of the rows of A'?
Hmm... Let me think about that a little while (i.e. tomorrow, when I'm more awake!). :smile:

In the meantime, I solved another question, but I'm not sure the solution is correct. Would you be so kind as to verify it?

Let A and B be non-zero matrices such that AB = 0. Show that if C is row equivalent to A, then there exists a non-zero matrix D such that CD = 0.

What I did was say that since C is row equivalent to A we can write it as:
[tex]C = E_n...E_2E_1A[/tex]
(where each Ei is an elementary matrix)
So if we multiply both sides of the equation on the right by B we get:
[tex]CB = E_n...E_2E_1AB = E_n...E_2E_1(0) = 0[/tex]
So I found a D... and it even equals B! :tongue: Is this correct?
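
A small concrete illustration of the argument (matrices of my own choosing, not from the problem):

```python
def matmul(P, Q):
    """Plain-Python matrix product."""
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))]
            for i in range(len(P))]

A = [[1, 1], [2, 2]]
B = [[1, -1], [-1, 1]]      # nonzero, chosen so that AB = 0
assert matmul(A, B) == [[0, 0], [0, 0]]

# C is row equivalent to A: swap the rows, then add 3*(new row 1) to row 2
C = [[2, 2], [1 + 3 * 2, 1 + 3 * 2]]     # = [[2, 2], [7, 7]]
assert matmul(C, B) == [[0, 0], [0, 0]]  # D = B already works
```

Row operations only mix the rows of A, and every row of A kills B on the right, so CB = 0 just as the elementary-matrix argument predicts.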
 
  • #12
4. Seems reasonable.
 
  • #13
Hurkyl said:
1. Sure looks like it!

3. Okay... maybe try starting simple. You know a few vectors in the row space of A... how can you make them out of the rows of A'?
Right, I see what you mean. I need to be able to express every row in A as a linear combination of the rows of A', yes?

So for example:
[a b c] = x[1 1 a] + y[1 1 b] + z[1 1 c]
Which means that a=b, right? But what about c?
 
  • #14
Ok, so I know that a=b. Now I did the same in the other direction, expressing a row of A' as a combination of the rows of A:

[1 1 c] = x[1 1 1] + y[a b c] = x[1 1 1] + y[a a c]

And I got a system of equations:
1 = x + ya
1 = x + ya
c = x + yc

Row reducing the augmented matrix of the system gives:
[tex]\left(\begin{array}{cc|c}1&a&1\\0&c-a&c-1\end{array}\right)[/tex]
Can you please help with that? I need to find when the system has at least one solution. I think the condition is: if c != 1 then we need c != a, and if c = 1 then any a works.
Am I correct?
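
One way to double-check that case split (a Python sketch of my own, not from the thread) is to solve the reduced system exactly over a grid of integer (a, c) values:

```python
from fractions import Fraction

def solvable(a, c):
    """Does the system  x + y*a = 1,  x + y*c = c  have a solution?"""
    if c != a:
        # subtracting the equations: y*(c - a) = c - 1, so y is determined
        y = Fraction(c - 1, c - a)
        x = 1 - y * a
        assert x + y * a == 1 and x + y * c == c  # verify the solution
        return True
    # c == a: the equations read x + y*c = 1 and x + y*c = c,
    # which are compatible only when c = 1
    return c == 1

# matches the claimed condition: solvable iff c != a or c = 1
print(all(solvable(a, c) == (c != a or c == 1)
          for a in range(-4, 5) for c in range(-4, 5)))  # True
```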
 
  • #15
Sounds reasonable.
 
  • #16
Hurkyl said:
Sounds reasonable.
Of course it does, I wouldn't do anything unreasonable now would I? :approve: I just want to make sure it's also correct.
 

1. What does it mean for vectors to be linearly dependent?

Linear dependence means that at least one vector in a set can be expressed as a linear combination of the other vectors in the set. Equivalently, some nontrivial linear combination of the vectors (one with not all coefficients zero) equals the zero vector.

2. How can I determine if a set of vectors is linearly dependent or independent?

To determine whether a set of vectors is linearly dependent, form the matrix whose rows are the vectors and use Gaussian elimination (row reduction) to find its rank. If the rank is less than the number of vectors, the vectors are linearly dependent. If the rank equals the number of vectors, they are linearly independent.
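
For the square case (n vectors in Fⁿ) the rank test reduces to a determinant check; here is a minimal sketch for three vectors in 3-space (an illustration, with example vectors chosen for this answer):

```python
def det3(u, v, w):
    """3x3 determinant of the matrix with rows u, v, w (cofactor expansion).
    It is zero exactly when the three vectors are linearly dependent."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
          - u[1] * (v[0] * w[2] - v[2] * w[0])
          + u[2] * (v[0] * w[1] - v[1] * w[0]))

# dependent: the second vector is twice the first
print(det3((1, 2, 3), (2, 4, 6), (0, 0, 1)))  # 0

# independent: the standard basis
print(det3((1, 0, 0), (0, 1, 0), (0, 0, 1)))  # 1
```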

3. Can a set of only two vectors be linearly dependent?

Yes, a set of only two vectors can be linearly dependent. This occurs when one vector is a scalar multiple of the other, i.e. when both vectors lie on the same line through the origin.

4. How are linearly dependent vectors used in real-world applications?

Linearly dependent vectors are commonly used in fields such as physics, engineering, and computer science to represent and solve systems of equations. They are also used in machine learning and data analysis to identify and remove correlated features from a dataset.

5. Is it possible for a set of vectors to be both linearly dependent and independent?

No, it is not possible for a set of vectors to be both linearly dependent and independent. A set of vectors can only be either linearly dependent or independent, but not both at the same time.
