Proving a set of vectors is linearly independent

  • #1
BillhB
Post moved from the technical math forums, so no HH Template is shown.
Hey all, student100's brother here; he got me to create an account to ask my question here...

I'm taking an introduction to linear algebra class and we had a test problem to prove a set of vectors is linearly independent: ##V = v_1, \dots, v_n \in \Bbb R^n## such that the elements of the set are orthogonal to each other, or ## v_1 \cdot (v_2,\dots, v_n)= 0##.

Now I realize I could have just used that to prove linear independence by adding c's in front of each term and following the logic, but on the test the first thing that came to mind was to suppose another set ##W= v_n, \dots, v_1 \in \Bbb R^n## and then show that $$C_1V+C_2W=0$$ $$(C_1V+C_2W)\cdot V=0\cdot V$$ $$C_1V \cdot V+C_2W \cdot V=0$$ $$C_1V^2+0=0$$
Therefore ##C_1 = 0##. I then did the same thing with ##C_2## to show ##C_1=C_2=0## and that ##V## is linearly independent. What's wrong with this proof? The professor said it wasn't general enough and, to be fair, didn't take off full credit because he said I had the right idea. I just don't understand the "not general enough" part. I've never taken a proof-based math course, so as you can imagine it's not going so hot.

Hopefully this makes sense.

The book we're using, if that helps, is Anton's Linear Algebra, 7th edition.
 
  • #2
The notation you are using is unclear.
What are ##C_1## and ##C_2##?
What does ##C_1W## mean? You are multiplying something unknown by a set of vectors ##W##. Multiplication is not defined for sets of vectors.
Similarly, you seem to be taking dot products between sets of vectors ##V## and ##W##, which is undefined.
 
  • #3
andrewkirk said:
The notation you are using is unclear.
What are ##C_1## and ##C_2##?
What does ##C_1W## mean? You are multiplying something unknown by a set of vectors ##W##. Multiplication is not defined for sets of vectors.
Similarly, you seem to be taking dot products between sets of vectors ##V## and ##W##, which is undefined.

Hmm, ##C_n## is just some arbitrary constant. ##C_nW## is that constant multiplied onto every element of the set, and it's the same with the dot products, so ##C_2W = C_2(v_n,\dots,v_1)##; likewise ##W \cdot V## would be ##v_n \cdot v_1 + \dots + v_1 \cdot v_n##. Sorry for being ambiguous with the notation; I'm just learning what can and cannot be done myself. Thanks for the reply.
 
  • #4
Is what you are trying to prove the following?

Given a set ##V=\{v_1,...,v_n\}## of vectors in ##\mathbb{R}^n## that is pairwise orthogonal, prove that the set is linearly independent.

Also, why does the set ##W## contain the same vectors as the set ##V##, but just listed in reverse order? Sets are not ordered, so that means that ##W## is the same set as ##V##. Is that what you wanted?

A general difficulty is that your proof contains no explanation of the lines written. You need to justify each step, if it is not obvious. I suggest you try to write out your proof more clearly, with explanations of what each item is and justification of each step. It will then be easier to critique and improve it.
 
  • #5
andrewkirk said:
Is what you are trying to prove the following?

Given a set ##V=\{v_1,...,v_n\}## of vectors in ##\mathbb{R}^n## that is pairwise orthogonal, prove that the set is linearly independent.

Also, why does the set ##W## contain the same vectors as the set ##V##, but just listed in reverse order? Sets are not ordered, so that means that ##W## is the same set as ##V##. Is that what you wanted?

A general difficulty is that your proof contains no explanation of the lines written. You need to justify each step, if it is not obvious. I suggest you try to write out your proof more clearly, with explanations of what each item is and justification of each step. It will then be easier to critique and improve it.

Your question is basically the exam question verbatim, and I just listed ##W## in reverse order to save time on the exam so I could do the operations with the sets; it's the exact same set as ##V##. I didn't realize how much I was abusing notation by doing that.

I'll rewrite my logic with explanations of what I was thinking to help clear things up:

So the set ##V## should be linearly independent if the vector equation ##C_1(v_1+ \dots+ v_n)=0## has only the trivial solution ##C_1 = 0##.

So, using what I hope is proper notation, I decided to start by supposing: $$C_1(v_1 + \dots + v_n)+C_2(v_n + \dots + v_1) =0$$
Then I decided to take the Euclidean dot product on both sides using the elements of ##V##: $$ (C_1(v_1+ \dots +v_n)+ C_2(v_n + \dots + v_1)) \cdot (v_1 + \dots + v_n) = 0 \cdot (v_1 + \dots + v_n)$$
So that, when simplified, yields: $$C_1(v_1 \cdot v_1+ \dots + v_n \cdot v_n)+ C_2 (v_n \cdot v_1 + \dots + v_1 \cdot v_n) = 0$$
Or, since the elements of the set are pairwise orthogonal: $$C_1(v_1^2+ \dots + v_n^2) = 0$$
Or: $$ C_1 = 0$$

I then followed the same sort of logic and found ##C_2 = 0##, so ##C_1 = C_2 = 0##. I don't know if this was even needed for the proof, though, since ##C_1## must be zero and the only solution is the trivial one, so ##V## must be linearly independent.

I'm a premed major, so this is the first real math course I've taken. Before, it's been more or less just calculating things, which is easy, and I've never really paid much attention to notation or math definitions/theorems before. If I've still butchered this, let me know.

His explanation in class was almost the same thing, but he used ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## then took the inner product of each term ##<k_1v_1, (v_1+ \dots + v_n)>## to show each ##k## was zero if I recall correctly.

Is it not general enough because I didn't show that different multiples of each vector might result in a solution other than the trivial one? Or is it just blatantly wrong, and I lucked out he gave me any points for it?
 
  • #6
To start with ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## is correct. Next what does it mean, if ##v_i## and ##v_j## are orthogonal? The product of two vectors is normally written ##<v_i , v_j>##. Use the fact that it is bilinear.
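
A minimal sketch of where that hint leads, assuming the ##v_i## are pairwise orthogonal and nonzero: take the inner product of both sides with a fixed ##v_j##, $$\Big< \sum_{i=1}^n k_iv_i ,\ v_j \Big> = \sum_{i=1}^n k_i<v_i, v_j> = k_j<v_j, v_j> = 0,$$ since ##<v_i, v_j> = 0## whenever ##i \neq j##. Because ##<v_j, v_j> \, > 0## for a nonzero ##v_j##, each ##k_j## must be zero.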
 
  • #7
fresh_42 said:
To start with ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## is correct. Next what does it mean, if ##v_i## and ##v_j## are orthogonal? The product of two vectors is normally written ##<v_i , v_j>##. Use the fact that it is bilinear.

In ##\Bbb R^n## it means they're at right angles, and that the Euclidean inner product is 0. I know you can get ##k_1=k_2= \dots =k_n =0## from that to show they're independent; I'm just kind of curious where my own proof breaks down.
 
  • #8
I'm not sure what you mean by ##C_1 (v_1, ... , v_n)##. A matrix? And where have the ##C_2 v^2_i## terms gone? The multiplication is distributive! And why do you want to complicate things with your very own notations and calculations if you already know the short and correct solution?
 
  • #9
fresh_42 said:
I'm not sure what you mean by ##C_1 (v_1, ... , v_n)##. A matrix? And where have the ##C_2 v^2_i## terms gone? The multiplication is distributive! And why do you want to complicate things with your very own notations and calculations if you already know the short and correct solution?

Because I wanted to see where I went wrong and what part was faulty. I don't want to just remember the correct answer; I don't feel like I've learned anything by doing that. Haven't you ever analyzed where you went wrong?

For the first question you asked, I meant to use ##C_1(v_1 + \dots + v_n)## so that ##C_1v_1 + C_1v_2 + \dots +C_1v_n##; I edited the post to reflect that. The ##C_2v_nv_i## terms drop out because ##v_n \cdot v_1 = 0##, and likewise for the additional ##n## terms. The ##C_1v^2_i## terms remain because I wanted to show that the equation has only the trivial solution precisely when ##C_1## is zero.

Not trying to complicate things at all; I made up notation because I don't know what I'm doing. It's my first class where notation is important, and there has been so much notation that I'm not quite sure how to use it properly. That's one of the reasons I'm trying to understand the fallacies in the argument: to learn.
 
  • #10
BillhB said:
So the set ##V## should be linearly independent if the vector equation ##C_1(v_1+ \dots+ v_n)=0## has only the trivial solution ##C_1 = 0##.
That is not the correct condition for linear independence. The condition is that ##\sum_{k=1}^n C_n v_n=0## if and only if all the ##C_k## are zero. You need to work with ##n## constants, not only one.
 
  • #11
andrewkirk said:
That is not the correct condition for linear independence. The condition is that ##\sum_{k=1}^n C_n v_n=0## iff all the ##C_k## are zero. You need to work with ##n## constants, not only one.

Yeah, after redoing it here, that's what I thought. Thanks! Makes a lot more sense to me now.
 
  • #12
I am going to suggest a fresh start.
First try to prove that if a vector ##v## is orthogonal to all the vectors in the set ##u_1,...,u_n##, then it is orthogonal to any linear combination of those vectors (start with ##n=2## and then reason about how it must apply to any finite number ##n##).

Then show that if there is a non-trivial (not all coefficients zero) linear combination of the vectors ##v_1,...,v_n## equal to zero, then one of the vectors can be expressed as a linear combination of the others.

Putting those two together, you can then prove that, if the set is not independent, one of the vectors will be orthogonal to itself, which is easily proved to be impossible for a nonzero vector.
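
For the first step, a minimal sketch with ##n=2##, using bilinearity of the inner product: if ##<v, u_1> = <v, u_2> = 0##, then for any scalars ##a_1, a_2##, $$<v,\ a_1u_1 + a_2u_2> = a_1<v, u_1> + a_2<v, u_2> = 0.$$ The same expansion extends to any finite ##n##.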
 
  • #13
andrewkirk said:
That is not the correct condition for linear independence. The condition is that ##\sum_{k=1}^n C_n v_n=0## if and only if all the ##C_k## are zero. You need to work with ##n## constants, not only one.
The equation should be ##\sum_{k=1}^n C_k v_k=0##...
 
  • #14
When you are trying to prove something, use the definitions and theorems as much as you can. If you try to make a your own proof, you need to be very clear and critical about every statement you make. It is a skill that may be very frustrating at first, but it eventually becomes second nature. Take a hard look at your original "proof" and I am sure you can find some weaknesses in it. Then take a similar look at the recommended proof. You will see the difference.
 
  • #15
As alluded to by andrewkirk, the statement you are trying to prove is actually false as stated; i.e., the correct statement is that a set of mutually orthogonal, nonzero vectors is independent. For example, the vectors in the set ##\{0, v\}## are pairwise orthogonal for any nonzero ##v##, but the set is dependent.
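
Concretely, ##\{0, v\}## fails the definition because $$1 \cdot 0 + 0 \cdot v = 0$$ is a linear combination equal to zero whose coefficients are not all zero.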
 

1. How do you prove that a set of vectors is linearly independent?

To prove that a set of vectors is linearly independent, you need to show that none of the vectors in the set can be written as a linear combination of the other vectors. Equivalently, the only way to multiply each vector by a scalar and add them together to get the zero vector is to choose every scalar equal to zero.
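
For a quick numerical illustration (a minimal numpy sketch with made-up example vectors, not part of the thread's proof), stack the vectors as the columns of a matrix and check its rank; full rank means only the trivial combination gives zero:

```python
import numpy as np

# Three pairwise orthogonal, nonzero vectors in R^3 (made-up example).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 2.0, 0.0])
v3 = np.array([0.0, 0.0, -3.0])

# Pairwise dot products are all zero, as in the thread's hypothesis.
assert v1 @ v2 == 0 and v1 @ v3 == 0 and v2 @ v3 == 0

# Stack as columns; rank 3 == number of vectors => linearly independent.
A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))  # prints 3
```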

2. What is the definition of linear independence?

A set of vectors is linearly independent if none of the vectors can be expressed as a linear combination of the others. In other words, no vector in the set is redundant and each vector adds unique information.

3. How do you determine if a set of two vectors is linearly independent?

If you have a set of two vectors in ##\Bbb R^2##, you can use the determinant method to determine linear independence. Set up a determinant with the two vectors as columns. If the determinant is non-zero, then the vectors are linearly independent. If the determinant is zero, then the vectors are linearly dependent.
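
As a minimal sketch of that determinant check (made-up vectors in ##\Bbb R^2##, again in numpy):

```python
import numpy as np

# Two made-up vectors in R^2, placed as the columns of a 2x2 matrix.
u = np.array([2.0, 1.0])
w = np.array([1.0, 3.0])

# det = 2*3 - 1*1 = 5, which is nonzero => u and w are independent.
det = np.linalg.det(np.column_stack([u, w]))
print(det)  # ~5.0
```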

4. Is the zero vector considered linearly independent?

No, any set that contains the zero vector is linearly dependent. This is because ##1 \cdot 0 = 0## is already a linear combination with a nonzero coefficient that equals the zero vector, so the zero vector is always redundant in a set of vectors.

5. Can a set of linearly dependent vectors be linearly independent?

No, a set of vectors cannot be both linearly dependent and independent. If a set of vectors is linearly dependent, it means that at least one vector can be written as a linear combination of the others. Therefore, it is impossible for the set to also be linearly independent.
