Proving a set of vectors is linearly independent

Summary:
The discussion revolves around proving the linear independence of a set of orthogonal vectors in ##\Bbb R^n##. The original poster attempted a proof by manipulating a reversed set of vectors, but faced criticism for unclear notation and for failing to show that all coefficients in a linear combination must be zero. Key points include the need for a proper definition of linear independence, which requires demonstrating that the only solution to the equation involving all vectors is the trivial one. Additionally, it was highlighted that while nonzero orthogonal vectors are independent, a set containing the zero vector is dependent despite being pairwise orthogonal. The conversation emphasizes the importance of clear notation and logical reasoning in mathematical proofs.
BillhB
Post moved from the technical math forums, so no HH Template is shown.
Hey all, student100's brother here, he got me to create an account to ask my question here...

I'm taking an introduction to linear algebra class, and we had a test problem to prove a set of vectors is linearly independent: ##V = v_1, \dots, v_n \in \Bbb R^n## such that each element of the set is orthogonal to every other, or ## v_1 \cdot (v_2,\dots, v_n)= 0##.

Now I realize I could have just used that to prove linear independence by adding c's in front of each term and following the logic, but on the test the first thing that came to mind was to suppose another set ##W= v_n, \dots, v_1 \in \Bbb R^n## and then show that $$C_1V+C_2W=0$$ $$(C_1V+C_2W)\cdot V=0\cdot V$$ $$C_1V \cdot V+C_2W \cdot V=0$$ $$C_1V^2+0=0$$
Therefore ##C_1 = 0##. I then did the same thing with ##C_2## to show ##C_1=C_2=0## and that ##V## is linearly independent. What's wrong with this proof? The professor said it wasn't general enough, and to be fair he didn't take off full credit because he said I had the right idea. I just don't understand the "not general enough" part. I've never taken a proof-based math course, so as you can imagine it's not going so hot.

Hopefully this makes sense.

The book we're using, if that helps, is Anton's linear algebra, 7th edition.
 
The notation you are using is unclear.
What are ##C_1## and ##C_2##?
What does ##C_1W## mean? You are multiplying something unknown by a set of vectors ##W##. Multiplication is not defined for sets of vectors.
Similarly, you seem to be taking dot products between sets of vectors ##V## and ##W##, which is undefined.
 
andrewkirk said:
The notation you are using is unclear.
What are ##C_1## and ##C_2##?
What does ##C_1W## mean? You are multiplying something unknown by a set of vectors ##W##. Multiplication is not defined for sets of vectors.
Similarly, you seem to be taking dot products between sets of vectors ##V## and ##W##, which is undefined.

Hmm, ##C_n## is just some arbitrary constant. ##C_nW## is that constant multiplied onto every element of the set, and it's the same with the dot products, so ##C_2W = C_2(v_n,\dots,v_1)##; likewise ##W \cdot V## would be ##v_n \cdot v_1 + \dots + v_1 \cdot v_n##. Sorry for being ambiguous with the notation; I'm just learning what can and cannot be done myself. Thanks for the reply.
 
Is what you are trying to prove the following?

Given a set ##V=\{v_1,...,v_n\}## of vectors in ##\mathbb{R}^n## that is pairwise orthogonal, prove that the set is linearly independent.

Also, why does the set ##W## contain the same vectors as the set ##V##, but just listed in reverse order? Sets are not ordered, so that means that ##W## is the same set as ##V##. Is that what you wanted?

A general difficulty is that your proof contains no explanation of the lines written. You need to justify each step, if it is not obvious. I suggest you try to write out your proof more clearly, with explanations of what each item is and justification of each step. It will then be easier to critique and improve it.
 
andrewkirk said:
Is what you are trying to prove the following?

Given a set ##V=\{v_1,...,v_n\}## of vectors in ##\mathbb{R}^n## that is pairwise orthogonal, prove that the set is linearly independent.

Also, why does the set ##W## contain the same vectors as the set ##V##, but just listed in reverse order? Sets are not ordered, so that means that ##W## is the same set as ##V##. Is that what you wanted?

A general difficulty is that your proof contains no explanation of the lines written. You need to justify each step, if it is not obvious. I suggest you try to write out your proof more clearly, with explanations of what each item is and justification of each step. It will then be easier to critique and improve it.

Your question is basically the exam question verbatim, and I just listed ##W## in reverse order to save time on the exam so I could do the operations with the sets; it's the same exact set as ##V##. I didn't realize how much I was abusing notation by doing that.

I'll rewrite my logic with explanations of what I was thinking to help clear things up:

So the set ##V## should be linearly independent if the vector equation ##C_1(v_1+ \dots+ v_n)=0## has only the trivial solution, ##C_1 = 0##.

So, using what I hope is proper notation, I decided to start by supposing: $$C_1(v_1 + \dots + v_n)+C_2(v_n + \dots + v_1) =0$$
Then I took the Euclidean dot product on both sides using the elements of ##V##: $$ (C_1(v_1+ \dots +v_n)+ C_2(v_n + \dots + v_1)) \cdot (v_1 + \dots + v_n) = 0 \cdot (v_1 + \dots + v_n)$$
Which, when simplified, yields: $$C_1(v_1 \cdot v_1+ \dots + v_n \cdot v_n)+ C_2 (v_n \cdot v_1 + \dots + v_1 \cdot v_n) = 0$$
Or, since the elements of the set are pairwise orthogonal: $$C_1(v_1^2+ \dots + v_n^2) = 0$$
Or: $$ C_1 = 0$$

I then followed the same sort of logic and found ##C_2 = 0## and ##C_1 = C_2 = 0##. I don't know if this was even needed for the proof, though, since ##C_1## must be zero and the only solution is the trivial one, so ##V## must be linearly independent.

I'm a premed major, so this is the first real math course I've taken. Before, it's been more or less just calculating things, which is easy, and I've never really paid much attention to notation or math definitions/theorems before. If I've still butchered this, let me know.

His explanation in class was almost the same thing, but he used ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## and then took the inner product of each term, ##<k_1v_1, (v_1+ \dots + v_n)>##, to show each ##k## was zero, if I recall correctly.

Is it not general enough because I didn't show that different multiples of each vector might result in a solution other than the trivial one? Or is it just blatantly wrong, and I lucked out that he gave me any points for it?
 
To start with, ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## is correct. Next, what does it mean if ##v_i## and ##v_j## are orthogonal? The product of two vectors is normally written ##<v_i , v_j>##. Use the fact that it is bilinear.
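Spelled out, the hint goes like this (a sketch, assuming each ##v_j## is nonzero; see the last post in the thread for why that assumption is needed). Take the inner product of both sides of ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## with a fixed ##v_j##. Bilinearity gives $$\left< \sum_{i=1}^n k_i v_i,\ v_j \right> = \sum_{i=1}^n k_i \left< v_i, v_j \right> = k_j \left< v_j, v_j \right> = 0,$$ since every cross term ##\left< v_i, v_j \right>## with ##i \neq j## vanishes by orthogonality. Because ##\left< v_j, v_j \right> = \|v_j\|^2 > 0## for a nonzero vector, it follows that ##k_j = 0##, and ##j## was arbitrary.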
 
fresh_42 said:
To start with, ##k_1v_1 + k_2v_2 + \dots + k_nv_n = 0## is correct. Next, what does it mean if ##v_i## and ##v_j## are orthogonal? The product of two vectors is normally written ##<v_i , v_j>##. Use the fact that it is bilinear.

In ##\Bbb R^n## it means they're at right angles, and that the Euclidean inner product is 0. I know you can get ##k_1=k_2= \dots =k_n =0## from that to show they're independent; I'm just kind of curious where my own proof breaks down.
 
I'm not sure what you mean by ##C_1 (v_1, ... , v_n)##. A matrix? And where have the ##C_2 v^2_i## terms gone? The multiplication is distributive! And why do you want to complicate things with your very own notations and calculations if you already know the short and correct solution?
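To make that objection concrete (writing ##S = v_1 + \dots + v_n## for the sum, a notation introduced here only for this remark): since ##W## is the same set as ##V##, both terms involve the same vector ##S##, and distributing the dot product gives $$\left< C_1 S + C_2 S,\ S \right> = (C_1 + C_2)\left< S, S \right> = (C_1 + C_2)\sum_{i=1}^n \|v_i\|^2,$$ because the cross terms ##\left< v_i, v_j \right>## with ##i \neq j## vanish by orthogonality. Setting this to zero therefore only yields ##C_1 + C_2 = 0##, not ##C_1 = C_2 = 0##: the ##C_2\|v_i\|^2## terms do not drop out.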
 
fresh_42 said:
I'm not sure what you mean by ##C_1 (v_1, ... , v_n)##. A matrix? And where have the ##C_2 v^2_i## terms gone? The multiplication is distributive! And why do you want to complicate things with your very own notations and calculations if you already know the short and correct solution?

Because I wanted to see where I went wrong, what part was faulty. I don't want to just remember and understand the correct answer, I don't feel like I've learned anything by doing that. Haven't you ever analyzed where you went wrong?

For the first question you asked, I meant to use ##C_1(v_1 + \dots + v_n)##, so that it expands to ##C_1v_1 + C_1v_2 + \dots +C_1v_n##; I've edited the post to reflect that. The ##C_2 v_n \cdot v_i## terms drop out because ##v_n \cdot v_1 = 0##, and likewise for the additional ##n## terms. The ##C_1v^2_i## terms remain because I wanted to show that only by ##C_1## being zero does the equation contain only the trivial solution.

Not trying to complicate things at all; I made up notation because I don't know what I'm doing. It's my first class where notation is important, and there has been so much notation I'm not quite sure how to use it properly. That's one of the reasons I'm trying to understand the fallacies in the argument, to learn.
 
BillhB said:
So the set ##V## should be linear independent if the vector equation ##C_1(v_1+ \dots+ v_n)=0##, has only the trivial solution such that ##C_1 = 0##
That is not the correct condition for linear independence. The condition is that ##\sum_{k=1}^n C_n v_n=0## if and only if all the ##C_k## are zero. You need to work with ##n## constants, not only one.
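To see why one constant is not enough, here is a small illustrative example (the vectors are made up for the purpose and are not orthogonal, since the point here is the definition itself): take ##v_1=(1,0)##, ##v_2=(0,1)##, ##v_3=(1,1)## in ##\Bbb R^2##. Then $$v_3 = v_1 + v_2,$$ so the set is dependent, yet ##C_1(v_1+v_2+v_3) = C_1(2,2) = 0## still forces ##C_1 = 0##. The one-constant condition can hold even for a dependent set.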
 
andrewkirk said:
That is not the correct condition for linear independence. The condition is that ##\sum_{k=1}^n C_n v_n=0## iff all the ##C_k## are zero. You need to work with ##n## constants, not only one.

Yeah, after redoing it here, that's what I thought. Thanks! Makes a lot more sense to me now.
 
I am going to suggest a fresh start.
First try to prove that if a vector ##v## is orthogonal to all the vectors in the set ##u_1,...,u_n##, then it is orthogonal to any linear sum of those vectors (start with ##n=2## and then reason about how it must apply to any finite number ##n##).

Then show that if there is a non-trivial (not all coefficients zero) linear sum of the vectors ##v_1,...,v_n## equal to zero, then one of the vectors can be expressed as a linear combination of the others.

Putting those two together, you can then prove that, if the set is not independent, one of the vectors will be orthogonal to itself, which is easily proved to be impossible for a nonzero vector.
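In symbols, the combined sketch runs as follows (again assuming all the ##v_i## are nonzero): if the set were not independent, some ##v_j## could be written as ##v_j = \sum_{i \neq j} c_i v_i##. Then, by the first step, $$\left< v_j, v_j \right> = \left< v_j,\ \sum_{i \neq j} c_i v_i \right> = \sum_{i \neq j} c_i \left< v_j, v_i \right> = 0,$$ which is impossible for a nonzero ##v_j##, since ##\left< v_j, v_j \right> = \|v_j\|^2 > 0##.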
 
andrewkirk said:
That is not the correct condition for linear independence. The condition is that ##\sum_{k=1}^n C_n v_n=0## if and only if all the ##C_k## are zero. You need to work with ##n## constants, not only one.
The equation should be ##\sum_{k=1}^n C_k v_k=0##...
 
When you are trying to prove something, use the definitions and theorems as much as you can. If you try to make your own proof, you need to be very clear and critical about every statement you make. It is a skill that may be very frustrating at first, but it eventually becomes second nature. Take a hard look at your original "proof" and I am sure you can find some weaknesses in it. Then take a similar look at the recommended proof. You will see the difference.
 
As alluded to by andrewkirk, the statement you are trying to prove is actually false as stated; i.e., the correct statement is that a set of mutually orthogonal, nonzero vectors is independent. E.g., the vectors in the set ##\{0, v\}## are pairwise orthogonal for any nonzero ##v##, but the set is dependent.
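That counterexample is quick to check numerically. Here is a minimal sketch, assuming NumPy is available; the vector ##v = (3, 4)## is an arbitrary choice:

```python
import numpy as np

# The counterexample above: {0, v} is pairwise orthogonal but dependent.
v = np.array([3.0, 4.0])  # arbitrary nonzero vector
zero = np.zeros(2)        # the zero vector

# Pairwise orthogonality holds: the dot product with 0 is always 0.
print(np.dot(zero, v))    # -> 0.0

# Yet the set is linearly dependent: 1*0 + 0*v = 0 is a nontrivial
# relation (the coefficient on the zero vector is nonzero). Equivalently,
# the matrix with these columns has rank 1 rather than 2.
A = np.column_stack([zero, v])
print(np.linalg.matrix_rank(A))  # -> 1
```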
 
