Linear independence: best methods?

  • Thread starter tamintl
  • #1
Hi,

I was wondering what the quickest way is to check linear independence for a set of vectors in a higher-dimensional space, say ℂ^5?

I have a set of vectors { (1,2,3,4,5), (2,3,4,5,1), (3,4,5,1,2), (4,5,1,2,3) }

Clearly this is linearly dependent, but how could I justify this quickly?

What are the quickest ways to determine whether this set is linearly independent? Reducing to row echelon form takes a long time and it's easy to make mistakes.

Thanks
 

Answers and Replies

  • #2
jambaugh
Science Advisor
Insights Author
Gold Member
You don't have as many vectors here as the dimension of the space, so you can add an auxiliary vector with variables for its components and then take a determinant. In particular, you expand the determinant (cofactor expansion) along this variable row, which gives you, in this case, 5 sub-determinants. They would all have to be zero for the vectors to be linearly dependent. (It is the analog of taking the cross product of two vectors: if they're parallel, the cross product is the zero vector.)

So here:
[tex]
\left|\begin{array}{ccccc}
a & b & c & d & e \\
1 & 2 & 3 & 4 & 5 \\
2 & 3 & 4 & 5 & 1 \\
3 & 4 & 5 & 1 & 2 \\
4 & 5 & 1 & 2 & 3 \end{array}\right| =
a\left|\begin{array}{cccc}
2 & 3 & 4 & 5 \\
3 & 4 & 5 & 1 \\
4 & 5 & 1 & 2 \\
5 & 1 & 2 & 3 \end{array}\right|
- b\left|\begin{array}{cccc}
1 & 3 & 4 & 5 \\
2 & 4 & 5 & 1 \\
3 & 5 & 1 & 2 \\
4 & 1 & 2 & 3 \end{array}\right|
+c\left|\begin{array}{cccc}
1 & 2 & 4 & 5 \\
2 & 3 & 5 & 1 \\
3 & 4 & 1 & 2 \\
4 & 5 & 2 & 3 \end{array}\right|
-d\left|\begin{array}{cccc}
1 & 2 & 3 & 5 \\
2 & 3 & 4 & 1 \\
3 & 4 & 5 & 2 \\
4 & 5 & 1 & 3 \end{array}\right|
+e\left|\begin{array}{cccc}
1 & 2 & 3 & 4 \\
2 & 3 & 4 & 5 \\
3 & 4 & 5 & 1 \\
4 & 5 & 1 & 2 \end{array}\right| [/tex]

If this is zero for all a, b, c, d and e, your vectors are linearly dependent... as soon as you see a non-zero determinant you know you have linear independence. Is this faster and less error-prone? I'm not sure, and it may depend on the case... the problem is intrinsically complex.
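
For illustration (not part of the original reply), here is a minimal sketch of this determinant test in Python, assuming NumPy is available: delete one column at a time, take the resulting 4×4 determinant, and stop as soon as one is nonzero.

[code]
import numpy as np

# The four given vectors as the rows of a 4x5 matrix.
V = np.array([[1, 2, 3, 4, 5],
              [2, 3, 4, 5, 1],
              [3, 4, 5, 1, 2],
              [4, 5, 1, 2, 3]], dtype=float)

# Delete one column at a time and take the 4x4 determinant.
# A single nonzero minor already proves linear independence.
for j in range(V.shape[1]):
    minor = np.delete(V, j, axis=1)
    d = np.linalg.det(minor)
    print(f"minor without column {j}: det = {d:.0f}")
    if abs(d) > 1e-9:
        print("nonzero minor found -> vectors are linearly independent")
        break
else:
    print("all minors are zero -> vectors are linearly dependent")
[/code]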
 
  • #3
tamintl
jambaugh said:
You don't have as many vectors here as the dimension, so you can add an auxiliary vector with variables for its components and take a determinant... [determinant expansion quoted above]
Thanks for your reply.

If I were to take the transpose so it becomes:

({1,2,3,4},{2,3,4,5},{3,4,5,1},{4,5,1,2}, {5,1,2,3})

Would you see any quick way to determine linear independence?

Many thanks
 
  • #4
mathwonk
Science Advisor
Homework Helper
2020 Award
"I have a set of vectors { (1,2,3,4,5), (2,3,4,5,1), (3,4,5,1,2), (4,5,1,2,3) }

Clearly this is linearly dependent but how could I justify this quickly?
"

When I did Gaussian elimination I got that they were independent, which I had thought all along. There is probably a quick way to see this, if correct, given the simple "cyclic" rule of formation.
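
For what it's worth, a quick exact check (a Python sketch with SymPy, not part of the original post) agrees with the elimination: the matrix of the four vectors has rank 4.

[code]
from sympy import Matrix

V = Matrix([[1, 2, 3, 4, 5],
            [2, 3, 4, 5, 1],
            [3, 4, 5, 1, 2],
            [4, 5, 1, 2, 3]])

# Exact (rational) elimination: rank 4 means the four rows are independent.
print(V.rank())        # expected: 4
print(V.rref()[0])     # reduced row echelon form, one pivot per row
[/code]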
 
  • #5
jambaugh
Science Advisor
Insights Author
Gold Member
tamintl said:
If I were to take the transpose so it becomes ({1,2,3,4}, {2,3,4,5}, {3,4,5,1}, {4,5,1,2}, {5,1,2,3}), would you see any quick way to determine linear independence?
Nothing quicker in general than what has been stated so far... determinant test or Gaussian elimination. Transposing doesn't change that fact.

[EDIT] ... except of course you have changed here what "the vectors" are, and thus the question of independence changes. Clearly five 4-dimensional vectors are linearly dependent!
 
Last edited:
  • #6
AlephZero
Science Advisor
Homework Helper
mathwonk said: there is probably a quick way to see this, if correct, given the simple "cyclic" rule of formation.
Indeed there is. Add the vector (5,1,2,3,4) to the set, and you have a circulant matrix: http://en.wikipedia.org/wiki/Circulant_matrix

But the OP probably wasn't expected to know that!

In "real life", for large problems like this you would use numerical methods. Algorithms like the QR decomposition and singular value decomposition (SVD) are quicker, and give more useful information about the dependencies, than Gaussian elimination - but again, you won't meet those in a first course on linear algebra!
 
  • #7
mathwonk
Science Advisor
Homework Helper
2020 Award
Aleph, how does knowing it is a circulant matrix help? Are you plugging into the determinant formula for a circulant matrix, and arguing that it is not zero because it has degree 4 and is not a multiple of the 5th cyclotomic polynomial? That doesn't seem especially quick or intuitive, but maybe it is if you are very used to these.
 
  • #8
AlephZero
Science Advisor
Homework Helper
mathwonk said: how does knowing it is a circulant matrix help? ... that doesn't seem especially quick or intuitive.
As the Wiki link says, the eigenvectors of a circulant matrix are known without doing any calculations, and each eigenvalue can be found with only ##O(n)## operations, from the terms in one row of the matrix and the n'th roots of unity.

So you can find all the eigenvalues in ##O(n^2)## operations, compared with ##O(n^3)## operations to check that an arbitrary matrix is non-singular by elementary row operations to find the determinant, reduce it to echelon form, or whatever variation on that theme you prefer.

Circulant matrices can be singular of course - for example
0 1 0 1
1 0 1 0
0 1 0 1
1 0 1 0.
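
A sketch of that recipe (my own illustration with a hypothetical helper name, assuming the usual convention that the eigenvalues of a circulant matrix are the discrete Fourier transform of its defining row): numpy.fft.fft of the first row gives all n eigenvalues at once, and it flags the 4×4 example above as singular.

[code]
import numpy as np

def circulant_eigenvalues(first_row):
    # The eigenvalues are lambda_j = sum_k c_k * omega**(j*k) with omega a
    # primitive n-th root of unity, i.e. the DFT of the defining row.
    return np.fft.fft(np.asarray(first_row, dtype=float))

# The 5x5 circulant from this thread: all five eigenvalues are nonzero,
# so the matrix is nonsingular and any four of its rows are independent.
print(circulant_eigenvalues([1, 2, 3, 4, 5]))

# The singular example above: two of the four eigenvalues are zero.
print(circulant_eigenvalues([0, 1, 0, 1]))
[/code]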
 
  • #9
mathwonk
Science Advisor
Homework Helper
2020 Award
"So you can find all the eigenvalues in O(n^2) operations,"

That is nice, but it doesn't seem especially "quick" to me. I thought he wanted something you could see by inspection.

Since the corresponding cyclic transformation is given in the usual basis by the companion matrix with columns (for n = 3, say) (0,1,0), (0,0,1), (1,0,0), I was hoping there was some way to recognize a cyclic vector for this matrix, but I don't see one.

E.g. there are non-cyclic vectors that are not eigenvectors: this matrix satisfies X^3 - 1 = 0, but (1,0,-1) is neither cyclic nor an eigenvector, since (1,0,-1), (-1,1,0), (0,-1,1) are dependent.


(I.e. circulant matrices are by definition made up of columns which form orbits of this action on n-space, hence themselves form an n-dimensional space, in which the singular ones have codimension one, forming a hypersurface of degree n: the zeros of the determinant.)

Interesting question, though.
 
Last edited:
  • #10
AlephZero
Science Advisor
Homework Helper
"So you can find all the eigenvalues in O(n^2) operations,"

that is nice, but it doesn't seem especially "quick" to me.
Well, I guess that's the difference between a mathematician and a numerical analyst.

It doesn't make much practical difference either way for one set of data of order 5 - any modern computing device would do it in a microsecond, if you can figure out a way to input the data fast enough!

But for real world sized problems, the difference between ##O(n^2)## and ##O(n^3)## might mean getting an answer in seconds rather than days.
 
  • #11
mathwonk
Science Advisor
Homework Helper
2020 Award
Thank you for the perspective!
 
  • #12
epenguin
Homework Helper
Gold Member
Never very comfortable with linear algebra, but aren't you meant here to use the fact that the difference between most elements in consecutive rows is 1?

Starting from the last row and working up, subtracting the previous row from each row gives

1 2 3 4 5
1 1 1 1 -4
1 1 1 -4 1
1 1 -4 1 1

Further obvious subtractions give e.g.

1 2 3 4 5
1 1 1 1 -4
0 0 0 -5 5
0 0 -5 0 5


There is no way to row-combine so as to make the top left submatrix

1 2
1 1

zero, but perhaps 0 × (row 1) + 0 × (row 2) + λ3 × (row 3) + λ4 × (row 4), the λ not both 0, could be 0?

i.e. that the row vectors of

0 -5 5
-5 0 5

be linearly dependent, but one easily sees they are not.
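
A quick machine check of that last step (a SymPy sketch, not part of the original post): the two leftover rows are indeed independent.

[code]
from sympy import Matrix

# The 2x3 block left after the row operations above.
B = Matrix([[ 0, -5, 5],
            [-5,  0, 5]])
print(B.rank())   # expected: 2, so the two rows are independent
[/code]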
 
Last edited:
