# Linear independence: best methods?

Hi,

I was wondering what the quickest way is to check linear independence for a set of vectors in higher dimensions, say ℂ^5?

I have a set of vectors { (1,2,3,4,5), (2,3,4,5,1), (3,4,5,1,2), (4,5,1,2,3) }

Clearly this is linearly dependent, but how could I justify this quickly?

What are the quickest ways to determine whether this set is linearly independent? Row reduction to echelon form takes a long time, and it's easy to make mistakes.

Thanks

jambaugh
Gold Member
You don't have as many vectors here as the dimension, so you can add an auxiliary vector with variables for its components and then take a determinant. In particular, expand the determinant (cofactor expansion) along this variable row, so that you get, in this case, five 4×4 sub-determinants. They would all have to be zero if the vectors are linearly dependent. (This is the analog of taking the cross product of two vectors: they are parallel exactly when the cross product is the zero vector.)

So here:
$$\left|\begin{array}{ccccc} a & b & c & d & e \\ 1 & 2 & 3 & 4 & 5 \\ 2 & 3 & 4 & 5 & 1 \\ 3 & 4 & 5 & 1 & 2 \\ 4 & 5 & 1 & 2 & 3 \end{array}\right| = a\left|\begin{array}{cccc} 2 & 3 & 4 & 5 \\ 3 & 4 & 5 & 1 \\ 4 & 5 & 1 & 2 \\ 5 & 1 & 2 & 3 \end{array}\right| - b\left|\begin{array}{cccc} 1 & 3 & 4 & 5 \\ 2 & 4 & 5 & 1 \\ 3 & 5 & 1 & 2 \\ 4 & 1 & 2 & 3 \end{array}\right| +c\left|\begin{array}{cccc} 1 & 2 & 4 & 5 \\ 2 & 3 & 5 & 1 \\ 3 & 4 & 1 & 2 \\ 4 & 5 & 2 & 3 \end{array}\right| -d\left|\begin{array}{cccc} 1 & 2 & 3 & 5 \\ 2 & 3 & 4 & 1 \\ 3 & 4 & 5 & 2 \\ 4 & 5 & 1 & 3 \end{array}\right| +e\left|\begin{array}{cccc} 1 & 2 & 3 & 4 \\ 2 & 3 & 4 & 5 \\ 3 & 4 & 5 & 1 \\ 4 & 5 & 1 & 2 \end{array}\right|$$

If this is zero for all a, b, c, d, and e, your vectors are linearly dependent; as soon as you see a non-zero sub-determinant, you know you have linear independence. Is this faster and less error-prone? I'm not sure, and it may depend on the case: the problem is intrinsically complex.
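For a numerical check of this cofactor test, here is a small sketch (using NumPy, which is my choice, not something from the thread); the alternating cofactor signs don't affect the "are they all zero?" question, so plain minors suffice:

```python
import numpy as np

# The four given vectors as the rows of a 4x5 matrix.
V = np.array([
    [1, 2, 3, 4, 5],
    [2, 3, 4, 5, 1],
    [3, 4, 5, 1, 2],
    [4, 5, 1, 2, 3],
])

# The five 4x4 sub-determinants: delete one column at a time.
# (The alternating cofactor signs don't change which of them are zero.)
subdets = [np.linalg.det(np.delete(V, j, axis=1)) for j in range(5)]

# Dependent iff every sub-determinant vanishes; a single non-zero
# sub-determinant already proves independence.
independent = any(abs(d) > 1e-9 for d in subdets)
print(independent)
```

One non-zero minor short-circuits the whole test, which is why this can beat a full row reduction when the vectors are in fact independent.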


If I were to take the transpose so it becomes:

{ (1,2,3,4), (2,3,4,5), (3,4,5,1), (4,5,1,2), (5,1,2,3) }

Would you see any quick way to determine linear independence?

Many thanks

mathwonk
Homework Helper
"I have a set of vectors { (1,2,3,4,5), (2,3,4,5,1), (3,4,5,1,2), (4,5,1,2,3) }

Clearly this is linearly dependent, but how could I justify this quickly?"

When I did Gaussian elimination I got that they were independent, which I had thought all along. There is probably a quick way to see this, if correct, given the simple "cyclic" rule of formation.
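The elimination can be mirrored in a few lines of code; this is only a sketch (the `row_echelon_rank` helper is mine, not from the thread), counting pivots in a forward elimination with partial pivoting:

```python
import numpy as np

def row_echelon_rank(A, tol=1e-9):
    """Count pivots found by forward Gaussian elimination with partial pivoting."""
    A = np.array(A, dtype=float)
    rows, cols = A.shape
    rank = 0
    for col in range(cols):
        if rank == rows:
            break
        # The largest remaining entry in this column becomes the pivot.
        pivot = rank + int(np.argmax(np.abs(A[rank:, col])))
        if abs(A[pivot, col]) < tol:
            continue  # no pivot in this column
        A[[rank, pivot]] = A[[pivot, rank]]
        # Eliminate everything below the pivot.
        A[rank + 1:] -= np.outer(A[rank + 1:, col] / A[rank, col], A[rank])
        rank += 1
    return rank

V = [[1, 2, 3, 4, 5],
     [2, 3, 4, 5, 1],
     [3, 4, 5, 1, 2],
     [4, 5, 1, 2, 3]]
print(row_echelon_rank(V))  # 4 pivots: the four vectors are independent
```

Four pivots for four rows means full row rank, i.e. independence.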

jambaugh
Gold Member

If I were to take the transpose so it becomes:

{ (1,2,3,4), (2,3,4,5), (3,4,5,1), (4,5,1,2), (5,1,2,3) }

Would you see any quick way to determine linear independence?

Many thanks

Nothing quicker in general than what has been stated so far... determinant test or Gaussian elimination. Transposing doesn't change that fact.

[EDIT] ... except of course you have changed here what "the vectors" are, and thus the question of independence changes. Clearly five 4-dimensional vectors are linearly dependent!

Last edited:
AlephZero
Homework Helper
There is probably a quick way to see this, if correct, given the simple "cyclic" rule of formation.

Indeed there is. Add the vector (5,1,2,3,4) to the set, and you have a circulant matrix (http://en.wikipedia.org/wiki/Circulant_matrix).

But the OP probably wasn't expected to know that!

In "real life", for large problems like this you would use numerical methods. Algorithms like the QR decomposition and singular value decomposition (SVD) are quicker, and give more useful information about the dependencies, than Gaussian elimination - but again, you won't meet those in a first course on linear algebra!
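As an illustration of the SVD route (a sketch; the tolerance choice is mine): the number of singular values above a threshold is the rank, and a tiny singular value pinpoints a near-dependency among the rows, which elimination alone does not reveal.

```python
import numpy as np

V = np.array([
    [1, 2, 3, 4, 5],
    [2, 3, 4, 5, 1],
    [3, 4, 5, 1, 2],
    [4, 5, 1, 2, 3],
], dtype=float)

# Singular values come back sorted in decreasing order; the count above
# a relative tolerance is the numerical rank of the row set.
s = np.linalg.svd(V, compute_uv=False)
rank = int(np.sum(s > 1e-9 * s[0]))
print(s, rank)
```

Rank 4 out of 4 rows confirms independence, and the size of the smallest singular value quantifies how far the set is from being dependent.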

mathwonk
Homework Helper
aleph, how does knowing it is a circulant matrix help? are you plugging into the determinant formula for a circulant matrix, and arguing that it is not zero, because it has degree 4 and differs from the 4th cyclotomic polynomial? that doesn't seem especially quick or intuitive, but maybe it is if you are very used to these.

AlephZero
Homework Helper
aleph, how does knowing it is a circulant matrix help? are you plugging into the determinant formula for a circulant matrix, and arguing that it is not zero, because it has degree 4 and differs from the 4th cyclotomic polynomial? that doesn't seem especially quick or intuitive, but maybe it is if you are very used to these.

As the Wiki link says, the eigenvectors of a circulant matrix are known without doing any calculations, and each eigenvalue can be found with only ##O(n)## operations, from the terms in one row of the matrix and the nth roots of unity.

So you can find all the eigenvalues in ##O(n^2)## operations, compared with ##O(n^3)## operations to check that an arbitrary matrix is non-singular by elementary row operations to find the determinant, reduce it to echelon form, or whatever variation on that theme you prefer.
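A sketch of that idea (NumPy's FFT in fact does the whole batch even faster than ##O(n^2)##): the eigenvalues of the circulant with first row c are exactly the DFT coefficients of c.

```python
import numpy as np

# Eigenvalues of a circulant matrix with first row c: the j-th DFT
# coefficient of c, i.e. sum_k c[k] * exp(-2*pi*i*j*k/n).
c = np.array([1, 2, 3, 4, 5])
eigs = np.fft.fft(c)

# The circulant (hence the full set of five shifted vectors) is
# nonsingular iff no eigenvalue is zero.
singular = bool(np.any(np.abs(eigs) < 1e-9))
print(eigs, singular)
```

The j = 0 eigenvalue is just the row sum (here 15), and none of the others vanish, so the circulant built from (1,2,3,4,5) is nonsingular.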

Circulant matrices can be singular of course; for example
$$\left(\begin{array}{cccc} 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 \end{array}\right)$$
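The same eigenvalue test flags that singular example immediately (a sketch, again via the DFT of the first row):

```python
import numpy as np

# AlephZero's singular example: the 4x4 circulant with first row (0, 1, 0, 1).
C = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

eigs = np.fft.fft([0, 1, 0, 1])   # DFT of the first row = the eigenvalues
rank = np.linalg.matrix_rank(C)
print(eigs, rank)
```

Two of the four eigenvalues are zero, matching the rank of 2 (the matrix has only two distinct rows).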

mathwonk
Homework Helper
"So you can find all the eigenvalues in O(n^2) operations,"

that is nice, but it doesn't seem especially "quick" to me. i thought he wanted something you could see by inspection.

since the corresponding cyclic transformation is given in the usual basis by the companion matrix with columns (for n=3 say): (010), (001), (100), I was hoping there was some way to see how to recognize a cyclic vector for this matrix, but i don't see one.

e.g. there are non-cyclic vectors that are not eigenvectors; e.g. this matrix satisfies X^3 - 1 = 0,

but (1,0,-1) is neither cyclic nor an eigenvector, since (1,0,-1), (-1,1,0), (0,-1,1) are dependent.

(I.e. circulant matrices are by definition made up of columns which form orbits of this action on n-space, hence themselves form an n-dimensional space, in which the singular ones have codimension one, forming a hypersurface of degree n, the zeroes of the determinant.)
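The dependence of those three shifts is quick to verify numerically (a sketch; `np.roll` stands in here for the 3×3 cyclic-shift matrix):

```python
import numpy as np

# The cyclic shifts of v = (1, 0, -1); np.roll plays the role of the
# 3x3 cyclic-shift (companion) matrix acting on v.
v = np.array([1, 0, -1])
orbit = np.array([np.roll(v, k) for k in range(3)])

# The three shifts sum to zero, so the orbit spans only a plane:
# v is not a cyclic vector for the shift.
rank = np.linalg.matrix_rank(orbit)
print(rank)
```

Rank 2 rather than 3 is exactly the failure of (1,0,-1) to be cyclic.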

interesting question though.

Last edited:
AlephZero
Homework Helper
"So you can find all the eigenvalues in O(n^2) operations,"

that is nice, but it doesn't seem especially "quick" to me.

Well, I guess that's the difference between a mathematician and a numerical analyst.

It doesn't make much practical difference either way for one set of data of order 5 - any modern computing device would do it in a microsecond, if you can figure out a way to input the data fast enough!

But for real world sized problems, the difference between ##O(n^2)## and ##O(n^3)## might mean getting an answer in seconds rather than days.

mathwonk
Homework Helper
thank you! for the perspective.

epenguin
Homework Helper
Gold Member
Never very comfortable with linear algebra, but aren't you meant here to use the fact that the difference between most elements in consecutive rows is 1?

Starting from the last row, subtracting the previous row from each gives

1 2 3 4 5
1 1 1 1 -4
1 1 1 -4 1
1 1 -4 1 1

Further obvious subtractions give e.g.

1 2 3 4 5
1 1 1 1 -4
0 0 0 -5 5
0 0 -5 0 5

There is no way to row-combine so as to make the top-left submatrix

1 2
1 1

zero, but perhaps 0 × (row 1) + 0 × (row 2) + λ3 × (row 3) + λ4 × (row 4), with λ3 and λ4 not both 0, could be 0?

i.e. that the row vectors of

0 -5 5
-5 0 5

would be linearly dependent, but one easily sees they are not.
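This reduction is easy to double-check numerically: elementary row operations preserve rank, so the reduced rows have the same rank as the original four vectors (a sketch using NumPy, my choice, not the thread's):

```python
import numpy as np

# epenguin's rows after the two rounds of subtractions; elementary row
# operations preserve rank, so this has the same rank as the original set.
R = np.array([
    [1, 2, 3, 4, 5],
    [1, 1, 1, 1, -4],
    [0, 0, 0, -5, 5],
    [0, 0, -5, 0, 5],
], dtype=float)

rank = np.linalg.matrix_rank(R)
print(rank)  # 4: the original four vectors are independent
```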

Last edited: