Linear independence: best methods?

  • #1
tamintl
Hi,

I was wondering: what is the quickest way to check whether a set of vectors in a higher-dimensional space, say ℂ^5, is linearly independent?

I have a set of vectors { (1,2,3,4,5), (2,3,4,5,1), (3,4,5,1,2), (4,5,1,2,3) }

Clearly this is linearly dependent, but how could I justify this quickly?

What are the quickest ways to determine whether this set is linearly independent? Reducing to row echelon form takes a long time, and it's easy to make mistakes.

Thanks
 
  • #2
You don't have as many vectors here as the dimension, so you can add an auxiliary vector with variables for components and then take a determinant. In particular, you would expand your determinant (by cofactor expansion) along this variable-vector row, so you get, in this case, 5 sub-determinants. They would all have to be zero if the vectors are linearly dependent. (It is the analog of taking the cross product of two vectors: if they're parallel, the cross product is the zero vector.)

So here:
[tex]
\left|\begin{array}{ccccc}
a & b & c & d & e \\
1 & 2 & 3 & 4 & 5 \\
2 & 3 & 4 & 5 & 1 \\
3 & 4 & 5 & 1 & 2 \\
4 & 5 & 1 & 2 & 3 \end{array}\right| =

a\left|\begin{array}{cccc}

2 & 3 & 4 & 5 \\
3 & 4 & 5 & 1 \\
4 & 5 & 1 & 2 \\
5 & 1 & 2 & 3 \end{array}\right|
- b\left|\begin{array}{cccc}
1 & 3 & 4 & 5 \\
2 & 4 & 5 & 1 \\
3 & 5 & 1 & 2 \\
4 & 1 & 2 & 3 \end{array}\right|
+c\left|\begin{array}{cccc}
1 & 2 & 4 & 5 \\
2 & 3 & 5 & 1 \\
3 & 4 & 1 & 2 \\
4 & 5 & 2 & 3 \end{array}\right|
-d\left|\begin{array}{cccc}
1 & 2 & 3 & 5 \\
2 & 3 & 4 & 1 \\
3 & 4 & 5 & 2 \\
4 & 5 & 1 & 3 \end{array}\right|
+e\left|\begin{array}{cccc}
1 & 2 & 3 & 4 \\
2 & 3 & 4 & 5 \\
3 & 4 & 5 & 1 \\
4 & 5 & 1 & 2 \end{array}\right| [/tex]

If this is zero for all a, b, c, d, and e, your vectors are linearly dependent... as soon as you see a non-zero sub-determinant, you know you have linear independence. Is this faster and less error-prone? I'm not sure, and it may be a matter of cases... the problem is intrinsically complex.
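(Not part of jambaugh's post, but as a sanity check one can evaluate those five 4×4 minors numerically. A minimal NumPy sketch; the variable names and tolerance are my own.)

[code]
import numpy as np

# The four vectors from post #1, as the rows of a 4x5 matrix.
V = np.array([
    [1, 2, 3, 4, 5],
    [2, 3, 4, 5, 1],
    [3, 4, 5, 1, 2],
    [4, 5, 1, 2, 3],
], dtype=float)

# The coefficient of each variable a, b, c, d, e in the expansion above is,
# up to sign, the 4x4 minor obtained by deleting the corresponding column of V.
minors = [np.linalg.det(np.delete(V, j, axis=1)) for j in range(V.shape[1])]
print(minors)

# All minors (numerically) zero -> the four rows are linearly dependent;
# any nonzero minor -> they are linearly independent.
print("independent" if any(abs(m) > 1e-9 for m in minors) else "dependent")
[/code]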
 
  • #3
jambaugh said:
You don't have as many vectors here as the dimension, so you can add an auxiliary vector with variables for components and then take a determinant. [...] As soon as you see a non-zero sub-determinant, you know you have linear independence.

Thanks for your reply.

If I were to take the transpose so it becomes:

({1,2,3,4},{2,3,4,5},{3,4,5,1},{4,5,1,2}, {5,1,2,3})

Would you see any quick way to determine linear independence?

Many thanks
 
  • #4
"I have a set of vectors { (1,2,3,4,5), (2,3,4,5,1), (3,4,5,1,2), (4,5,1,2,3) }

Clearly this is linearly dependent, but how could I justify this quickly?
"

When I did Gaussian elimination I got that they were independent, which I had thought all along. There is probably a quick way to see this, if correct, given the simple "cyclic" rule of formation.
 
  • #5
tamintl said:
Thanks for your reply.

If I were to take the transpose so it becomes:

({1,2,3,4},{2,3,4,5},{3,4,5,1},{4,5,1,2}, {5,1,2,3})

Would you see any quick way to determine linear independence?

Many thanks

Nothing quicker in general than what has been stated so far... determinant test or Gaussian elimination. Transposing doesn't change that fact.

[EDIT] ... except of course you have changed here what "the vectors" are, and thus the question of independence changes. Clearly, five 4-dimensional vectors are linearly dependent!
 
  • #6
mathwonk said:
There is probably a quick way to see this, if correct, given the simple "cyclic" rule of formation.

Indeed there is. Add the vector (5,1,2,3,4) to the set, and you have a circulant matrix: http://en.wikipedia.org/wiki/Circulant_matrix

But the OP probably wasn't expected to know that!

In "real life", for large problems like this you would use numerical methods. Algorithms like the QR decomposition and singular value decomposition (SVD) are quicker, and give more useful information about the dependencies, than Gaussian elimination - but again, you won't meet those in a first course on linear algebra!
 
  • #7
Aleph, how does knowing it is a circulant matrix help? Are you plugging into the determinant formula for a circulant matrix, and arguing that it is not zero because it has degree 4 and differs from the 5th cyclotomic polynomial? That doesn't seem especially quick or intuitive, but maybe it is if you are very used to these.
 
  • #8
mathwonk said:
Aleph, how does knowing it is a circulant matrix help? Are you plugging into the determinant formula for a circulant matrix, and arguing that it is not zero because it has degree 4 and differs from the 5th cyclotomic polynomial? That doesn't seem especially quick or intuitive, but maybe it is if you are very used to these.

As the Wiki link says, the eigenvectors of a circulant matrix are known without doing any calculations, and each eigenvalue can be found with only ##O(n)## operations, from the terms in one row of the matrix and the n'th roots of unity.

So you can find all the eigenvalues in ##O(n^2)## operations, compared with ##O(n^3)## operations to check that an arbitrary matrix is non-singular by elementary row operations to find the determinant, reduce it to echelon form, or whatever variation on that theme you prefer.

Circulant matrices can be singular of course - for example
0 1 0 1
1 0 1 0
0 1 0 1
1 0 1 0.
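(For concreteness, and not from the original posts: the eigenvalues of an n×n circulant with first row (c_0, ..., c_{n-1}) are the values of c_0 + c_1 x + ... + c_{n-1} x^(n-1) at the nth roots of unity, which, up to ordering, is the discrete Fourier transform of the first row; using an FFT this is even ##O(n \log n)## rather than the ##O(n^2)## of direct evaluation. A minimal NumPy sketch with my own function name:)

[code]
import numpy as np

def circulant_eigenvalues(first_row):
    """Eigenvalues of the circulant matrix with this first row: the values of
    f(x) = c0 + c1*x + ... + c_{n-1}*x^(n-1) at the n-th roots of unity,
    which (up to ordering) is the discrete Fourier transform of the row."""
    return np.fft.fft(np.asarray(first_row, dtype=complex))

# The 5x5 circulant completing the set in post #1: every eigenvalue is nonzero,
# so that matrix is nonsingular and the original four rows are independent.
print(circulant_eigenvalues([1, 2, 3, 4, 5]))

# The singular 4x4 example above, first row (0, 1, 0, 1): two eigenvalues are zero.
print(circulant_eigenvalues([0, 1, 0, 1]))
[/code]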
 
  • #9
"So you can find all the eigenvalues in O(n^2) operations,"

That is nice, but it doesn't seem especially "quick" to me. I thought he wanted something you could see by inspection.

Since the corresponding cyclic transformation is given in the usual basis by the companion matrix with columns (for n = 3, say) (0,1,0), (0,0,1), (1,0,0), I was hoping there was some way to recognize a cyclic vector for this matrix, but I don't see one.

E.g. there are non-cyclic vectors that are not eigenvectors; e.g. this matrix satisfies X^3 - 1 = 0,

but (1,0,-1) is neither cyclic nor an eigenvector, since (1,0,-1), (-1,1,0), (0,-1,1) are dependent.

(I.e. circulant matrices are by definition made up of columns which form orbits of this action on n-space, hence they themselves form an n-dimensional space, in which the singular ones have codimension one, forming a hypersurface of degree n, being the zeroes of the determinant.)

Interesting question, though.
 
  • #10
mathwonk said:
"So you can find all the eigenvalues in O(n^2) operations,"

That is nice, but it doesn't seem especially "quick" to me.

Well, I guess that's the difference between a mathematician and a numerical analyst.

It doesn't make much practical difference either way for one set of data of order 5 - any modern computing device would do it in a microsecond, if you can figure out a way to input the data fast enough!

But for real world sized problems, the difference between ##O(n^2)## and ##O(n^3)## might mean getting an answer in seconds rather than days.
 
  • #11
Thank you for the perspective!
 
  • #12
Never very comfortable with linear algebra, but aren't you meant here to use the fact that most corresponding entries in consecutive rows differ by 1?

Starting from the last row, subtracting the previous row from each row gives

1 2 3 4 5
1 1 1 1 -4
1 1 1 -4 1
1 1 -4 1 1

Further obvious subtractions give e.g.

1 2 3 4 5
1 1 1 1 -4
0 0 0 -5 5
0 0 -5 0 5

There is no way to row-combine so as to make the top left submatrix

1 2
1 1

zero, but perhaps 0 × (row 1) + 0 × (row 2) + λ3 × (row 3) + λ4 × (row 4), with λ3 and λ4 not both 0, could be 0?

i.e. that the row vectors of

0 -5 5
-5 0 5

be linearly dependent, but one easily sees they are not.
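(Not part of the post, but a small NumPy sketch, with my own names, that replays these row operations and confirms that the reduced matrix, and hence the original set, has full row rank.)

[code]
import numpy as np

A = np.array([
    [1, 2, 3, 4, 5],
    [2, 3, 4, 5, 1],
    [3, 4, 5, 1, 2],
    [4, 5, 1, 2, 3],
], dtype=float)

# Subtract the previous (original) row from each row below the first.
B = A.copy()
B[1:] -= A[:-1]        # rows become (1,1,1,1,-4), (1,1,1,-4,1), (1,1,-4,1,1)

# The further subtractions described above.
B[2] -= B[1]           # (0, 0, 0, -5, 5)
B[3] -= B[1]           # (0, 0, -5, 0, 5)
print(B)

# Row operations preserve rank, so checking B settles the original question.
print(np.linalg.matrix_rank(B))   # 4 -> the four original vectors are independent
[/code]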
 

1. What is linear independence and why is it important in scientific research?

Linear independence is a concept in linear algebra describing a set of vectors in which no vector can be written as a linear combination of the others. It is important in scientific research because it allows for the creation of accurate mathematical models and the ability to analyze complex data sets efficiently.

2. What are the best methods for determining linear independence?

The best methods for determining linear independence include calculating the determinant of a matrix, checking for linearly dependent rows or columns, and using the Gram-Schmidt process to transform a set of vectors into an orthogonal set.
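(As an illustration only, not part of the original page: a minimal NumPy sketch of the Gram-Schmidt idea, orthogonalizing the vectors one at a time and counting how many survive with a nonzero residual. The function name and tolerance are my own.)

[code]
import numpy as np

def gram_schmidt_rank(vectors, tol=1e-10):
    """Orthogonalize the vectors one at a time; a vector whose residual is
    (numerically) zero is a linear combination of the earlier ones."""
    basis = []
    for v in np.asarray(vectors, dtype=float):
        w = v.copy()
        for b in basis:
            w = w - np.dot(w, b) * b      # remove the component along b
        norm = np.linalg.norm(w)
        if norm > tol:
            basis.append(w / norm)
    return len(basis)

vecs = [(1, 2, 3, 4, 5), (2, 3, 4, 5, 1), (3, 4, 5, 1, 2), (4, 5, 1, 2, 3)]
# If the count equals the number of vectors, the set is linearly independent.
print(gram_schmidt_rank(vecs))    # 4
[/code]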

3. How can linear independence be applied in real-world scientific problems?

Linear independence can be applied in various scientific problems, such as data analysis, machine learning, and optimization. For example, in machine learning, linearly independent features are used to create accurate models for predicting outcomes.

4. What are the consequences of violating the assumption of linear independence in statistical analysis?

If the assumption of linear independence is violated in statistical analysis, it can lead to biased and inaccurate results. This can have serious implications in fields such as economics, where linear regression models are commonly used to make important predictions.

5. How can researchers handle situations where linear independence is not met?

If linear independence is not met, researchers can use techniques such as dimension reduction, feature selection, or regularization to handle the issue. These methods help to reduce the dimensionality of the data and remove any linearly dependent features, allowing for more accurate analysis.
