Can you rearrange vectors in a set? And another misc question.

  • Context: Undergrad 
  • Thread starter: CookieSalesman
  • Tags: Rearrange Set Vectors
SUMMARY

The discussion centers on the rearrangement of vectors within a set and its implications for linear dependence and independence in linear algebra. It is established that while the order of vectors in a spanning set does not affect their linear dependence, the ordering becomes significant when representing vectors in a basis. Specifically, the theorem states that an indexed set of two or more vectors is linearly dependent if and only if at least one vector can be expressed as a linear combination of the others. Permuting the vectors does not alter the span, but it does affect matrix computations: swapping the columns of a matrix A permutes the corresponding unknowns in the equation Ax = b.

PREREQUISITES
  • Understanding of linear algebra concepts, specifically vector spaces and linear dependence.
  • Familiarity with matrix operations, including matrix multiplication and solving linear equations.
  • Knowledge of basis representation in vector spaces.
  • Basic understanding of the concept of linear combinations.
NEXT STEPS
  • Study the concept of linear independence and dependence in detail.
  • Learn about the properties of vector spaces and their bases.
  • Explore matrix operations, particularly the effects of column swapping on solutions to linear equations.
  • Investigate the implications of vector ordering in the context of linear transformations.
USEFUL FOR

Students and educators in linear algebra, mathematicians interested in vector space theory, and anyone involved in computational mathematics or applied linear algebra.

CookieSalesman
Suppose you have a set of vectors v1 v2 v3, etc.

However many there are, suppose they span some space, which I think is typically written as

Span {v1, v2, v3}
But I mean, if you're given these vectors, is there anything wrong with rearranging them? Because there's a theorem that says:
"An indexed set S = {v1, v2, ..., vp} of two or more vectors is linearly dependent if and only if at least one of the vectors is a linear combination of the others."
So if S is linearly dependent, is any vector in the set a combination of the preceding vectors?
Or did I read that wrong, and it just means that a certain vector (possibly more than one) is a linear combination of some of the other vectors?

However, the theorem I'm reading really seems to emphasize that there's something special about "preceding vectors". So if you have any set, is interchanging vectors allowed?
I feel like there is nothing wrong with this. Is there some situation where this is allowed and another where it isn't, maybe?
(I've only just started linear algebra a few weeks ago, so I don't know any complex scenarios.)

But it seems that this theorem suggests there's something important about the ordering of these vectors.
 
Reordering the vectors in a spanning set has no effect. There's nothing wrong with it.

When we talk about a vector space basis, we may wish to imply an ordering, because without an ordering we cannot speak unambiguously of the representation of a vector in that basis, which we often wish to do. If we take that definition of 'basis', then for every set of linearly independent, spanning vectors in an n-dimensional vector space there are n! different ordered bases, corresponding to the number of ways the vectors could be reordered.

My guess is that the reference to 'preceding' is just about the method by which one tests linear independence. One way to do that is to label the vectors as v1, v2, ..., vn. Then test that v2 is independent of v1. Next test that v3 is independent of v1 and v2, and so on. But that ordering is just a convenience used in performing the test, not an intrinsic requirement of the set.
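If you want to check this numerically, here is a small sketch (my own made-up example, not from any textbook) using NumPy. The vectors v1, v2, v3 are chosen so that v3 = v1 + v2, making the set deliberately dependent:

```python
# Sketch: reordering vectors does not change the span (rank), and the
# "preceding vectors" test is just one convenient order for checking dependence.
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])   # v3 = v1 + v2, so the set is linearly dependent

A = np.column_stack([v1, v2, v3])   # columns in the original order
B = np.column_stack([v3, v1, v2])   # columns reordered

# The dimension of the span is the same either way.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))   # 2 2

# The "preceding vectors" test: check each vector against the ones before it.
vectors = [v1, v2, v3]
for j in range(1, len(vectors)):
    previous = np.column_stack(vectors[:j])
    combined = np.column_stack(vectors[:j + 1])
    if np.linalg.matrix_rank(combined) == np.linalg.matrix_rank(previous):
        print(f"v{j + 1} is a linear combination of the preceding vectors")
```

Reordering the columns changes neither the rank nor the span; the loop just runs the dependence test in one particular order.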
 
Thanks.
 
But for instance, does this mean if you try to solve a matrix of [v1 v2 v3] and a matrix with just rearranged vectors like [v3 v1 v2]... it's the same??
 
By 'solve a matrix' do you mean calculate its (multiplicative) inverse? If so then, no, the answer is not the same.
 
Ummmm I'm not sure.

Does it make a difference how you solve it?

For instance, I've only learned about Ax=b, using the matrix A as a function. And also solving x1v1 + x2v2 + x3v3 = 0 to test for linear dependence.

I don't know about what the inverse is.

But maybe I meant: if you switch the positions of vectors in a set, isn't that equivalent to swapping columns in a matrix? In that case, is a matrix [v1 v2 v3] equivalent in every respect to [v3 v1 v2]?
 
It's equivalent in the sense that
$$[v3\ v1\ v2] =
[v1\ v2\ v3]
\left( \begin{array}{ccc}
0 & 1 & 0 \\
0 & 0 & 1 \\
1 & 0 & 0 \end{array} \right)$$
[Or something like that; I often get my rows and columns muddled up in matrix multiplications.]

The equation Ax=b will in general have a different solution vector from A*x=b, where A* is A with shuffled columns. However, the two solutions contain the same numbers: shuffling the columns of A just shuffles the unknowns, so the solution of A*x=b is the solution of Ax=b with the same shuffle applied to its components as was applied to the columns of A to get A*.
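Here is a quick numerical sanity check of that last point (a sketch using an arbitrary random matrix, not anything specific from the thread): shuffling the columns of A shuffles the components of the solution, while b stays put.

```python
# Sketch: A* = A P (columns permuted) => solution of A* y = b is the shuffled x.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))     # a generic (almost surely invertible) 3x3 matrix
b = rng.standard_normal(3)

# Permutation matrix sending columns (v1, v2, v3) to (v3, v1, v2), as above.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
A_star = A @ P                      # same as A[:, [2, 0, 1]]

x = np.linalg.solve(A, b)           # solution of A x = b
y = np.linalg.solve(A_star, b)      # solution of A* y = b

# y is x with the same shuffle applied to its components: y = (x3, x1, x2).
print(np.allclose(y, x[[2, 0, 1]]))   # True
print(np.allclose(P @ y, x))          # True, since A* y = A (P y) = b
```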
 
Oh alright. Thanks.
 
