# Linear independence over Z

1. Feb 6, 2005

### T-O7

Hey all,
I need to show whether or not the following statement is true:

For $$v_1,\dots,v_n\in\mathbb{Z}^m$$, the set $$\{v_1,\dots,v_n\}$$ is linearly independent over $$\mathbb{Z}$$ $$\Leftrightarrow$$ it is linearly independent over $$\mathbb{R}$$.

The reverse direction is true of course, but I'm having some trouble showing whether or not the forward direction is true. I'm pretty sure that if R were replaced by Q, the statement would be true, so the question boils down to irrational coefficients, I think. I'm stuck at this point. Any thoughts?
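
For concreteness, the claim can be sanity-checked numerically. The sketch below is my own illustration (the vectors and the function name `rank_over_Q` are made up for this example): it computes rank by exact Gaussian elimination over Q using the standard library's `fractions.Fraction`, and the rank over Q of a rational matrix equals its rank over R.

```python
from fractions import Fraction

def rank_over_Q(rows):
    """Rank by exact Gaussian elimination over Q (equals the rank over R)."""
    rows = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(rows[0])):
        # find a row at or below r with a nonzero entry in column c
        piv = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][c] / rows[r][c]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# two integer vectors in Z^3, written as rows (arbitrary example data)
print(rank_over_Q([[1, 0, 3], [0, 2, 5]]))  # 2: independent over Q, hence over R
```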

2. Feb 6, 2005

### Hurkyl

Staff Emeritus
Try a simpler case first. What about m = 1? m = 2?

3. Feb 9, 2005

### T-O7

Okay, I get a little confused even with the case m = 1.
For m = 1 we consider the vectors v as just elements of Z, and the question is: if they're linearly independent over Z, are they linearly independent over R? My issue right now, though, is that they can't even be linearly independent over Z to begin with! Taking any two vectors v1 and v2, the linear combination
$$v_2(v_1) - v_1(v_2) = 0$$
shows they're dependent. So... for the case m = 1, does that mean n must be precisely 1? In that case, if {v1} is linearly independent over Z (i.e. v1 ≠ 0), then it's of course linearly independent over R. Does this make any sense?

4. Feb 9, 2005

### mathwonk

looks good. now try it for vectors in Z^2. i think i got it for two vectors there anyway.

5. Feb 10, 2005

### matt grime

Think about the matrix whose columns are the vectors. The row operations you do to see whether they are linearly independent can all be done over Z, if the entries are all in Z.
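
This division-free reduction can be sketched in code (my own illustrative implementation, not from the thread). The row operation `row_j -> p*row_j - q*row_i` keeps every entry integral; note that scaling a row by a nonzero integer p is not an invertible operation over Z, but it doesn't change the rank over Q, which is all we need for counting independent rows:

```python
def integer_echelon(rows):
    """Reduce an integer matrix to echelon form using only integer
    row operations: swaps and row_j -> p*row_j - q*row_i."""
    rows = [list(r) for r in rows]
    m = len(rows)
    n = len(rows[0]) if rows else 0
    r = 0
    for c in range(n):
        # find a pivot row at or below r with a nonzero entry in column c
        piv = next((i for i in range(r, m) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        p = rows[r][c]
        for i in range(r + 1, m):
            q = rows[i][c]
            if q:
                # cross-multiply to clear column c without dividing
                rows[i] = [p * a - q * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return rows, r  # echelon-form rows and the rank

rows, rank = integer_echelon([[2, 4], [3, 7]])
print(rank)  # 2: the rows are independent over Z, and also over R
```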

6. Feb 10, 2005

### T-O7

Ah, I hadn't thought of that!
So following what matt suggested: if A is the matrix whose columns are the v's, we get a diagonal matrix through a series of elementary matrix multiplications, which don't affect the rank of A. So if the columns are independent over Z, then A must have determinant ±1 if it's to be invertible over Z. Since this also implies invertibility over R (det A = ±1 ≠ 0), the columns are linearly independent over R. I think that works? Thanks for the tip!

7. Feb 10, 2005

### matt grime

No, that isn't it. The two vectors (1,0) and (0,2) are linearly independent over Z, and over R, but the matrix formed by them is not invertible as a matrix over Z.
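
The counterexample is easy to check by hand; the spelled-out 2×2 determinant below is my own illustration:

```python
# columns (1,0) and (0,2) from the counterexample above
v1, v2 = (1, 0), (0, 2)
det = v1[0] * v2[1] - v1[1] * v2[0]

# det = 2 is nonzero, so the columns are independent over R,
# but an integer matrix has an integer inverse only when det = +1 or -1
print(det)  # 2
```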

8. Feb 10, 2005

### T-O7

.....Hmph. So the fact that an n×n matrix is invertible iff it has rank n is false over Z? I think I might also have made the mistake of assuming m = n in my argument above...

9. Feb 10, 2005

### matt grime

Invertibility over Z is a different matter from rank, yes: you may no longer divide rows by arbitrary numbers to get a matrix of determinant 1.

10. Feb 10, 2005

### mathwonk

that's a nice idea. if i get it, i think Matt is saying this:

suppose an integer matrix has Z independent rows. then no row operations using only integers can ever eliminate a row. (you have to prove this.)

But there are row operations using only integers that put the matrix in echelon form, not necessarily reduced echelon form, since no dividing is allowed. (You have to prove this too.)

It is then clear that the rows are also R independent. (You also have to prove this.)

11. Feb 10, 2005

### T-O7

Hmm... now that I think about it, I don't think I can give an argument using determinants anyway (i.e. showing that the determinant is non-zero), since I can't assume that m = n.

12. Feb 10, 2005

### T-O7

But there's a theorem at my disposal:
if we assume the v's are independent over Z, and A is the matrix whose columns are the v's, then through a series of elementary integer matrix multiplications we get a matrix of the form
$$A' = \left(\begin{array}{cc}D&0\\0&0\end{array}\right)$$
where D is a diagonal matrix in which each entry divides the next
(this is by the theorem).
Then A' must have the same rank as A, i.e. there must be n diagonal entries in D (elementary integer matrices preserve rank?).
Using now elementary matrix operations over R, we can turn this into
$$A'' = \left(\begin{array}{cc}I_n&0\\0&0\end{array}\right)$$
(just divide each row by its diagonal element).
A'' has rank n, which is the rank of A, so the columns of A (the v's) are linearly independent over R.
This sounds a little weird... have I done something horribly wrong?
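
The diagonal form invoked here is the Smith normal form, and it can be computed, assuming the third-party library sympy is available (the matrix is matt grime's earlier example; everything else follows sympy's documented `smith_normal_form` API):

```python
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

# Z-independent columns (1,0) and (0,2) from the counterexample above
A = Matrix([[1, 0],
            [0, 2]])

# the diagonal form guaranteed by the theorem: each entry divides the next
D = smith_normal_form(A, domain=ZZ)
print(D)  # diagonal entries 1 and 2 (up to sign), with 1 dividing 2
```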

13. Feb 10, 2005

### mathwonk

no. you are modifying what i said above, but that's all right. Matt's idea as i understood it was to use only row operations and get an echelon form; instead you are also using column operations and getting a diagonal form, but that is ok.

it would have been nice however if you had done the reasoning yourself, which is what i was trying to lead you to do, instead of just invoking a "theorem".

it really isn't very hard to prove if you know what the elementary row operations are. but maybe you do know how to prove the theorem?

i.e. row operations let you clear out the columns above and below the "pivot variables". if you also use column operations, you can clear out the rows to the right of the pivot variables and be left with only the pivot variables, i.e. a diagonal matrix with as many pivots as the rank.

in a way that is actually cleaner than my version of Matt's suggestion, but more work too. I don't know which one Matt meant to suggest.

14. Feb 10, 2005

### T-O7

Yeah, I should have explained earlier: this theorem was discussed in class, and I suppose this question was asked in the hope of understanding how the theorem is applicable.

15. Feb 10, 2005

### mathwonk

you also have a harder time seeing why this implies your result, since using both row and column operations muddles the idea of linear combinations.

i.e. if you only use row operations, then the new rows are still linear combinations of the old rows, so it is clear that you can get a zero row if and only if the old rows were linearly dependent.

using both types of operations, you have to know that what is being preserved is the map defined by the matrix, but in different bases of both domain and range.

so then you need to know that independent columns correspond to an injective map. this is to me a little more sophisticated.

so i would still recommend to you the outline argument i gave above, right after Matt's original suggestion.

16. Feb 10, 2005

### T-O7

Ok, I think I see what you're saying... in getting the diagonal matrix A'' from A, I had tacitly assumed that the elementary integer matrices are rank-preserving over Z... and I really can't find a way to convince myself whether or not that's true.

I'm going to try your outline now; I think it might be easier after all. Thanks!

17. Feb 10, 2005

### mathwonk

they are rank preserving, but what does that have to do with linear independence of the columns? the new columns are not linear combinations of the old columns unless you use only column operations, and you used both row and column operations.

18. Feb 10, 2005

### T-O7

So if they are rank preserving, then the diagonal matrix A'' will have the same rank as A over Z, which is n (since the v's are linearly independent over Z). Since A'' clearly has rank n over R also, applying the inverses of the same integer elementary operations gives me back A, so I can once again say that A has rank n over R. Is that not correct?
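
The rank-preservation step can at least be spot-checked: left-multiplying by a matrix that is invertible over Z (determinant ±1, with an integer inverse) does not change the rank over Q. A small stand-alone sketch, with matrices and helper names of my own choosing:

```python
from fractions import Fraction

def rank(rows):
    """Rank by exact Gaussian elimination over Q (= rank over R)."""
    rows = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(r + 1, len(rows)):
            f = rows[i][c] / rows[r][c]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def matmul(X, Y):
    """Plain matrix product of nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 0], [0, 2], [3, 5]]            # columns are Z-independent vectors
E = [[1, 2, 0], [0, 1, 0], [3, 0, 1]]   # det = 1, so E is invertible over Z
print(rank(A), rank(matmul(E, A)))      # 2 2: same rank before and after
```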

19. Feb 10, 2005

### mathwonk

what is your definition of rank? and how do you prove those operations preserve rank? you seem to keep trying to invoke some magic recipe instead of just going back to first principles and simply proving it.

20. Feb 10, 2005

### T-O7

By rank I mean the maximum number of linearly independent rows/columns of the matrix.
I guess you have to be careful about terminology when you're dealing with modules instead of vector spaces; I just find it a little confusing... I understand your point, though, and I'll try to follow your line of reasoning. Thanks.