# Question in matrices

1. Dec 27, 2004

### Chen

I need to prove (or disprove) that if A is an invertible n×n matrix such that the sum of the elements in every row equals 1, then the sum of the elements in every row of A^{-1} also equals 1.

I tried a bunch of examples and they all worked, so I'm assuming this is true and I'm trying to prove it:

We know that:

$$A^{-1} = \frac{adj(A)}{|A|}$$

$$\sum_{i=1}^n a_{ki}^{-1} = \sum_{i=1}^n \frac{(adj(A))_{ki}}{|A|} = \sum_{i=1}^n \frac{(-1)^{k+i}M_{ik}}{|A|}$$

But by definition:

$$|A| = \sum_{i=1}^n (-1)^{k+i}a_{ki}M_{ki}$$

So we get:

$$\sum_{i=1}^n a_{ki}^{-1} = \frac{\sum_{i=1}^n (-1)^{k+i}M_{ik}}{\sum_{i=1}^n (-1)^{k+i}a_{ki}M_{ki}}$$

And from here I don't know what to do. All that's left is to show that this expression equals 1... somehow. I think I need to use the fact that |A| = 1/|A^{-1}|, no?
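As a sanity check of the cofactor formula itself, here is a quick numerical sketch (the `adjugate` helper is my own, not part of the original question):

```python
import numpy as np

def adjugate(A):
    """adj(A): entry (k, i) is the cofactor C_{ik} = (-1)^{i+k} M_{ik}."""
    n = A.shape[0]
    adj = np.empty((n, n))
    for k in range(n):
        for i in range(n):
            # minor M_{ik}: determinant of A with row i and column k deleted
            minor = np.delete(np.delete(A, i, axis=0), k, axis=1)
            adj[k, i] = (-1) ** (i + k) * np.linalg.det(minor)
    return adj

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(np.allclose(adjugate(A) / np.linalg.det(A), np.linalg.inv(A)))  # True
```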

Thanks.

Last edited: Dec 27, 2004
2. Dec 27, 2004

### Cyrus

Hi Chen,

Notice that you can factor this: $$\sum_{i=1}^n a_{ki}^{-1} = \frac{\sum_{i=1}^n (-1)^{k+i}M_{ik}}{\sum_{i=1}^n (-1)^{k+i}a_{ki}M_{ki}}$$.

Notice that even though $$M_{ik}$$ and $$M_{ki}$$ are different, you are summing them. So who cares how the subscripts are ordered? In the end, when you add them all up, you will still get the same answer. It's like adding the numbers 1, 2, 3, 4, 5 or adding 5, 4, 3, 2, 1: still the same sum of 15 either way. The same applies to the $$(-1)^{k+i}$$ part. So you are left with:

$$\sum_{i=1}^n a_{ki}^{-1} = \frac{1}{\sum_{i=1}^n a_{ki}}$$.

Now we know that:

$$\sum_{i=1}^n a_{ki}$$ is equal to one,

so 1/1 also equals one, and this implies that the rows of A^{-1} also sum to one. I don't know, I just took a guess at finishing what you already did. I'm probably wrong, but I hope it might lead you in the right direction.

I think you might want to be a little careful in your notation. In the determinant of A it should not be $$M_{ik}$$. The determinant expansion goes across a fixed row or column, so one of the two indices has to be a fixed number, such as $$M_{1k}$$ or $$M_{i1}$$. This means we should apply Cramer's rule for only one row as well. We can show that the result is true for one *row*, and then, by the same argument, it will be true for all subsequent *rows*. So you should revise your work so that it does not range over ALL rows and ALL columns, but a specific row and column: write A^{-1} for the first row, similarly use Cramer's rule for only the first row, and then expand the determinant along the first row as well. This will allow the factoring out of those M's I mentioned earlier.

Last edited: Dec 27, 2004
3. Dec 27, 2004

### dextercioby

Your way of doing it is wrong. You actually "simplified inside the sum", which is mathematically incorrect.

Write, from the first formula you posted:
$$\sum_{i=1}^{n} (-)^{i+k} (a_{ki}^{-1}a_{ki}M_{ki}-M_{ik})=0$$
Sum this equality over "k":
$$\sum_{k=1}^{n}\sum_{i=1}^{n} (-)^{i+k}(a_{ki}^{-1}a_{ki}M_{ki}-M_{ik})=0$$
and only now can you use the fact that the matrix indices "i" and "k" are BOTH SUMMED OVER, and you get:
$$\sum_{k=1}^{n}\sum_{i=1}^{n} (-)^{i+k}(a_{ki}^{-1}a_{ki}-1)M_{ki}=0$$
In order for the previous equality to be an identity, i.e. to hold for any nonsingular matrix with arbitrary nonzero matrix elements, you must have:
$$a_{ki}^{-1}a_{ki}-1 =0$$
from which:
$$a_{ki}^{-1}=\frac{1}{a_{ki}}$$
Sum the previous relation over "i" and find:
$$\sum_{i=1}^{n} a_{ki}^{-1}=\sum_{i=1}^{n} \frac{1}{a_{ki}}$$
which is different from 1, as the sum of the inverses of "n" numbers is in general different from the sum of the "n" numbers.

Daniel.

P.S. I guess that disproves the assertion.

Last edited: Dec 27, 2004
4. Dec 27, 2004

### Cyrus

Hey dex,

First question: how come you cannot simplify sigmas like that? I don't see what would be wrong with it if they both sum to the same number.

Second: how come you introduced a sigma with respect to k? Isn't concerning only (i) enough, since we want to prove that the rows both equal one? If we add a sigma with respect to k, doesn't that mean we add all the terms in the entire matrix, when the original question asks only about the entries along a row?

You said that $$\sum_{i=1}^{n} a_{ki}^{-1}=\sum_{i=1}^{n} \frac{1}{a_{ki}}$$
is in general not an equality, but in our case we KNOW that the row sums to 1, so doesn't that mean it holds here?

But since we know that the numbers sum to 1 as i varies for a constant k, that means that A^{-1} also sums to one as well, and in that special case the two sides of the equation are equal.

Thanks,

Last edited: Dec 27, 2004
5. Dec 27, 2004

### dextercioby

1. Note that those sums had the same number of terms, but the elements (factors) were different; the denominator and the numerator had nothing in common, except for that (-1)^{some power}, which was itself summed over and therefore could not have been simplified. So simplifying is out of the question.
2. As M_{ik} is in general different from M_{ki} (the matrix is arbitrary, not necessarily symmetric), I summed over the remaining free index to get the factorization needed. You can interchange the two sums and relabel i->k and k->i to get the result I pictured, namely the factorization I needed. From there on, it was simple algebra.

Daniel.

P.S. In general:
$$\sum_{i} \frac{A_{i}}{B_{i}} \neq \frac{\sum_{i}A_{i}}{\sum_{i}B_{i}}$$

$$\frac{\sum_{i} a_{i}A_{i}}{\sum_{i}A_{i}} \neq \sum_{i} a_{i}$$
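A concrete numerical instance of the first inequality (the numbers here are my own, purely for illustration):

```python
A = [1.0, 3.0]
B = [2.0, 4.0]

lhs = sum(a / b for a, b in zip(A, B))  # 1/2 + 3/4 = 1.25
rhs = sum(A) / sum(B)                   # 4/6 = 0.666...
print(lhs, rhs)                         # the two sides clearly differ
```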

6. Dec 27, 2004

### dextercioby

I said it is equal, as in ALWAYS EQUAL, REGARDLESS of the value of $\sum_{i} a_{ki}$. Put that sum in a more familiar way:
$$\frac{1}{a_{ki}^{-1}}=a_{ki}$$
Sum this relation over "i":
$$\sum_{i=1}^{n} \frac{1}{a_{ki}^{-1}}=\sum_{i=1}^{n} a_{ki} =1$$
The last equality states clearly that the sum of the inverses is 1, which does not mean that the sum of the numbers themselves (matrix elements from a row) is equal to 1, viz.
$$\sum_{i=1}^{n} \frac{1}{a_{ki}^{-1}}=1 \nRightarrow \sum_{i=1}^{n} a_{ki}^{-1} =1$$

Read VERY, VERY CAREFULLY what I've just written.
I always assumed this was much simpler than line integrals... :tongue2:

Daniel.


7. Dec 27, 2004

### Cyrus


That's why I thought you said it is not true in general, and I said: but it is true when they sum up to equal one.

What can I say, I'm one of the special needs kids. You're so smart, dex; my theory is that if I pick your brain long enough, I might learn something.

Last edited: Dec 27, 2004
8. Dec 28, 2004

### Chen

That's what it looks like, but if the assertion is indeed false, I have to find an example in order to disprove it. I've been out of luck so far. I'm using this online calculator:
http://www.freemathhelp.com/calc-inverse-matrix.html
And without exception, for every matrix I entered, and some were very bizarre, the inverse satisfied the requirement. For example:
1,2,3,-5
2,-7,2,4
-4,-2,8,-1
0,1,2,-2
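The same check can be scripted instead of using the online calculator; a quick sketch of my own with numpy, using the matrix above:

```python
import numpy as np

# The example matrix above: every row sums to 1 (and det(A) = -98, so it is invertible).
A = np.array([
    [1, 2, 3, -5],
    [2, -7, 2, 4],
    [-4, -2, 8, -1],
    [0, 1, 2, -2],
], dtype=float)

print(np.allclose(A.sum(axis=1), 1.0))   # True: each row of A sums to 1
A_inv = np.linalg.inv(A)
print(np.allclose(A_inv.sum(axis=1), 1.0))  # True: each row of A^{-1} sums to 1 too
```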

So there's still doubt in my mind that the assertion is false... :shy:

Thanks!

Last edited: Dec 28, 2004
9. Dec 28, 2004

### Chen

I'm thinking of proving this in a different way. We learned that an easy way to find the inverse of A is to put A and I in the same matrix (size n by 2n), like so:
(A|I)
And then perform elementary row operations on the matrix until A becomes I and I becomes A^{-1}:
(I|A^{-1})
(This is just an easier way to apply the same row operations that take A to I, simultaneously to I, in order to find the inverse.)

We know that the sum of the elements in every row (let's call it SUM from now on) of A is 1, and the SUM in I is 1 as well. This means that when we perform these row operations on A to turn it into I, the SUM doesn't change. Perhaps it's possible to show that this feature is true for every matrix with SUM = 1, i.e. if you take a matrix with SUM = 1 and perform these operations, the SUM will stay 1. (And then it's obvious why the SUM in A^{-1} is also 1.) In other words, to show that if the SUM is preserved on one side of this matrix:
(A|I)
(which is A) then it must be preserved also on its other side (which is I).
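For reference, the (A|I) → (I|A^{-1}) procedure described above can be sketched in code; this is my own rough Gauss-Jordan with partial pivoting, not from the thread:

```python
import numpy as np

def inverse_via_augmented(A):
    """Gauss-Jordan elimination on the augmented matrix (A|I); returns A^{-1}."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # build (A|I)
    for col in range(n):
        # partial pivoting: swap up the row with the largest entry in this column
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                     # scale the pivot row so the pivot is 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]    # clear the rest of the column
    return M[:, n:]                               # the right block is now A^{-1}

A = np.array([[2.0, 1.0], [1.0, 1.0]])
print(np.allclose(inverse_via_augmented(A), np.linalg.inv(A)))  # True
```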

What do you think?

10. Dec 28, 2004

### Chen

Well I was just shown the proof of this. It's too simple to be good. :rofl:

The sum of elements in every row of A is 1, so:
$$A\left(\begin{array}{c}1\\1\\\vdots\\1\end{array}\right) = \left(\begin{array}{c}1\\1\\\vdots\\1\end{array}\right)$$

Multiply by A^{-1} from the left:
$$A^{-1}A\left(\begin{array}{c}1\\1\\\vdots\\1\end{array}\right) = \left(\begin{array}{c}1\\1\\\vdots\\1\end{array}\right) = A^{-1}\left(\begin{array}{c}1\\1\\\vdots\\1\end{array}\right)$$

Done. :)
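The proof translates directly into a one-line check; here is a quick numerical confirmation with a small matrix I made up (its rows each sum to 1, and it is invertible):

```python
import numpy as np

A = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.0, 1.0]])
ones = np.ones(3)

print(np.allclose(A @ ones, ones))                 # True: A fixes the all-ones vector
print(np.allclose(np.linalg.inv(A) @ ones, ones))  # True: so A^{-1} fixes it too
```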

11. Dec 28, 2004

### dextercioby

Good one. :tongue2: I was wrong there, I didn't realize that:
$$(\sum_{i=1}^{n} a_{i})(\sum_{i=1}^{n} b_{i}) \neq \sum_{i=1}^{n} a_{i}b_{i}$$
What an idiot... :yuck:

Daniel.