# Linear Algebra Problem #2 (Matrices actually)

1. Jun 24, 2008

1. The problem statement, all variables and given/known data

If a diagonal matrix is a square nxn matrix whose only nonzero entries occur in positions $a_{ii},\ i=1,2,\ldots,n$, prove that the product of two nxn diagonal matrices, D1 and D2, is diagonal AND that D1 and D2 commute.

(HINT: Use D1=[$d_{ij}$] and D2=[$d'_{kl}$], and

use the definition of matrix multiplication to show that there is at most one nonzero term in $$c_{il}=\sum_jd_{ij}d'_{jl}$$.)

2. Relevant equations

Definition of matrix multiplication (This is where I have the most trouble)

For C=BA, $$c_{ik}=\sum_{k=1}^nb_{ik}a_{kj},\ i=1,....,m\ j=1,...,p$$

So my biggest trouble stems from these really general definitions :yuck: I mean, I know they shouldn't be too difficult... I can multiply matrices with actual numbers in them, but this definition is giving me grief.

Can someone just kind of talk about this definition with me for a minute? Then I can make a real attempt at a solution.

Thanks!

2. Jun 24, 2008

This definition is stupid. Why are there so many indexes for god's sake?

3. Jun 24, 2008

### dx

It's not really very complex. You know that the (i,j)th element of C is just the dot product of the ith row of B and the jth column of A. Also, the row is specified by the first index and the column by the second index. The elements of the ith row of B are $$b_{ik}$$ where k goes from 1 to n. The elements of the jth column of A are $$a_{kj}$$ where k goes from 1 to n. Now the dot product is simply

$$\sum_{k=1}^nb_{ik}a_{kj}$$.
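To make that concrete, here is one way to spell the definition out in code (a hypothetical sketch using plain Python lists, not part of the original post): each entry of C is the dot product of a row of B with a column of A.

```python
def matmul(B, A):
    """Multiply matrices by the definition c_ij = sum_k b_ik * a_kj.

    B is m x n, A is n x p; the result C is m x p.
    """
    m, n = len(B), len(B[0])
    p = len(A[0])
    # Entry (i, j) is the dot product of row i of B and column j of A;
    # only the dummy index k varies inside each sum.
    return [[sum(B[i][k] * A[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

B = [[1, 2], [3, 4]]
A = [[5, 6], [7, 8]]
print(matmul(B, A))  # [[19, 22], [43, 50]]
```

Note that i and j are fixed while each entry is computed; only k runs from 1 to n.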

4. Jun 24, 2008

### Dick

If a is diagonal, a_{ik} is nonzero only if i=k. If b is diagonal, b_{kj} is nonzero only if k=j. Now c_{ij}=a_{ik}*b_{kj} with k summed over; i and j are NOT summed over, they are constants. A term in the sum can only be nonzero if both factors are nonzero, i.e. if i=k and k=j. If i is not equal to j, then there is no such k, so there are no nonzero terms and the sum must equal 0. So c_{ij}=0 if i is not equal to j. If i=j there is only one possibly nonzero term in the sum, the one with k=i=j, so c_{ij}=c_{ii}=a_{ii}*b_{ii}.

There. I talked about it for a minute. I timed it. You tell me why it's commutative, i.e. why a_{ik}*b_{kj}=b_{ik}*a_{kj}. Review the previous. There are two cases: i=j and i not equal to j.
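That argument can be sanity-checked numerically (a sketch assuming NumPy; the diagonal entries here are just made-up example data):

```python
import numpy as np

# Two diagonal matrices built from arbitrary diagonal entries.
D1 = np.diag([2.0, -1.0, 5.0])
D2 = np.diag([3.0, 4.0, -2.0])

P = D1 @ D2

# Every off-diagonal entry of the product is zero, so P is diagonal...
assert np.allclose(P, np.diag(np.diag(P)))
# ...its diagonal is the entrywise product of the two diagonals...
assert np.allclose(np.diag(P), [6.0, -4.0, -10.0])
# ...and the two factors commute.
assert np.allclose(D1 @ D2, D2 @ D1)
```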

5. Jun 24, 2008

Looks like you did a lot of typing there in a minute! I wish I followed you. This is the definition I am having trouble with:

$$c_{ik}=\sum_{k=1}^nb_{ik}a_{kj},\ i=1,....,m\ j=1,...,p$$

I am having difficulty with this:
Does this mean that the only thing that changes in the summation is k for any particular ij entry? I think that is what you mean.

So I guess I understand that part now.....moving on to the rest of what you wrote

6. Jun 24, 2008

### Dick

Yes, that's what I mean. But you meant c_{ij} in what you posted, yes? There are a lot of indices, and it's important to keep them straight. Actually, I wrote that in a minute, then spent several minutes correcting the index typos.

7. Jun 24, 2008

Yes. Should be $$c_{ij}=\sum_{k=1}^nb_{ik}a_{kj},\ i=1,\ldots,m,\ j=1,\ldots,p$$

But I did not notice that I copied it wrong from the book until you just pointed that out!

Damn. Now I need to go over it again.

8. Jun 24, 2008

### Dick

And the matrix is nxn. So m=n and p=n. I believe you could handle this easily, as you said, if they were real matrices with real numbers in them. You just have to figure out how to abstract that knowledge to this notation. It's really saying the same thing. In a more confusing way.

9. Jun 24, 2008

Yeah..... it's just about to 'click'. I understood it for a second and then I lost it. So, I am going to get a snack. I will understand it when I come back. I can feel it!

10. Jun 25, 2008

Alrighty then. So I can intuitively see that the product of two nxn diagonal matrices is an nxn diagonal matrix. You've got a bunch of zeros everywhere in the matrices except along the main diagonals, so it's just kind of obvious.

Now, how to use the definition to write out a proof is proving to be a little more challenging as I am not used to the 'mechanics' and formalisms of matrices.

How do you start a proof?

11. Jun 25, 2008

### chaoseverlasting

One thing you can do: since both matrices are diagonal, you know that only the elements with i=j are nonzero. As everything else is zero, all other products will yield zero. So you don't really need to grind through the general definition; as long as you know how matrix multiplication is done, you can visualize the process and then put it in terms of the general definition.

12. Jun 25, 2008

Like I said, I get the intuitive sense of it. But, how do I demonstrate the general case? I just need a hint as to what to start writing here. I want to try it on my own, but I am not sure how to start a proof.

13. Jun 25, 2008

### Defennder

It's not really something formal, but here's what you can do. Denote the first diagonal matrix by A and the other by B. The entries of the matrices A and B are $$a_{ij} \ \mbox{and} \ b_{ij}$$.

You know that $$a_{ij} = b_{ij} = 0$$ unless $$i=j$$. So how can you write out the multiplication of these two matrices in this "notation"? What is the entry $$c_{ij}$$ of C (the matrix AB)? After this is done, work out what an entry of C' (the matrix BA) should look like.

This should get you started.

14. Jun 25, 2008

### Dick

Casey, what I wrote in post #4 IS a proof. If you intuitively see something, just write clearly WHY it is intuitively clear and the result is likely to be a pretty acceptable proof.
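For the record, the argument from post #4 can also be set down symbolically in the book's notation (this write-up is not part of the original post, just a transcription of that reasoning):

If $D_1=[d_{ij}]$ and $D_2=[d'_{jl}]$ are diagonal, then $d_{ij}=0$ unless $i=j$, and $d'_{jl}=0$ unless $j=l$. So in

$$c_{il}=\sum_j d_{ij}d'_{jl}$$

the only term that can be nonzero is the one with $j=i=l$. Hence $c_{il}=0$ for $i\neq l$, and $c_{ii}=d_{ii}d'_{ii}$, so $D_1D_2$ is diagonal. Reversing the order gives $(i,i)$ entry $d'_{ii}d_{ii}=d_{ii}d'_{ii}$, so $D_2D_1=D_1D_2$.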

15. Jun 25, 2008

Sounds good! I just always liked being able (from the little experience I have with proofs) to write out a proof symbolically, without any words. I always thought that was the goal of a 'proof': to show something is true or not true purely through symbols.

I should probably look at that 'hint' again if I still wish to do so.

16. Jun 25, 2008

BTW: Looking back in the book, this: $$c_{il}=\sum_jd_{ij}d'_{jl}$$

is actually written like this (note the position of the j index after the Sigma):

$$c_{il}=\sum\nolimits_j d_{ij}d'_{jl}$$

Is there a difference?

Usually when I see a Sigma, there is a number or dummy variable underneath to give a starting point and then a number or dummy ABOVE it to denote the stopping point. Why is there no stopping point on this one?

17. Jun 25, 2008

I did not see this post Defennder.
This is where I get stuck... is this just the definition of multiplication:

$$c_{ij}=\sum_{k=1}^nb_{ik}a_{kj},\ i=1,....,m\ j=1,...,p$$
?

18. Jun 25, 2008

### Dick

It may just be typographical sloppiness. I think those all mean the same thing. If all the matrices are nxn, it appears they omit the n for the upper limit. Some people would omit the Sigma as well, and say that the presence of the j index twice is reason enough to presume it will be summed. That's the Einstein summation convention. It saves you a lot of time writing Sigmas as these expressions get more and more complicated. So don't be too surprised that they are getting sloppy...
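The repeated-index convention Dick mentions has a direct counterpart in NumPy's `einsum` (a sketch, not from the original post): an index that appears twice on the input side and not on the output side is summed over.

```python
import numpy as np

B = np.array([[1, 2], [3, 4]])
A = np.array([[5, 6], [7, 8]])

# In 'ik,kj->ij', the index k appears twice on the left and not on the
# right, so it is summed over -- exactly c_ij = sum_k b_ik a_kj.
C = np.einsum('ik,kj->ij', B, A)
assert np.array_equal(C, B @ A)
```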

19. Jun 25, 2008

I seriously want to bash the life out of whoever wrote this book. How the hell am I supposed to use the definition of matrix multiplication, $C=BA$ with $c_{ij}=\sum_{k=1}^nb_{ik}a_{kj},\ i=1,\ldots,m,\ j=1,\ldots,p$,

to prove this using D1=[$d_{ij}$] and D2=[$d'_{jl}$] by showing there is at most one nonzero term in $$c_{il}=\sum_jd_{ij}d'_{jl}$$ ?

Exactly WHAT IS c_il ? Is it one particular column in C ?

Basically, I am asking the question over again, but I want to use the book's way of answering it.

I am not trying to sound ungrateful for the help I have received thus far. It's just that I am looking ahead in the book and every goddamned question is just like this one... and if I don't get their methods now, I am going to be posting every single question in the book on PF. And I just can't bear to stare at a computer screen for that much time.

20. Jun 25, 2008

### Defennder

I take it that you mean p=m. Yes, and how about the entries of the matrix AB? Examining the summation, which summands are nonzero? Clearly $a_{ik}b_{kj} = 0$ iff either $a_{ik}$ or $b_{kj}$ is zero.