# Power of a Diagonalized Matrix?

1. May 24, 2014

### kq6up

1. The problem statement, all variables and given/known data

From Mary Boas' "Mathematical Methods in the Physical Sciences 3rd Ed."

Chapter 3 Section 11 Problem 57

Show that if $$D$$ is a diagonal matrix, then $$D^{n}$$ is the diagonal matrix with elements equal to the nth power of the elements of $$D$$.

2. Relevant equations

I think $D^{n}=C^{-1}M^{n}C$

3. The attempt at a solution

I think this hints at this being the case, but I don't think it proves it:

$$Tr(D^{n})=Tr(C^{-1}M^{n}C)=Tr(CC^{-1}M^{n})=Tr(M^{n})$$ using the cyclic property of the trace.

I also tried using the summation form of matrix multiplication, but I am not familiar enough with the formalism to feel confident about what I am doing.

Furthermore, I think it is rather obvious that this is the case if one just multiplies a diagonal matrix by itself.

Any hints as to a different approach?

Thanks,
Chris Maness
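As a quick numerical sanity check of the claim (an editorial pure-Python sketch; the helper functions and the sample matrix are illustrative, not from the thread):

```python
def mat_mul(A, B):
    """Matrix product via the index definition: C[i][j] = sum_k A[i][k]*B[k][j]."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, p):
    """A^p for p >= 1 by repeated multiplication."""
    result = A
    for _ in range(p - 1):
        result = mat_mul(result, A)
    return result

# A sample diagonal matrix.
D = [[2, 0, 0],
     [0, 3, 0],
     [0, 0, 5]]

# D^4 computed via matrix multiplication...
Dn = mat_pow(D, 4)

# ...matches the diagonal matrix of 4th powers of the entries.
expected = [[2**4, 0, 0],
            [0, 3**4, 0],
            [0, 0, 5**4]]
print(Dn == expected)  # True
```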

2. May 24, 2014

### Dick

I think it's rather obvious too. But use index notation for the matrix product to prove it. You may also want to do a simple induction.
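The induction Dick suggests can be sketched as follows (an editorial illustration, not a post from the thread; $\delta_{ij}$ is the Kronecker delta):

```latex
% Claim: for diagonal D, (D^n)_{ij} = (D_{ii})^n \delta_{ij}.
% Base case n = 1: (D^1)_{ij} = D_{ij} = D_{ii}\,\delta_{ij}, since D is diagonal.
% Inductive step: assume the claim holds for n. Then
\begin{align*}
(D^{n+1})_{ij} &= \sum_k (D^n)_{ik} D_{kj}
                = \sum_k (D_{ii})^n \delta_{ik} D_{kj} \\
               &= (D_{ii})^n D_{ij}
                = (D_{ii})^n D_{ii}\,\delta_{ij}
                = (D_{ii})^{n+1}\delta_{ij}.
\end{align*}
```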

3. May 24, 2014

### kq6up

Another quick thought. Since $$[\det(D)]^{n}=\det(D^{n})$$, would that not have to be the case?

Chris

4. May 24, 2014

### HallsofIvy

No, many different matrices have the same trace or determinant so neither of those is sufficient to show that the matrices are the same.
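HallsofIvy's point can be seen with a small counterexample (an editorial sketch, not from the thread): the identity and a shear matrix share the same trace and determinant yet are different matrices.

```python
I = [[1, 0],
     [0, 1]]   # identity
S = [[1, 1],
     [0, 1]]   # shear: same trace (2) and determinant (1) as the identity

trace = lambda M: M[0][0] + M[1][1]
det2 = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]

print(trace(I) == trace(S))  # True
print(det2(I) == det2(S))    # True
print(I == S)                # False: equal trace/det do not force equal matrices
```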

7. May 24, 2014

### kq6up

So I need a Kronecker delta, correct? I imagine I would use both pi and sigma notation. If I finish this, it is merely for the sake of having a better understanding of pi and sigma notation, not because I will gain insight into $$D^{n}$$.

Chris

8. May 24, 2014

### Dick

I don't see any reason to make it THAT formal. Just comment that you must have $i_0=i_1=i_2=...=i_n$ and explain why it must be true.
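Dick's index-chain remark, written out (an editorial illustration of the argument, not a post from the thread):

```latex
% Expanding the n-fold product with intermediate indices i_1, ..., i_{n-1}:
\begin{align*}
(D^n)_{i_0 i_n} = \sum_{i_1,\dots,i_{n-1}} D_{i_0 i_1} D_{i_1 i_2} \cdots D_{i_{n-1} i_n}
\end{align*}
% Each factor D_{i_k i_{k+1}} vanishes unless i_k = i_{k+1}, so a term can be
% nonzero only when i_0 = i_1 = \cdots = i_n. Hence (D^n)_{ij} = 0 for i \neq j,
% and (D^n)_{ii} = (D_{ii})^n.
```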

9. May 24, 2014

### kq6up

When I see nested sigmas, my eyes cross. I might be making it more difficult than it is. I have to picture the operation building the matrix in my mind's eye, and if the indices are more complex than in the most straightforward examples, I have difficulty. Maybe I need some exercises of this type, with a moderate learning curve, to build confidence and intuition. Any suggestions for a worksheet or something of that nature, or a reference to an online book?

Thanks,
Chris Maness

10. May 24, 2014

### Dick

This one isn't that hard. The only term in that big sum that could be nonzero is when all the indices are equal.

11. May 24, 2014

### kq6up

That is why I need the practice -- because I am so bad at seeing the obvious with this stuff :D

Chris Maness

12. May 24, 2014

### kq6up

This problem is really messing with my head. It is obviously true, but I am completely at a loss as to how to write the summation in a general way for an arbitrary power. I can do it for $$A^{2}$$, but not for an arbitrary power. I don't want the answer, I just need to get a better feel for doing general proofs with summation and/or product notation. I really don't have any experience with this.

Here is my solution for a diagonal matrix of ANY size multiplied by itself:

$${ \left[ { A }^{ 2 } \right] }_{ ij }=\sum _{ k=1 }^{ n }{ { A }_{ ik }{ A }_{ kj }{ \delta }_{ ik }{ \delta }_{ jk } }$$ where n is the size of the matrix.

Any hints appreciated.

Thanks,
Chris Maness

Last edited: May 24, 2014
13. May 24, 2014

### Dick

Are you just trying to make this harder than it is?
${ \left[ { A }^{ 2 } \right] }_{ ij }=\sum _{ k=1 }^{ n }{ { A }_{ ik }{ A }_{ kj } }$. That's it. No Kroneckers. It's the definition.

Last edited: May 24, 2014
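Post 13's formula can be checked directly (an editorial pure-Python sketch; the sample matrix is illustrative): for diagonal $$A$$, every product $$A_{ik}A_{kj}$$ with $$k\neq i$$ or $$k\neq j$$ contains a zero factor, so only the $$k=i=j$$ term survives.

```python
A = [[4, 0, 0],
     [0, 7, 0],
     [0, 0, 9]]
n = len(A)

# [A^2]_{ij} = sum_k A_{ik} A_{kj} -- the bare definition, no Kronecker deltas.
A2 = [[sum(A[i][k] * A[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]

# Off-diagonal entries vanish; diagonal entries are the squares.
print(A2)  # [[16, 0, 0], [0, 49, 0], [0, 0, 81]]
```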
14. May 25, 2014

### kq6up

Yes, I see that now, because the off-diagonal elements are already zero.

Chris

15. May 25, 2014

### Dick

Ok, so can you give an argument that shows what you want for $D^2$?

16. May 25, 2014

### kq6up

Ok I am going to go for the more general case here. Here is my new summation formula (don't laugh if it looks stupid -- I don't know what I am doing here):

$${ \left[ { A }^{ n } \right] }_{ ij }=\prod _{ l=1 }^{ n }{ \begin{bmatrix} { A }_{ 11 } & 0 & \cdots \\ 0 & \ddots & 0 \\ \vdots & 0 & { A }_{ ss } \end{bmatrix} } =\sum _{ k=1 }^{ s }{ \prod _{ l=1 }^{ n }{ { A }_{ kk } } } ={ A }_{ 11 }^{ n }+\cdots +{ A }_{ ss }^{ n }$$

1. If a diagonal matrix multiplies any other matrix, all off-diagonal elements are destroyed by the zeros in the diagonal matrix.

2. When diagonal matrices are multiplied, elements in the same row/column position are multiplied together.

3. If a diagonal matrix is multiplied by itself, each diagonal element is squared.

∴ Raising a diagonal matrix to the nth power just raises each diagonal element to the nth power.

Does this work?

Thanks,
Chris Maness

17. May 26, 2014

### Dick

No, it doesn't work. Where did the summation come from? You are supposed to argue that $(D^n)_{ij}=(D_{ij})^n$ for D a diagonal matrix. Go back and look at the formula you gave for the matrix product and tell me why it's true for n=2.

18. May 26, 2014

### kq6up

Does the argument work for A^2?

Chris

19. May 26, 2014

### Dick

NO! Start from the formula in post 13 and say what happens if A is diagonal.

20. May 26, 2014

### kq6up

If A is diagonal, then the diagonal elements are simply squared. I saw that a long time ago; I am just having trouble formalizing it.

Chris