# Linear Algebra: simple proofs

1. Sep 4, 2007

...involving Matrix Multiplication... I think it is mainly the notation that is killing me here...but it is killing me.

Problem: Check parts (2) and (3) of Theorem (1.3.18), which says:
1. A(BC)=(AB)C
2. A(B+C)=AB+AC
3. (A+B)C=AC+BC
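For anyone who wants a sanity check before working through the index proofs, here is a quick numerical test of all three identities with random matrices (a sketch using NumPy; `np.allclose` is used because floating-point products agree only up to rounding):

```python
import numpy as np

rng = np.random.default_rng(0)

# (1) associativity: A(BC) = (AB)C, with non-square shapes to exercise the indices
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((5, 2))
assoc_ok = np.allclose(A @ (B @ C), (A @ B) @ C)

# For the distributive laws all three matrices must have compatible shapes,
# so reuse square matrices here.
P = rng.standard_normal((3, 3))
Q = rng.standard_normal((3, 3))
R = rng.standard_normal((3, 3))

# (2) left distributivity: A(B+C) = AB + AC
left_ok = np.allclose(P @ (Q + R), P @ Q + P @ R)

# (3) right distributivity: (A+B)C = AC + BC
right_ok = np.allclose((P + Q) @ R, P @ R + Q @ R)
```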

The author led the way on part one with this proof: Let AB=D, BC=G,
(AB)C=F, and A(BC)=H
We must show that F=H.
$$d_{ik}=\sum_ja_{ij}b_{jk}$$ and $$g_{jl}=\sum_kb_{jk}c_{kl}.$$

Hence $$f_{il}=\sum_k d_{ik}c_{kl}=\sum_k\left(\sum_j a_{ij}b_{jk}\right)c_{kl}=\sum_k\sum_j a_{ij}b_{jk}c_{kl}$$
$$=\sum_j a_{ij}\left(\sum_k b_{jk}c_{kl}\right)=\sum_j a_{ij}g_{jl}=h_{il}.$$
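The author's chain of equalities can be mirrored directly in code: compute $$f_{il}$$ through D = AB and $$h_{il}$$ through G = BC with explicit sums, and check that the entries agree (a sketch; the loop indices follow the proof's i, j, k, l):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))

# d_ik = sum_j a_ij b_jk  (D = AB),  g_jl = sum_k b_jk c_kl  (G = BC)
D = np.array([[sum(A[i, j] * B[j, k] for j in range(3)) for k in range(4)]
              for i in range(2)])
G = np.array([[sum(B[j, k] * C[k, l] for k in range(4)) for l in range(2)]
              for j in range(3)])

# f_il = sum_k d_ik c_kl  (F = (AB)C),  h_il = sum_j a_ij g_jl  (H = A(BC))
F = np.array([[sum(D[i, k] * C[k, l] for k in range(4)) for l in range(2)]
              for i in range(2)])
H = np.array([[sum(A[i, j] * G[j, l] for j in range(3)) for l in range(2)]
              for i in range(2)])

entries_agree = np.allclose(F, H)
```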

It says to use the definition of matrix multiplication as a hint.

Maybe someone could start me off; I am teaching myself this, so it is slow going:grumpy:

Casey

Last edited: Sep 4, 2007
2. Sep 4, 2007

Well, the definition of matrix multiplication says that the product BA = C is given by
$$c_{ij}=\sum^n_{k=1}b_{ik}a_{kj},\qquad i=1,\cdots,m,\quad j=1,\cdots,p.$$
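That definition translates line for line into a nested-loop routine (a sketch; the function name `matmul_def` is mine, and B is m × n, A is n × p):

```python
def matmul_def(B, A):
    """Compute C = BA from the definition c_ij = sum_k b_ik a_kj."""
    m, n = len(B), len(B[0])   # B is m x n
    p = len(A[0])              # A is n x p
    C = [[0.0] * p for _ in range(m)]
    for i in range(m):
        for j in range(p):
            C[i][j] = sum(B[i][k] * A[k][j] for k in range(n))
    return C

B = [[1.0, 2.0], [3.0, 4.0]]
A = [[5.0, 6.0], [7.0, 8.0]]
C = matmul_def(B, A)  # [[19.0, 22.0], [43.0, 50.0]]
```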

3. Sep 4, 2007

### Avodyne

Well, I think all this D, G, F, H stuff is awfully confusing and obscures the main point. I would prove (1) like this (where I use the Einstein convention that repeated indices are summed, with the summation signs omitted):
$$[A(BC)]_{ij} = A_{ik}(BC)_{kj} = A_{ik}(B_{kn}C_{nj}) = A_{ik}B_{kn}C_{nj} = (A_{ik}B_{kn})C_{nj} = (AB)_{in}C_{nj} = [(AB)C]_{ij}$$
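This one-line chain can be checked numerically with `numpy.einsum`, which implements exactly this repeated-index summation (a sketch with random matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))

# A_ik B_kn C_nj -> [ABC]_ij, all in one contraction (k and n are summed)
lhs = np.einsum('ik,kn,nj->ij', A, B, C)

# the same thing grouped as (AB)C
AB = np.einsum('ik,kn->in', A, B)
rhs = np.einsum('in,nj->ij', AB, C)

sides_match = np.allclose(lhs, rhs)
```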

And your statement of (2) is wrong. Can you see what it should be?

4. Sep 4, 2007

### Staff: Mentor

2. A(B+C)=AB+BC should read A(B+C) = AB + AC

and elementwise, one has $$a_{ik}(b_{kj} + c_{kj})$$

Last edited: Sep 4, 2007
5. Sep 4, 2007

thanks...typo. edited to correct. Thanks for the ideas, but I would like to stick with the format of the text. I do not know what the Einstein convention is...this is the 1st chapter of a text on the algebra of Matrices and Linear Algebra.

Thanks,
Casey

6. Sep 4, 2007

### AsianSensationK

Semi-random question. What do you think the author did with part one there?

7. Sep 4, 2007

### Dick

The Einstein convention is just a scheme for leaving out the explicit summation sigmas: if an index is repeated, then sum over it. So look at Avodyne's form, and if you see, for example, $$A_{ij}B_{jk}$$, insert a sum over j. None of these proofs takes any particular cleverness; it's just mechanical symbol manipulation.

8. Sep 5, 2007

### Fredrik

Staff Emeritus
In that case I suggest that you first solve these problems the way Avodyne solved the first one, and then convert your solution to the format used in the book. The method he suggested is by far the best one, and I completely agree with what he said: there is no need to introduce other symbols (D, G, F, H) here. All they do is make it more difficult for you to see what you're really doing.

The "Einstein summation convention" sounds like something fancy, but it really isn't. The definition of matrix multiplication is

$$(AB)_{ij}=\sum_k A_{ik}B_{kj}$$

Those of us who use the summation convention would write this as

$$(AB)_{ij}=A_{ik}B_{kj}$$

The only difference is that we don't bother to write the sigmas because we know that there's always a summation over those indices that appear twice. The summation convention is nothing more than that.
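In fact `numpy.einsum` takes its input syntax straight from this convention: you write the indices, and any index that appears twice is summed over (a small sketch):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# (AB)_ij = A_ik B_kj -- k is repeated, so it is summed over
product = np.einsum('ik,kj->ij', A, B)

same_as_matmul = np.allclose(product, A @ B)
```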

Here's another example of how to prove a matrix identity. Let's prove that Tr(ABC)=Tr(CAB).

$$\textrm{Tr}(ABC)=(ABC)_{ii}=A_{ij}B_{jk}C_{ki}=C_{ki}A_{ij}B_{jk}=(CAB)_{kk}=\textrm{Tr}(CAB)$$

And here is a proof that $$(AB)^T=B^TA^T$$:

$$((AB)^T)_{ij}=(AB)_{ji}=A_{jk}B_{ki}=B_{ki}A_{jk}=(B^T)_{ik}(A^T)_{kj}=(B^TA^T)_{ij}$$

See how easy things get with this notation?
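Both identities above can also be spot-checked numerically (a sketch; `np.trace` and `.T` do the index bookkeeping):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))

# Tr(ABC) = Tr(CAB): the cyclic property of the trace
trace_ok = np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B))

# (AB)^T = B^T A^T
transpose_ok = np.allclose((A @ B).T, B.T @ A.T)
```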

9. Sep 5, 2007

I think I am starting to see it now.....let me review this thread some more.

Thanks,
Casey

10. Sep 5, 2007

### susie44

Sorry for posting this here, but how do I start a thread? I'm new here and I need help on a math project. Thanks :D

11. Sep 5, 2007

### NateTG

In each forum, at the top, there's a 'new topic' button/link.