How Can I Prove Properties of Matrix Multiplication?


Homework Help Overview

The discussion revolves around proving properties of matrix multiplication, specifically focusing on the associative and distributive properties as stated in a theorem. The original poster expresses confusion regarding the notation and the proof structure presented in their text.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants discuss the notation used in matrix multiplication proofs, with some suggesting alternative methods to clarify the concepts. There are questions about the correctness of the original poster's statement of the theorem and the use of the Einstein summation convention.

Discussion Status

Participants are actively engaging with the original poster's confusion, offering insights into the notation and suggesting that they follow a specific approach to understand the proofs better. There is a recognition of the need to simplify the notation to focus on the main points of the proofs.

Contextual Notes

The original poster is self-studying from a textbook on matrices and linear algebra, indicating a potential lack of familiarity with certain conventions and notations used in the field.

Saladsamurai
...involving Matrix Multiplication... I think it is mainly the notation that is killing me here...but it is killing me.

Problem: Check parts (2) and (3) of theorem (1.3.18) which says:
1. A(BC)=(AB)C
2. A(B+C)=AB+AC
3. (A+B)C=AC+BC


The author led the way on part one with this proof: Let AB=D, BC=G,
(AB)C=F, and A(BC)=H
We must show that F=H.
d_{ik}=\sum_ja_{ij}b_{jk} and g_{jl}=\sum_kb_{jk}c_{kl}.

Hence f_{il}=\sum_kd_{ik}c_{kl}=\sum_k(\sum_ja_{ij}b_{jk})c_{kl}=\sum_k\sum_ja_{ij}b_{jk}c_{kl}
=\sum_ja_{ij}(\sum_kb_{jk}c_{kl})=\sum_ja_{ij}g_{jl}=h_{il}

I think I follow that.
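The index-chasing above can be sanity-checked numerically. The following NumPy sketch (not from the thread; the shapes are arbitrary choices) builds D = AB and G = BC with explicit sums, exactly as the author's d_{ik} and g_{jl} formulas prescribe, and confirms that F = DC and H = AG agree:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, (2, 3))
B = rng.integers(-5, 5, (3, 4))
C = rng.integers(-5, 5, (4, 2))

# d_ik = sum_j a_ij b_jk, i.e. D = AB, computed with explicit sums
D = np.array([[sum(A[i, j] * B[j, k] for j in range(B.shape[0]))
               for k in range(B.shape[1])] for i in range(A.shape[0])])
# g_jl = sum_k b_jk c_kl, i.e. G = BC
G = np.array([[sum(B[j, k] * C[k, l] for k in range(C.shape[0]))
               for l in range(C.shape[1])] for j in range(B.shape[0])])

F = D @ C  # F = (AB)C
H = A @ G  # H = A(BC)
print(np.array_equal(F, H))  # → True
```

Integer entries make the comparison exact, so no floating-point tolerance is needed.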

It says to use the definition of matrix multiplication as a hint.

Maybe someone could start me off; I am teaching myself this, so it is slow going

Casey
 
Well, the definition of multiplication is: the product C=BA has entries
c_{ij}=\sum^n_{k=1}b_{ik}a_{kj}, \quad i=1,\cdots,m,\ j=1,\cdots,p
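That definition translates directly into code. Here is a minimal sketch (names and shapes are my own choices, not from the thread) that implements c_{ij} = \sum_k b_{ik}a_{kj} with explicit loops and compares it against NumPy's built-in product:

```python
import numpy as np

def matmul(B, A):
    """Product C = BA from the definition c_ij = sum_k b_ik a_kj."""
    m, n = B.shape
    n2, p = A.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            C[i, j] = sum(B[i, k] * A[k, j] for k in range(n))
    return C

B = np.array([[1., 2.], [3., 4.]])
A = np.array([[5., 6.], [7., 8.]])
print(np.allclose(matmul(B, A), B @ A))  # → True
```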
 
Well, I think all this D,G,F,H stuff is awfully confusing and obscures the main point. I would prove (1) like this (where I use the Einstein convention that repeated indices are summed, and the summation signs omitted):
[A(BC)]_{ij} = A_{ik}(BC)_{kj} = A_{ik}(B_{kn}C_{nj}) = A_{ik}B_{kn}C_{nj} = (A_{ik}B_{kn})C_{nj} = (AB)_{in}C_{nj} = [(AB)C]_{ij}

And your statement of (2) is wrong. Can you see what it should be?
 
2. A(B+C)=AB+BC should read A(B+C) = AB + AC

and elementwise, one has a_{ik}(b_{kj} + c_{kj})
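The distributive law can be verified the same way. This sketch (mine, not from the thread) computes [A(B+C)]_{ij} from the elementwise form \sum_k a_{ik}(b_{kj}+c_{kj}) and checks it against AB + AC:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, (2, 3))
B = rng.integers(-3, 4, (3, 2))
C = rng.integers(-3, 4, (3, 2))

# [A(B+C)]_ij via the elementwise form sum_k a_ik (b_kj + c_kj)
left = np.array([[sum(A[i, k] * (B[k, j] + C[k, j]) for k in range(3))
                  for j in range(2)] for i in range(2)])
right = A @ B + A @ C
print(np.array_equal(left, right))  # → True
```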
 
Avodyne said:
Well, I think all this D,G,F,H stuff is awfully confusing and obscures the main point. [...] And your statement of (2) is wrong. Can you see what it should be?
thanks...typo. edited to correct. Thanks for the ideas, but I would like to stick with the format of the text. I do not know what the Einstein convention is...this is the 1st chapter of a text on the algebra of Matrices and Linear Algebra.

Thanks,
Casey
 
Semi-random question. What do you think the author did with part one there?
 
Saladsamurai said:
Thanks for the ideas, but I would like to stick with the format of the text. I do not know what the Einstein convention is...this is the 1st chapter of a text on the algebra of Matrices and Linear Algebra.

The Einstein convention is just a scheme for leaving out the explicit summation sigmas. It says: if an index is repeated, then sum over it. So look at Avodyne's form, and if you see, for example, A_{ij}B_{jk}, insert a sum over j. None of these take any particular cleverness; it's just mechanical symbol manipulation.
 
Saladsamurai said:
Thanks for the ideas, but I would like to stick with the format of the text. I do not know what the Einstein convention is...this is the 1st chapter of a text on the algebra of Matrices and Linear Algebra.
In that case I suggest that you first solve these problems the way Avodyne solved the first one, and then convert your solution to the format used in the book. The method he suggested is by far the best one, and I completely agree with what he said here:
Avodyne said:
this D,G,F,H stuff is awfully confusing and obscures the main point.
There is no need to introduce other symbols (D,G,F,H) here. All they do is make it more difficult for you to see what you're really doing.

The "Einstein summation convention" sounds like something fancy, but it really isn't. The definition of matrix multiplication is

(AB)_{ij}=\sum_k A_{ik}B_{kj}

Those of us who use the summation convention would write this as

(AB)_{ij}=A_{ik}B_{kj}

The only difference is that we don't bother to write the sigmas because we know that there's always a summation over those indices that appear twice. The summation convention is nothing more than that.
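As it happens, NumPy's einsum function uses essentially this convention as its input syntax. A quick sketch (not from the thread) showing that the index expression A_{ik}B_{kj} with the repeated index k summed reproduces the matrix product:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))

# np.einsum reads the summation convention directly: the repeated
# index k is summed over, giving (AB)_ij = A_ik B_kj
AB = np.einsum('ik,kj->ij', A, B)
print(np.allclose(AB, A @ B))  # → True
```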

Here's another example of how to prove a matrix identity. Let's prove that Tr(ABC)=Tr(CAB).

\textrm{Tr}(ABC)=(ABC)_{ii}=A_{ij}B_{jk}C_{ki}=C_{ki}A_{ij}B_{jk}=\textrm{Tr}(CAB)

How about the identity (AB)^T=B^TA^T?

((AB)^T)_{ij}=(AB)_{ji}=A_{jk}B_{ki}=B_{ki}A_{jk}=(B^T)_{ik}(A^T)_{kj}=(B^TA^T)_{ij}

See how easy things get with this notation?
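Both identities from that post can be confirmed numerically; here is a short NumPy sketch (mine, not from the thread), using non-square matrices so the shape bookkeeping is also exercised:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((5, 3))

# Tr(ABC) = (ABC)_ii = A_ij B_jk C_ki; cycling the factors gives Tr(CAB)
assert np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B))

# ((AB)^T)_ij = B_ki A_jk = (B^T A^T)_ij
assert np.allclose((A @ B).T, B.T @ A.T)
print("both identities check out")
```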
 
Fredrik said:
In that case I suggest that you first solve these problems the way Avodyne solved the first one, and then convert your solution to the format used in the book. [...] See how easy things get with this notation?
I think I am starting to see it now...let me review this thread some more.

Thanks,
Casey
 
sorry for posting this here, but how do i start a thread? I am new here and i need help on a math project. thanks :D
 
susie44 said:
sorry for posting this here, but how do i start a thread? I am new here and i need help on a math project. thanks :D

In each forum, at the top, there's a 'new topic' button/link.
 
